Jan 22 13:46:02 crc systemd[1]: Starting Kubernetes Kubelet... Jan 22 13:46:02 crc restorecon[4702]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Jan 22 13:46:02 
crc restorecon[4702]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Jan 22 13:46:02 crc restorecon[4702]: 
/var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c574,c582 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 22 
13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c440,c975 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 22 13:46:02 crc restorecon[4702]: 
/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c22 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 22 13:46:02 crc 
restorecon[4702]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 22 13:46:02 crc restorecon[4702]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 
Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c968,c969 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Jan 22 13:46:02 crc restorecon[4702]: 
/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 
22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 22 13:46:02 
crc restorecon[4702]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 22 13:46:02 crc restorecon[4702]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 22 13:46:02 crc restorecon[4702]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 22 13:46:02 
crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 22 13:46:02 crc restorecon[4702]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 22 13:46:02 crc restorecon[4702]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 22 13:46:02 crc restorecon[4702]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 22 13:46:02 crc restorecon[4702]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 22 13:46:02 crc restorecon[4702]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 22 13:46:02 crc restorecon[4702]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 22 13:46:02 crc restorecon[4702]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 22 13:46:02 crc restorecon[4702]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 22 13:46:02 crc restorecon[4702]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 22 13:46:02 crc restorecon[4702]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 22 13:46:02 crc restorecon[4702]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 22 13:46:02 crc restorecon[4702]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:02 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 
13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 13:46:03 crc 
restorecon[4702]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 22 13:46:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Jan 22 13:46:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 22 13:46:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 22 13:46:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 
13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 
13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc 
restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 22 13:46:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 22 13:46:03 crc restorecon[4702]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 22 13:46:03 crc restorecon[4702]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Jan 22 13:46:03 crc kubenswrapper[4743]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 22 13:46:03 crc kubenswrapper[4743]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Jan 22 13:46:03 crc kubenswrapper[4743]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 22 13:46:03 crc kubenswrapper[4743]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Jan 22 13:46:03 crc kubenswrapper[4743]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Jan 22 13:46:03 crc kubenswrapper[4743]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.570656 4743 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.573337 4743 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.573354 4743 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.573358 4743 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.573362 4743 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.573366 4743 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.573370 4743 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.573377 4743 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.573381 4743 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.573385 4743 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.573390 4743 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.573394 4743 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.573398 4743 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.573402 4743 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.573408 4743 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.573412 4743 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.573417 4743 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.573420 4743 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.573424 4743 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.573427 4743 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.573433 4743 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. 
It will be removed in a future release. Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.573437 4743 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.573440 4743 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.573444 4743 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.573447 4743 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.573451 4743 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.573455 4743 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.573479 4743 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.573483 4743 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.573487 4743 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.573490 4743 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.573494 4743 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.573498 4743 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.573501 4743 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.573505 4743 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.573508 4743 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.573512 4743 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.573516 4743 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.573519 4743 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.573523 4743 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.573526 4743 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.573531 4743 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.573536 4743 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.573540 4743 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.573543 4743 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.573548 4743 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.573551 4743 feature_gate.go:330] 
unrecognized feature gate: BuildCSIVolumes Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.573555 4743 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.573559 4743 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.573562 4743 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.573565 4743 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.573569 4743 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.573572 4743 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.573576 4743 feature_gate.go:330] unrecognized feature gate: Example Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.573579 4743 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.573583 4743 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.573586 4743 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.573590 4743 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.573593 4743 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.573597 4743 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.573600 4743 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.573603 4743 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.573608 4743 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.573613 4743 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.573616 4743 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.573620 4743 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.573624 4743 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.573627 4743 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.573631 4743 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.573634 4743 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.573640 4743 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
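The block above mixes two kinds of startup noise: deprecation notices for kubelet flags that are still passed on the command line but should now live in the file named by --config, and "unrecognized feature gate" warnings for gate names that appear to come from the cluster-wide OpenShift FeatureGate set and are simply unknown to the upstream kubelet, which warns and moves on. A minimal sketch for tallying both from a saved copy of this journal; the file name kubelet.log and the helper name summarize are assumptions, not anything the log provides:

```python
# Minimal sketch (not part of the captured log): tally the kubelet's
# deprecated-flag notices and "unrecognized feature gate" warnings from a
# saved copy of this journal, e.g. `journalctl -u kubelet > kubelet.log`.
# The file name kubelet.log is an assumption.
import re
from collections import Counter

DEPRECATED = re.compile(r"Flag (--[\w-]+) has been deprecated")
UNRECOGNIZED = re.compile(r"unrecognized feature gate: (\w+)")

def summarize(path="kubelet.log"):
    deprecated, gates = Counter(), Counter()
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            # findall, not a single match: several journal entries can share
            # one physical line in a capture like this one.
            deprecated.update(DEPRECATED.findall(line))
            gates.update(UNRECOGNIZED.findall(line))
    return deprecated, gates

if __name__ == "__main__":
    deprecated, gates = summarize()
    print("deprecated flags:", sorted(deprecated))
    print("most-repeated gate warnings:", gates.most_common(5))
```

Running it against a capture like this one shows at a glance which flags still need migrating into the config file and which gate warnings are merely repeated on every parse pass.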
Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.573644 4743 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.573839 4743 flags.go:64] FLAG: --address="0.0.0.0" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.573851 4743 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.573859 4743 flags.go:64] FLAG: --anonymous-auth="true" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.573865 4743 flags.go:64] FLAG: --application-metrics-count-limit="100" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.573871 4743 flags.go:64] FLAG: --authentication-token-webhook="false" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.573877 4743 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.573883 4743 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.573889 4743 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.573893 4743 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.573898 4743 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.573902 4743 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.573906 4743 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.573911 4743 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.573914 4743 flags.go:64] FLAG: --cgroup-root="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.573918 4743 flags.go:64] FLAG: --cgroups-per-qos="true" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.573922 4743 flags.go:64] FLAG: --client-ca-file="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.573927 4743 flags.go:64] FLAG: --cloud-config="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.573931 4743 flags.go:64] FLAG: --cloud-provider="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.573935 4743 flags.go:64] FLAG: --cluster-dns="[]" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.573940 4743 flags.go:64] FLAG: --cluster-domain="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.573945 4743 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.573949 4743 flags.go:64] FLAG: --config-dir="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.573953 4743 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.573957 4743 flags.go:64] FLAG: --container-log-max-files="5" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.573963 4743 flags.go:64] FLAG: --container-log-max-size="10Mi" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.573967 4743 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.573971 4743 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.573975 4743 flags.go:64] FLAG: --containerd-namespace="k8s.io" Jan 22 13:46:03 crc 
kubenswrapper[4743]: I0122 13:46:03.573979 4743 flags.go:64] FLAG: --contention-profiling="false" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.573983 4743 flags.go:64] FLAG: --cpu-cfs-quota="true" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.573987 4743 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.573991 4743 flags.go:64] FLAG: --cpu-manager-policy="none" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.573995 4743 flags.go:64] FLAG: --cpu-manager-policy-options="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574001 4743 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574005 4743 flags.go:64] FLAG: --enable-controller-attach-detach="true" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574009 4743 flags.go:64] FLAG: --enable-debugging-handlers="true" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574013 4743 flags.go:64] FLAG: --enable-load-reader="false" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574017 4743 flags.go:64] FLAG: --enable-server="true" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574022 4743 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574027 4743 flags.go:64] FLAG: --event-burst="100" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574032 4743 flags.go:64] FLAG: --event-qps="50" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574036 4743 flags.go:64] FLAG: --event-storage-age-limit="default=0" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574041 4743 flags.go:64] FLAG: --event-storage-event-limit="default=0" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574045 4743 flags.go:64] FLAG: --eviction-hard="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574051 4743 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574055 4743 flags.go:64] FLAG: --eviction-minimum-reclaim="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574059 4743 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574064 4743 flags.go:64] FLAG: --eviction-soft="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574068 4743 flags.go:64] FLAG: --eviction-soft-grace-period="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574072 4743 flags.go:64] FLAG: --exit-on-lock-contention="false" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574076 4743 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574080 4743 flags.go:64] FLAG: --experimental-mounter-path="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574084 4743 flags.go:64] FLAG: --fail-cgroupv1="false" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574088 4743 flags.go:64] FLAG: --fail-swap-on="true" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574092 4743 flags.go:64] FLAG: --feature-gates="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574100 4743 flags.go:64] FLAG: --file-check-frequency="20s" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574104 4743 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574108 4743 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Jan 22 13:46:03 crc 
kubenswrapper[4743]: I0122 13:46:03.574112 4743 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574117 4743 flags.go:64] FLAG: --healthz-port="10248" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574121 4743 flags.go:64] FLAG: --help="false" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574125 4743 flags.go:64] FLAG: --hostname-override="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574129 4743 flags.go:64] FLAG: --housekeeping-interval="10s" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574133 4743 flags.go:64] FLAG: --http-check-frequency="20s" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574137 4743 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574141 4743 flags.go:64] FLAG: --image-credential-provider-config="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574145 4743 flags.go:64] FLAG: --image-gc-high-threshold="85" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574149 4743 flags.go:64] FLAG: --image-gc-low-threshold="80" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574153 4743 flags.go:64] FLAG: --image-service-endpoint="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574158 4743 flags.go:64] FLAG: --kernel-memcg-notification="false" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574162 4743 flags.go:64] FLAG: --kube-api-burst="100" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574165 4743 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574170 4743 flags.go:64] FLAG: --kube-api-qps="50" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574173 4743 flags.go:64] FLAG: --kube-reserved="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574177 4743 flags.go:64] FLAG: --kube-reserved-cgroup="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574181 4743 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574185 4743 flags.go:64] FLAG: --kubelet-cgroups="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574190 4743 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574194 4743 flags.go:64] FLAG: --lock-file="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574198 4743 flags.go:64] FLAG: --log-cadvisor-usage="false" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574202 4743 flags.go:64] FLAG: --log-flush-frequency="5s" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574206 4743 flags.go:64] FLAG: --log-json-info-buffer-size="0" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574215 4743 flags.go:64] FLAG: --log-json-split-stream="false" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574219 4743 flags.go:64] FLAG: --log-text-info-buffer-size="0" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574223 4743 flags.go:64] FLAG: --log-text-split-stream="false" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574227 4743 flags.go:64] FLAG: --logging-format="text" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574232 4743 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574236 4743 flags.go:64] FLAG: --make-iptables-util-chains="true" Jan 22 13:46:03 crc 
kubenswrapper[4743]: I0122 13:46:03.574241 4743 flags.go:64] FLAG: --manifest-url="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574244 4743 flags.go:64] FLAG: --manifest-url-header="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574250 4743 flags.go:64] FLAG: --max-housekeeping-interval="15s" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574254 4743 flags.go:64] FLAG: --max-open-files="1000000" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574259 4743 flags.go:64] FLAG: --max-pods="110" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574263 4743 flags.go:64] FLAG: --maximum-dead-containers="-1" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574267 4743 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574272 4743 flags.go:64] FLAG: --memory-manager-policy="None" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574276 4743 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574280 4743 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574284 4743 flags.go:64] FLAG: --node-ip="192.168.126.11" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574288 4743 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574297 4743 flags.go:64] FLAG: --node-status-max-images="50" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574301 4743 flags.go:64] FLAG: --node-status-update-frequency="10s" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574305 4743 flags.go:64] FLAG: --oom-score-adj="-999" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574309 4743 flags.go:64] FLAG: --pod-cidr="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574313 4743 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574319 4743 flags.go:64] FLAG: --pod-manifest-path="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574323 4743 flags.go:64] FLAG: --pod-max-pids="-1" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574327 4743 flags.go:64] FLAG: --pods-per-core="0" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574331 4743 flags.go:64] FLAG: --port="10250" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574335 4743 flags.go:64] FLAG: --protect-kernel-defaults="false" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574339 4743 flags.go:64] FLAG: --provider-id="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574343 4743 flags.go:64] FLAG: --qos-reserved="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574347 4743 flags.go:64] FLAG: --read-only-port="10255" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574352 4743 flags.go:64] FLAG: --register-node="true" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574356 4743 flags.go:64] FLAG: --register-schedulable="true" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574361 4743 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574368 4743 flags.go:64] FLAG: --registry-burst="10" Jan 22 13:46:03 crc 
kubenswrapper[4743]: I0122 13:46:03.574372 4743 flags.go:64] FLAG: --registry-qps="5" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574376 4743 flags.go:64] FLAG: --reserved-cpus="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574380 4743 flags.go:64] FLAG: --reserved-memory="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574385 4743 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574389 4743 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574393 4743 flags.go:64] FLAG: --rotate-certificates="false" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574397 4743 flags.go:64] FLAG: --rotate-server-certificates="false" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574401 4743 flags.go:64] FLAG: --runonce="false" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574405 4743 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574409 4743 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574413 4743 flags.go:64] FLAG: --seccomp-default="false" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574417 4743 flags.go:64] FLAG: --serialize-image-pulls="true" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574421 4743 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574425 4743 flags.go:64] FLAG: --storage-driver-db="cadvisor" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574430 4743 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574434 4743 flags.go:64] FLAG: --storage-driver-password="root" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574438 4743 flags.go:64] FLAG: --storage-driver-secure="false" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574442 4743 flags.go:64] FLAG: --storage-driver-table="stats" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574446 4743 flags.go:64] FLAG: --storage-driver-user="root" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574450 4743 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574457 4743 flags.go:64] FLAG: --sync-frequency="1m0s" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574462 4743 flags.go:64] FLAG: --system-cgroups="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574466 4743 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574472 4743 flags.go:64] FLAG: --system-reserved-cgroup="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574476 4743 flags.go:64] FLAG: --tls-cert-file="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574480 4743 flags.go:64] FLAG: --tls-cipher-suites="[]" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574485 4743 flags.go:64] FLAG: --tls-min-version="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574489 4743 flags.go:64] FLAG: --tls-private-key-file="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574493 4743 flags.go:64] FLAG: --topology-manager-policy="none" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574497 4743 flags.go:64] FLAG: --topology-manager-policy-options="" Jan 22 13:46:03 crc kubenswrapper[4743]: 
I0122 13:46:03.574503 4743 flags.go:64] FLAG: --topology-manager-scope="container" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574507 4743 flags.go:64] FLAG: --v="2" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574513 4743 flags.go:64] FLAG: --version="false" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574519 4743 flags.go:64] FLAG: --vmodule="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574524 4743 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574528 4743 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.574623 4743 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.574627 4743 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.574631 4743 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.574635 4743 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.574638 4743 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.574642 4743 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.574646 4743 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.574650 4743 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
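The flags.go:64 run that ends here is the kubelet echoing every command-line flag with the value it will actually use (--config=/etc/kubernetes/kubelet.conf, --node-ip=192.168.126.11, --kubeconfig=/var/lib/kubelet/kubeconfig, and so on). A small sketch that collects that dump into a dictionary so the effective settings can be inspected or diffed between boots; the file name kubelet.log and the helper name parse_flag_dump are assumptions:

```python
# Minimal sketch (not part of the captured log): collect the flags.go:64
# FLAG: --name="value" dump above into a dict of effective settings.
# The file name kubelet.log and helper name parse_flag_dump are assumptions.
import re

FLAG = re.compile(r'FLAG: (--[\w-]+)="(.*?)"')

def parse_flag_dump(path="kubelet.log"):
    flags = {}
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            for name, value in FLAG.findall(line):
                flags[name] = value
    return flags

if __name__ == "__main__":
    flags = parse_flag_dump()
    print(flags.get("--config"))    # /etc/kubernetes/kubelet.conf in this log
    print(flags.get("--node-ip"))   # 192.168.126.11 in this log
```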
Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.574655 4743 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.574659 4743 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.574663 4743 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.574667 4743 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.574671 4743 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.574674 4743 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.574678 4743 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.574682 4743 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.574687 4743 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.574691 4743 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.574694 4743 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.574697 4743 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.574701 4743 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.574704 4743 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.574708 4743 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.574711 4743 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.574716 4743 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.574720 4743 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.574726 4743 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.574729 4743 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.574733 4743 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.574737 4743 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.574741 4743 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.574745 4743 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.574749 4743 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.574753 4743 feature_gate.go:330] unrecognized feature gate: Example Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.574757 4743 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.574760 4743 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.574764 4743 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.574767 4743 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.574771 4743 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.574774 4743 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.574778 4743 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.574781 4743 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.574785 4743 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.574807 4743 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.574813 4743 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.574816 4743 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.574820 4743 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.574824 4743 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.574830 4743 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.574834 4743 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.574838 4743 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.574841 4743 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.574845 4743 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.574849 4743 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.574852 4743 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.574856 4743 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.574859 4743 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.574862 4743 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.574868 4743 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.574871 4743 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.574874 4743 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.574878 4743 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.574881 4743 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.574885 4743 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.574888 4743 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.574892 4743 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.574895 4743 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.574899 4743 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.574904 4743 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.574907 4743 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.574911 4743 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.574918 4743 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false 
UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.583478 4743 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.583516 4743 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.583653 4743 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.583667 4743 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.583677 4743 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.583685 4743 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.583694 4743 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.583703 4743 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.583713 4743 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.583721 4743 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.583729 4743 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.583737 4743 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.583745 4743 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.583752 4743 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.583760 4743 feature_gate.go:330] unrecognized feature gate: Example Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.583768 4743 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.583776 4743 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.583784 4743 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.583824 4743 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.583835 4743 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.583844 4743 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.583855 4743 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.583865 4743 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.583874 4743 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.583883 4743 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 22 13:46:03 crc 
kubenswrapper[4743]: W0122 13:46:03.583893 4743 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.583904 4743 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.583915 4743 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.583924 4743 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.583932 4743 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.583943 4743 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.583953 4743 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.583962 4743 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.583971 4743 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.583979 4743 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.583987 4743 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.583995 4743 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584002 4743 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584010 4743 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584017 4743 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584025 4743 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584033 4743 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584043 4743 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584052 4743 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584060 4743 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584069 4743 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584077 4743 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584087 4743 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584097 4743 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584105 4743 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584114 4743 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584122 4743 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584129 4743 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584137 4743 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584144 4743 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584152 4743 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584161 4743 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584169 4743 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584176 4743 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584184 4743 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584192 4743 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584201 4743 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584208 4743 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584216 4743 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584224 4743 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584231 4743 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584239 4743 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584247 4743 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584254 4743 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584261 4743 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584270 4743 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584277 4743 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584285 4743 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 
13:46:03.584298 4743 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584531 4743 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584547 4743 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584556 4743 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584565 4743 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584572 4743 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584580 4743 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584588 4743 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584598 4743 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584607 4743 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584616 4743 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584627 4743 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584637 4743 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584647 4743 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584656 4743 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584665 4743 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584673 4743 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584681 4743 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584689 4743 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584697 4743 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584704 4743 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584712 4743 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584720 4743 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 22 13:46:03 crc 
kubenswrapper[4743]: W0122 13:46:03.584730 4743 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584743 4743 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584753 4743 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584762 4743 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584769 4743 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584777 4743 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584785 4743 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584826 4743 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584836 4743 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584846 4743 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584855 4743 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584863 4743 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584871 4743 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584878 4743 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584886 4743 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584893 4743 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584904 4743 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
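Each warning pass ends with a feature_gate.go:386 summary of the form feature gates: {map[Name:bool ...]}, logged twice above (and once more further down) with the gates that actually resolved: CloudDualStackNodeIPs, DisableKubeletCloudCredentialProviders, KMSv1 and ValidatingAdmissionPolicy true, the rest false. A minimal sketch for turning that Go-style map dump into a Python dict; the SUMMARY constant is abridged from the entry above and the helper name is an assumption:

```python
# Minimal sketch (not part of the captured log): parse the Go-style
# `feature gates: {map[Name:bool ...]}` summary into a Python dict so the
# resolved gate state can be compared between restarts. SUMMARY is abridged
# from the feature_gate.go:386 entry above; the helper name is an assumption.
import re

SUMMARY = ("feature gates: {map[CloudDualStackNodeIPs:true "
           "DisableKubeletCloudCredentialProviders:true KMSv1:true "
           "NodeSwap:false ValidatingAdmissionPolicy:true "
           "VolumeAttributesClass:false]}")

def parse_gate_summary(text):
    body = re.search(r"\{map\[(.*?)\]\}", text).group(1)
    gates = {}
    for pair in body.split():
        name, _, value = pair.partition(":")
        gates[name] = value == "true"
    return gates

if __name__ == "__main__":
    print(parse_gate_summary(SUMMARY))
    # {'CloudDualStackNodeIPs': True, ..., 'VolumeAttributesClass': False}
```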
Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584912 4743 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584920 4743 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584928 4743 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584936 4743 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584945 4743 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584954 4743 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584962 4743 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584971 4743 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584979 4743 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584987 4743 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.584995 4743 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.585005 4743 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.585013 4743 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.585021 4743 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.585029 4743 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.585037 4743 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.585045 4743 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.585053 4743 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.585060 4743 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.585068 4743 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.585077 4743 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.585084 4743 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.585092 4743 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.585100 4743 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.585107 4743 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.585115 4743 feature_gate.go:330] unrecognized 
feature gate: CSIDriverSharedResource Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.585123 4743 feature_gate.go:330] unrecognized feature gate: Example Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.585130 4743 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.585138 4743 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.585145 4743 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.585153 4743 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.585161 4743 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.585173 4743 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.585779 4743 server.go:940] "Client rotation is on, will bootstrap in background" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.595439 4743 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.595582 4743 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
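With the gates settled, the kubelet reports that client certificate rotation is on, that the existing kubeconfig is still valid, and that it is loading the pair from /var/lib/kubelet/pki/kubelet-client-current.pem; the entries that follow log the certificate's expiration, the chosen rotation deadline, and a first CSR POST that fails with connection refused, presumably because the API server at api-int.crc.testing:6443 is not yet reachable this early in boot. A minimal sketch that extracts the expiration and rotation deadline from those certificate_manager entries and reports the remaining margin; the path kubelet.log and the helper name rotation_margin are assumptions:

```python
# Minimal sketch (not part of the captured log): read the expiry and rotation
# deadline that certificate_manager.go logs and report how much time is left.
# The journal is assumed to be saved locally as kubelet.log.
import re
from datetime import datetime, timezone

PATTERN = re.compile(
    r"Certificate expiration is (?P<exp>[\d-]+ [\d:]+) \+0000 UTC, "
    r"rotation deadline is (?P<dl>[\d-]+ [\d:.]+) \+0000 UTC"
)

def _parse(ts):
    # Drop sub-second digits so strptime can handle the timestamp.
    return datetime.strptime(ts.split(".")[0], "%Y-%m-%d %H:%M:%S").replace(
        tzinfo=timezone.utc
    )

def rotation_margin(path="kubelet.log"):
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            m = PATTERN.search(line)
            if m:
                expiry, deadline = _parse(m["exp"]), _parse(m["dl"])
                return expiry, deadline, deadline - datetime.now(timezone.utc)
    return None

if __name__ == "__main__":
    print(rotation_margin())
```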
Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.596392 4743 server.go:997] "Starting client certificate rotation" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.596427 4743 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.596896 4743 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-23 06:37:55.546421604 +0000 UTC Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.597033 4743 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.602834 4743 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.605301 4743 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 22 13:46:03 crc kubenswrapper[4743]: E0122 13:46:03.605280 4743 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.53:6443: connect: connection refused" logger="UnhandledError" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.615832 4743 log.go:25] "Validated CRI v1 runtime API" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.636890 4743 log.go:25] "Validated CRI v1 image API" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.639741 4743 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.643637 4743 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-01-22-13-41-21-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.643683 4743 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.675588 4743 manager.go:217] Machine: {Timestamp:2026-01-22 13:46:03.673473195 +0000 UTC m=+0.228516408 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:3b0111c7-d257-4412-a2b4-67f25f05b313 BootID:ecca3303-8671-4d9d-9703-9ba07a2ce734 Filesystems:[{Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 
Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:6e:f9:08 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:6e:f9:08 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:96:d2:c7 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:2a:6b:53 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:e6:aa:39 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:8e:7f:f9 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:16:21:29:de:5e:25 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:c2:50:9b:de:dc:1b Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 
Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.676032 4743 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.676405 4743 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.677673 4743 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.678065 4743 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.678121 4743 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.678471 4743 topology_manager.go:138] "Creating topology manager with none policy" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.678489 4743 
container_manager_linux.go:303] "Creating device plugin manager" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.678874 4743 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.678928 4743 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.679189 4743 state_mem.go:36] "Initialized new in-memory state store" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.679333 4743 server.go:1245] "Using root directory" path="/var/lib/kubelet" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.680518 4743 kubelet.go:418] "Attempting to sync node with API server" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.680573 4743 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.680626 4743 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.680658 4743 kubelet.go:324] "Adding apiserver pod source" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.680684 4743 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.682605 4743 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.53:6443: connect: connection refused Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.682605 4743 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.53:6443: connect: connection refused Jan 22 13:46:03 crc kubenswrapper[4743]: E0122 13:46:03.682701 4743 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.53:6443: connect: connection refused" logger="UnhandledError" Jan 22 13:46:03 crc kubenswrapper[4743]: E0122 13:46:03.682712 4743 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.53:6443: connect: connection refused" logger="UnhandledError" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.683179 4743 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.683517 4743 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
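
The certificate signing request at 13:46:03.605 and the reflector list calls for *v1.Service and *v1.Node above all fail the same way: dial tcp 38.102.83.53:6443: connect: connection refused against https://api-int.crc.testing:6443. On a single-node setup like this, where the API server itself runs from the static pod manifests the kubelet is only now starting to watch, that is expected this early in boot and normally clears once the control-plane pods come up. A minimal sketch for watching the endpoint until it starts accepting TCP connections, with the host and port taken from the log and the timeout/interval values chosen arbitrarily:

```python
#!/usr/bin/env python3
"""Wait for api-int.crc.testing:6443 to accept TCP connections.

Minimal sketch using only the standard library; host/port come from the log
above, the timeout and poll interval are arbitrary choices.
"""
import socket
import time

HOST, PORT = "api-int.crc.testing", 6443

def wait_for_apiserver(timeout_s: float = 300.0, interval_s: float = 5.0) -> bool:
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        try:
            # Only proves the TCP port is open, not that the API is healthy.
            with socket.create_connection((HOST, PORT), timeout=3):
                return True
        except OSError as exc:
            print(f"still waiting: {exc}")
            time.sleep(interval_s)
    return False

if __name__ == "__main__":
    print("reachable" if wait_for_apiserver() else "gave up")
```

A successful connect only means the port is open; a 200 from https://api-int.crc.testing:6443/healthz (for example via `oc get --raw /healthz`) is the stronger readiness signal.
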
Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.684271 4743 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.684704 4743 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.684725 4743 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.684732 4743 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.684739 4743 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.684750 4743 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.684758 4743 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.684765 4743 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.684775 4743 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.684783 4743 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.684802 4743 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.684812 4743 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.684819 4743 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.685122 4743 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.685470 4743 server.go:1280] "Started kubelet" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.685928 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.53:6443: connect: connection refused Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.685981 4743 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.685968 4743 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.686775 4743 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 22 13:46:03 crc systemd[1]: Started Kubernetes Kubelet. 
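
With "Started kubelet" logged, the main server listening on 0.0.0.0:10250 and the podresources API serving on /var/lib/kubelet/pod-resources/kubelet.sock, the quickest local liveness check is the kubelet's healthz listener. A minimal sketch, assuming the default read-only healthz endpoint on 127.0.0.1:10248 has not been disabled; the authenticated HTTPS port 10250 shown in the log is deliberately not used here:

```python
#!/usr/bin/env python3
"""Check the kubelet's local healthz endpoint.

Minimal sketch: assumes the default healthz listener (127.0.0.1:10248) is
enabled; the authenticated HTTPS API on port 10250 is not touched.
"""
import urllib.error
import urllib.request

def kubelet_healthy(url: str = "http://127.0.0.1:10248/healthz") -> bool:
    try:
        with urllib.request.urlopen(url, timeout=3) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False

if __name__ == "__main__":
    print("kubelet healthz:", "ok" if kubelet_healthy() else "not responding")
```
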
Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.689228 4743 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.689262 4743 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.689463 4743 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 11:26:35.950650228 +0000 UTC Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.689698 4743 volume_manager.go:287] "The desired_state_of_world populator starts" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.689710 4743 volume_manager.go:289] "Starting Kubelet Volume Manager" Jan 22 13:46:03 crc kubenswrapper[4743]: E0122 13:46:03.689878 4743 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.689899 4743 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.690874 4743 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.53:6443: connect: connection refused Jan 22 13:46:03 crc kubenswrapper[4743]: E0122 13:46:03.690967 4743 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.53:6443: connect: connection refused" logger="UnhandledError" Jan 22 13:46:03 crc kubenswrapper[4743]: E0122 13:46:03.691244 4743 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.53:6443: connect: connection refused" interval="200ms" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.691327 4743 server.go:460] "Adding debug handlers to kubelet server" Jan 22 13:46:03 crc kubenswrapper[4743]: E0122 13:46:03.691193 4743 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.53:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.188d1195cb524b5c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-22 13:46:03.685440348 +0000 UTC m=+0.240483511,LastTimestamp:2026-01-22 13:46:03.685440348 +0000 UTC m=+0.240483511,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.695063 4743 factory.go:55] Registering systemd factory Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.695101 4743 factory.go:221] Registration of the systemd container factory successfully Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.696631 4743 factory.go:153] Registering CRI-O factory Jan 22 13:46:03 crc kubenswrapper[4743]: 
I0122 13:46:03.696686 4743 factory.go:221] Registration of the crio container factory successfully Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.696840 4743 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.696890 4743 factory.go:103] Registering Raw factory Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.696922 4743 manager.go:1196] Started watching for new ooms in manager Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.703609 4743 manager.go:319] Starting recovery of all containers Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.708675 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.708862 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.708892 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.708909 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.708928 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.708945 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.708961 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.708982 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.709009 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" 
volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.709030 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.709052 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.709071 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.709090 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.709118 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.709229 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.709248 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.709271 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.709290 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.709310 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.709330 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" 
volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.709353 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.709400 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.709463 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.709483 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.709506 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.709528 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.709580 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.709603 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.709626 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.709647 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.709707 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.709728 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.709877 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.709909 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.709931 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.709953 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.709974 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.709995 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.710010 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.710028 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.710043 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.710058 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.710074 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.710103 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.710119 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.710137 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.710155 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.710177 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.710195 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.710217 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.710235 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.710251 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.710305 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.710336 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.710365 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.710392 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.710409 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.710425 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.710469 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.710483 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.710499 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.710536 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.710551 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.710566 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.710582 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.710596 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.710610 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.710626 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.710647 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.710670 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.710684 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.710701 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.710714 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.710728 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.710769 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.710808 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.710824 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.710842 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.710859 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.710877 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.710914 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.710939 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.710956 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.710971 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.710996 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.711011 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.711025 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.711038 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.711052 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.711066 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.711082 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.711095 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.711108 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.711167 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.711183 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.711198 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.711214 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.711230 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.711246 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.711270 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.711284 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.711300 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.711316 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.711330 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.711351 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.711369 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.711393 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.711408 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" 
volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.711424 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.711438 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.711453 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.711472 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.711489 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.711504 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.711519 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.711534 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.711557 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.711577 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.711597 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.711612 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.711628 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.711650 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.711666 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.711686 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.711703 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.711716 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.711731 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.711747 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.711761 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.711775 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.711811 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.711827 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.711845 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.711862 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.711882 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.711897 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.711912 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.711927 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.711941 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.711954 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.711968 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" 
volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.711986 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.712007 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.712850 4743 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.712900 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.712919 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.712936 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.712951 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.712964 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.712980 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.712994 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.713010 4743 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.713025 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.713040 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.713055 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.713074 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.713094 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.713125 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.713145 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.713171 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.713192 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.713207 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.713221 4743 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.713236 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.713254 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.713272 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.713302 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.713330 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.713345 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.713359 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.713371 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.713384 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.713399 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.713413 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.713429 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.713450 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.713464 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.713479 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.713565 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.713588 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.713604 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.713625 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.713647 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.713664 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.713683 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.713698 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.713713 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.713740 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.713756 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.713772 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.713806 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.713826 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.713845 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.713861 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.713891 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.713906 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" 
seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.713929 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.713949 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.713967 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.713983 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.713997 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.714009 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.714022 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.714037 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.714066 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.714084 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.714105 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Jan 22 13:46:03 crc 
kubenswrapper[4743]: I0122 13:46:03.714117 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.714132 4743 reconstruct.go:97] "Volume reconstruction finished" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.714154 4743 reconciler.go:26] "Reconciler: start to sync state" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.737382 4743 manager.go:324] Recovery completed Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.741411 4743 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.745642 4743 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.745807 4743 status_manager.go:217] "Starting to sync pod status with apiserver" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.745917 4743 kubelet.go:2335] "Starting kubelet main sync loop" Jan 22 13:46:03 crc kubenswrapper[4743]: E0122 13:46:03.746037 4743 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.745846 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 13:46:03 crc kubenswrapper[4743]: W0122 13:46:03.746644 4743 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.53:6443: connect: connection refused Jan 22 13:46:03 crc kubenswrapper[4743]: E0122 13:46:03.746708 4743 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.53:6443: connect: connection refused" logger="UnhandledError" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.748496 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.748581 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.748613 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.749375 4743 cpu_manager.go:225] "Starting CPU manager" policy="none" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.749462 4743 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.749533 4743 state_mem.go:36] "Initialized new in-memory state store" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.761863 4743 policy_none.go:49] "None policy: Start" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.762817 4743 memory_manager.go:170] "Starting memorymanager" policy="None" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.762845 4743 state_mem.go:35] "Initializing new 
in-memory state store" Jan 22 13:46:03 crc kubenswrapper[4743]: E0122 13:46:03.790539 4743 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.824389 4743 manager.go:334] "Starting Device Plugin manager" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.824467 4743 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.824481 4743 server.go:79] "Starting device plugin registration server" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.824915 4743 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.824928 4743 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.825094 4743 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.825277 4743 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.825295 4743 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 22 13:46:03 crc kubenswrapper[4743]: E0122 13:46:03.834060 4743 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.846213 4743 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc"] Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.846335 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.847325 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.847479 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.847580 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.847950 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.848378 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.848502 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.849299 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.849444 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.849533 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.849827 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.849932 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.849984 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.849864 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.850118 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.850145 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.850841 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.850891 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.850904 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.851346 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.851376 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.851387 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.851539 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.851871 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.851936 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.852562 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.852593 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.852607 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.852719 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.852849 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.852870 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.852901 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.852909 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.852912 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.853425 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.853446 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.853456 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.853519 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.853539 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.853550 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.853752 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.853779 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.854642 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.854661 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.854671 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 13:46:03 crc kubenswrapper[4743]: E0122 13:46:03.892018 4743 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.53:6443: connect: connection refused" interval="400ms" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.916447 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.916490 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.916515 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.916539 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.916569 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.916598 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.916652 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.916691 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.916727 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.916760 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.916816 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.916845 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.916875 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.916924 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.916959 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.925009 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.926535 4743 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.926578 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.926596 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 13:46:03 crc kubenswrapper[4743]: I0122 13:46:03.926625 4743 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 22 13:46:03 crc kubenswrapper[4743]: E0122 13:46:03.927117 4743 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.53:6443: connect: connection refused" node="crc" Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.018106 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.018220 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.018300 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.018352 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.018404 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.018448 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.018479 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.018528 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" 
(UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.018496 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.018567 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.018583 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.018558 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.018547 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.018648 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.018648 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.018703 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.018826 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.018904 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.018909 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.018949 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.018995 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.019010 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.019026 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.019051 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.019055 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.019083 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.019086 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 22 13:46:04 crc 
kubenswrapper[4743]: I0122 13:46:04.019097 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.019121 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.019188 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.127405 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.129352 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.129409 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.129433 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.129473 4743 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 22 13:46:04 crc kubenswrapper[4743]: E0122 13:46:04.130001 4743 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.53:6443: connect: connection refused" node="crc" Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.178561 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.204511 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 22 13:46:04 crc kubenswrapper[4743]: W0122 13:46:04.205064 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-07b57bb17895b5b3255c99ef1d3e7650e1d3c6abec8bb2c52ac752673903ae7d WatchSource:0}: Error finding container 07b57bb17895b5b3255c99ef1d3e7650e1d3c6abec8bb2c52ac752673903ae7d: Status 404 returned error can't find the container with id 07b57bb17895b5b3255c99ef1d3e7650e1d3c6abec8bb2c52ac752673903ae7d Jan 22 13:46:04 crc kubenswrapper[4743]: W0122 13:46:04.222504 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-847501ce3eec2513de37369884dc910cee6ef03fa629db24d76acffab85b064b WatchSource:0}: Error finding container 847501ce3eec2513de37369884dc910cee6ef03fa629db24d76acffab85b064b: Status 404 returned error can't find the container with id 847501ce3eec2513de37369884dc910cee6ef03fa629db24d76acffab85b064b Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.222870 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.234780 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 22 13:46:04 crc kubenswrapper[4743]: W0122 13:46:04.240462 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-152832f38d5ee5bdd2609272d97d8f75b3ba8b29e58259855a0f5cc6cee4fb5b WatchSource:0}: Error finding container 152832f38d5ee5bdd2609272d97d8f75b3ba8b29e58259855a0f5cc6cee4fb5b: Status 404 returned error can't find the container with id 152832f38d5ee5bdd2609272d97d8f75b3ba8b29e58259855a0f5cc6cee4fb5b Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.241615 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Jan 22 13:46:04 crc kubenswrapper[4743]: W0122 13:46:04.253651 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-f5260eda3232fca8e9732de98252131ee02a3fba60d6006e4ce0820e7600209b WatchSource:0}: Error finding container f5260eda3232fca8e9732de98252131ee02a3fba60d6006e4ce0820e7600209b: Status 404 returned error can't find the container with id f5260eda3232fca8e9732de98252131ee02a3fba60d6006e4ce0820e7600209b Jan 22 13:46:04 crc kubenswrapper[4743]: W0122 13:46:04.254805 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-e6fdc01b01121041c441d0ddf358d24798c86e9700fabdd8b6bc6e66c29792b2 WatchSource:0}: Error finding container e6fdc01b01121041c441d0ddf358d24798c86e9700fabdd8b6bc6e66c29792b2: Status 404 returned error can't find the container with id e6fdc01b01121041c441d0ddf358d24798c86e9700fabdd8b6bc6e66c29792b2 Jan 22 13:46:04 crc kubenswrapper[4743]: E0122 13:46:04.293422 4743 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.53:6443: connect: connection refused" interval="800ms" Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.530200 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.533232 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.533263 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.533275 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.533297 4743 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 22 13:46:04 crc kubenswrapper[4743]: E0122 13:46:04.533720 4743 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.53:6443: connect: connection refused" node="crc" Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.687193 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.53:6443: connect: connection refused Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.690395 4743 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 16:53:14.149446479 +0000 UTC Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.753180 4743 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="21282f35b48eb852e12a2c8786e51e562481376177b16a42cbfd0d340db9e4a5" exitCode=0 Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.753245 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"21282f35b48eb852e12a2c8786e51e562481376177b16a42cbfd0d340db9e4a5"} Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.753313 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e6fdc01b01121041c441d0ddf358d24798c86e9700fabdd8b6bc6e66c29792b2"} Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.753409 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.754400 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.754436 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.754447 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.755011 4743 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="f1bebb26d5eb8dfff0c607c58ea316e39f1ad83ac56bb0c76f5d80e164071adb" exitCode=0 Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.755169 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"f1bebb26d5eb8dfff0c607c58ea316e39f1ad83ac56bb0c76f5d80e164071adb"} Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.755297 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"f5260eda3232fca8e9732de98252131ee02a3fba60d6006e4ce0820e7600209b"} Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.755473 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.756569 4743 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="c4047dd206a5ee6351b58710ff5d5182a54cac6ff7f085e844af7d8be10af701" exitCode=0 Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.756642 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"c4047dd206a5ee6351b58710ff5d5182a54cac6ff7f085e844af7d8be10af701"} Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.756672 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"152832f38d5ee5bdd2609272d97d8f75b3ba8b29e58259855a0f5cc6cee4fb5b"} Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.756796 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.756817 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.756840 4743 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.756851 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.757573 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.757593 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.757605 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.758502 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"bfd37a286a817f44e77fd221d3f495b76049d2cddeff485afe9290b6f9c4c859"} Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.758541 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"847501ce3eec2513de37369884dc910cee6ef03fa629db24d76acffab85b064b"} Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.760369 4743 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9b518be682055721b8f28ad1a53f3ab76ea8e2b5385525d395669facbb2e6322" exitCode=0 Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.760392 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"9b518be682055721b8f28ad1a53f3ab76ea8e2b5385525d395669facbb2e6322"} Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.760422 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"07b57bb17895b5b3255c99ef1d3e7650e1d3c6abec8bb2c52ac752673903ae7d"} Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.760515 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.761207 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.761241 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.761251 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.762922 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.763959 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.763995 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 13:46:04 crc kubenswrapper[4743]: I0122 13:46:04.764022 4743 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 13:46:04 crc kubenswrapper[4743]: W0122 13:46:04.776885 4743 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.53:6443: connect: connection refused Jan 22 13:46:04 crc kubenswrapper[4743]: E0122 13:46:04.777008 4743 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.53:6443: connect: connection refused" logger="UnhandledError" Jan 22 13:46:04 crc kubenswrapper[4743]: W0122 13:46:04.815306 4743 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.53:6443: connect: connection refused Jan 22 13:46:04 crc kubenswrapper[4743]: E0122 13:46:04.815416 4743 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.53:6443: connect: connection refused" logger="UnhandledError" Jan 22 13:46:05 crc kubenswrapper[4743]: E0122 13:46:05.094104 4743 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.53:6443: connect: connection refused" interval="1.6s" Jan 22 13:46:05 crc kubenswrapper[4743]: W0122 13:46:05.161387 4743 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.53:6443: connect: connection refused Jan 22 13:46:05 crc kubenswrapper[4743]: E0122 13:46:05.161486 4743 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.53:6443: connect: connection refused" logger="UnhandledError" Jan 22 13:46:05 crc kubenswrapper[4743]: W0122 13:46:05.250429 4743 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.53:6443: connect: connection refused Jan 22 13:46:05 crc kubenswrapper[4743]: E0122 13:46:05.250546 4743 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.53:6443: connect: connection refused" logger="UnhandledError" Jan 22 13:46:05 crc kubenswrapper[4743]: I0122 13:46:05.333831 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 13:46:05 crc kubenswrapper[4743]: 
I0122 13:46:05.335044 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 13:46:05 crc kubenswrapper[4743]: I0122 13:46:05.335093 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 13:46:05 crc kubenswrapper[4743]: I0122 13:46:05.335105 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 13:46:05 crc kubenswrapper[4743]: I0122 13:46:05.335130 4743 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 22 13:46:05 crc kubenswrapper[4743]: E0122 13:46:05.335461 4743 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.53:6443: connect: connection refused" node="crc" Jan 22 13:46:05 crc kubenswrapper[4743]: I0122 13:46:05.626431 4743 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 22 13:46:05 crc kubenswrapper[4743]: E0122 13:46:05.627529 4743 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.53:6443: connect: connection refused" logger="UnhandledError" Jan 22 13:46:05 crc kubenswrapper[4743]: I0122 13:46:05.690608 4743 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 23:07:35.243771706 +0000 UTC Jan 22 13:46:05 crc kubenswrapper[4743]: I0122 13:46:05.764931 4743 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="02fb36a791debc8bc993df8c0daad75cb0fb25348cdf068af5f8c5a9bec1eef7" exitCode=0 Jan 22 13:46:05 crc kubenswrapper[4743]: I0122 13:46:05.765111 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"02fb36a791debc8bc993df8c0daad75cb0fb25348cdf068af5f8c5a9bec1eef7"} Jan 22 13:46:05 crc kubenswrapper[4743]: I0122 13:46:05.765327 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 13:46:05 crc kubenswrapper[4743]: I0122 13:46:05.767037 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 13:46:05 crc kubenswrapper[4743]: I0122 13:46:05.767075 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 13:46:05 crc kubenswrapper[4743]: I0122 13:46:05.767088 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 13:46:05 crc kubenswrapper[4743]: I0122 13:46:05.776097 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"c438d85cca4c99132ec448e135fa0d48ee945e7952e342a0f6ee17d50e4efdd4"} Jan 22 13:46:05 crc kubenswrapper[4743]: I0122 13:46:05.776307 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 13:46:05 crc kubenswrapper[4743]: I0122 13:46:05.777452 4743 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 13:46:05 crc kubenswrapper[4743]: I0122 13:46:05.777497 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 13:46:05 crc kubenswrapper[4743]: I0122 13:46:05.777507 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 13:46:05 crc kubenswrapper[4743]: I0122 13:46:05.780119 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"dd02527ac0588590e67aff1d82fb4a5c30d78673e01edccc78d87716a073edaf"} Jan 22 13:46:05 crc kubenswrapper[4743]: I0122 13:46:05.780163 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"87c5d227109688a5b33886167617d2ace33841850edbeee880d08760a37c41c8"} Jan 22 13:46:05 crc kubenswrapper[4743]: I0122 13:46:05.780181 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"5cff01609fe1a341495cbe0ba5953a11790c42f7cce482e181ddf6dfece7c5f6"} Jan 22 13:46:05 crc kubenswrapper[4743]: I0122 13:46:05.780309 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 13:46:05 crc kubenswrapper[4743]: I0122 13:46:05.782201 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 13:46:05 crc kubenswrapper[4743]: I0122 13:46:05.782247 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 13:46:05 crc kubenswrapper[4743]: I0122 13:46:05.782263 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 13:46:05 crc kubenswrapper[4743]: I0122 13:46:05.783882 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c3612be276f1dcc5a019acd4265a843ac2b7beadf2dc429f185cbe0a788e54ba"} Jan 22 13:46:05 crc kubenswrapper[4743]: I0122 13:46:05.783946 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"bb73770703be3096013ed7bfbe25b17e02bdeb82712726422baae59f3951219e"} Jan 22 13:46:05 crc kubenswrapper[4743]: I0122 13:46:05.783963 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"cc186d76b26731ec86bb1f941ea1366a0cca80de6a6a5a5aa3b7e22c746ab22d"} Jan 22 13:46:05 crc kubenswrapper[4743]: I0122 13:46:05.784001 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 13:46:05 crc kubenswrapper[4743]: I0122 13:46:05.786497 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 13:46:05 crc kubenswrapper[4743]: I0122 13:46:05.786526 4743 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 13:46:05 crc kubenswrapper[4743]: I0122 13:46:05.786536 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 13:46:05 crc kubenswrapper[4743]: I0122 13:46:05.790647 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"dbf3b047cd21b7972e3e78bc51c2907693d461a6825dec6dac248b62200e63b9"} Jan 22 13:46:05 crc kubenswrapper[4743]: I0122 13:46:05.790690 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3307aad48db4a49584f0fc9767fd8852d5902aa05bb0c6e3b58eee2bc9b7ab3d"} Jan 22 13:46:05 crc kubenswrapper[4743]: I0122 13:46:05.790705 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"70f164abc35c1a84289433b80aeba8eb6edfd0a0a4d2801a0336c01a1fd2ba5e"} Jan 22 13:46:05 crc kubenswrapper[4743]: I0122 13:46:05.790715 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"79f0fd53d4d043cada398136d26691a12b62cc2765bef4f42db7af827bafaa9b"} Jan 22 13:46:05 crc kubenswrapper[4743]: I0122 13:46:05.790725 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c3d207c8f7c47106e794aa561ec19fd9bba5216c729f0ee0211f124b368c4e9a"} Jan 22 13:46:05 crc kubenswrapper[4743]: I0122 13:46:05.790873 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 13:46:05 crc kubenswrapper[4743]: I0122 13:46:05.791649 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 13:46:05 crc kubenswrapper[4743]: I0122 13:46:05.791676 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 13:46:05 crc kubenswrapper[4743]: I0122 13:46:05.791687 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 13:46:06 crc kubenswrapper[4743]: I0122 13:46:06.690965 4743 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 20:17:18.576895347 +0000 UTC Jan 22 13:46:06 crc kubenswrapper[4743]: I0122 13:46:06.788415 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 13:46:06 crc kubenswrapper[4743]: I0122 13:46:06.796922 4743 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="12b33e5f53a9218f549b80f32790b34fef4f8e66648feb89ce8187b86c9ab06b" exitCode=0 Jan 22 13:46:06 crc kubenswrapper[4743]: I0122 13:46:06.797095 4743 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 22 13:46:06 crc kubenswrapper[4743]: I0122 13:46:06.797147 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"12b33e5f53a9218f549b80f32790b34fef4f8e66648feb89ce8187b86c9ab06b"} Jan 22 13:46:06 crc kubenswrapper[4743]: I0122 13:46:06.797175 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 13:46:06 crc kubenswrapper[4743]: I0122 13:46:06.797414 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 13:46:06 crc kubenswrapper[4743]: I0122 13:46:06.797112 4743 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 22 13:46:06 crc kubenswrapper[4743]: I0122 13:46:06.798064 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 13:46:06 crc kubenswrapper[4743]: I0122 13:46:06.797175 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 13:46:06 crc kubenswrapper[4743]: I0122 13:46:06.798658 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 13:46:06 crc kubenswrapper[4743]: I0122 13:46:06.798738 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 13:46:06 crc kubenswrapper[4743]: I0122 13:46:06.798756 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 13:46:06 crc kubenswrapper[4743]: I0122 13:46:06.799023 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 13:46:06 crc kubenswrapper[4743]: I0122 13:46:06.799084 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 13:46:06 crc kubenswrapper[4743]: I0122 13:46:06.799106 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 13:46:06 crc kubenswrapper[4743]: I0122 13:46:06.799623 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 13:46:06 crc kubenswrapper[4743]: I0122 13:46:06.799671 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 13:46:06 crc kubenswrapper[4743]: I0122 13:46:06.799688 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 13:46:06 crc kubenswrapper[4743]: I0122 13:46:06.799626 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 13:46:06 crc kubenswrapper[4743]: I0122 13:46:06.799819 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 13:46:06 crc kubenswrapper[4743]: I0122 13:46:06.799848 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 13:46:06 crc kubenswrapper[4743]: I0122 13:46:06.936506 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 13:46:06 crc kubenswrapper[4743]: I0122 13:46:06.938270 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 13:46:06 crc kubenswrapper[4743]: I0122 13:46:06.938327 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 13:46:06 crc 
kubenswrapper[4743]: I0122 13:46:06.938339 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 13:46:06 crc kubenswrapper[4743]: I0122 13:46:06.938376 4743 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 22 13:46:07 crc kubenswrapper[4743]: I0122 13:46:07.210169 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 22 13:46:07 crc kubenswrapper[4743]: I0122 13:46:07.220861 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 13:46:07 crc kubenswrapper[4743]: I0122 13:46:07.691729 4743 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 01:16:06.369940556 +0000 UTC Jan 22 13:46:07 crc kubenswrapper[4743]: I0122 13:46:07.807297 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"937d8ad783551406f7ea633f5674dad74ff37bf463270d4128321c3903cd8b5b"} Jan 22 13:46:07 crc kubenswrapper[4743]: I0122 13:46:07.807372 4743 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 22 13:46:07 crc kubenswrapper[4743]: I0122 13:46:07.807383 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"88119dc01795449443e5f546588121c4dbdb490e70978826e5f8ddda5074a20e"} Jan 22 13:46:07 crc kubenswrapper[4743]: I0122 13:46:07.807421 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 13:46:07 crc kubenswrapper[4743]: I0122 13:46:07.807422 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"7e1d8c70330031c1a345041dde73531d63baedce80d63dcc5be103c1d2c21637"} Jan 22 13:46:07 crc kubenswrapper[4743]: I0122 13:46:07.807448 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"7db9037936b95d31484662424c28c2ed3982aef3203bb0c38f8783fe1186dc8e"} Jan 22 13:46:07 crc kubenswrapper[4743]: I0122 13:46:07.807469 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 13:46:07 crc kubenswrapper[4743]: I0122 13:46:07.808927 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 13:46:07 crc kubenswrapper[4743]: I0122 13:46:07.808995 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 13:46:07 crc kubenswrapper[4743]: I0122 13:46:07.809019 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 13:46:07 crc kubenswrapper[4743]: I0122 13:46:07.809036 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 13:46:07 crc kubenswrapper[4743]: I0122 13:46:07.809099 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 13:46:07 crc kubenswrapper[4743]: I0122 13:46:07.809121 4743 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 13:46:08 crc kubenswrapper[4743]: I0122 13:46:08.403435 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 22 13:46:08 crc kubenswrapper[4743]: I0122 13:46:08.403697 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 13:46:08 crc kubenswrapper[4743]: I0122 13:46:08.405214 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 13:46:08 crc kubenswrapper[4743]: I0122 13:46:08.405260 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 13:46:08 crc kubenswrapper[4743]: I0122 13:46:08.405271 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 13:46:08 crc kubenswrapper[4743]: I0122 13:46:08.691914 4743 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 08:59:00.554548586 +0000 UTC Jan 22 13:46:08 crc kubenswrapper[4743]: I0122 13:46:08.818022 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b3d73e53afe1e8cb67ac32cbde4d8c56446ad2abadd6bb74fe011f50e2894665"} Jan 22 13:46:08 crc kubenswrapper[4743]: I0122 13:46:08.818312 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 13:46:08 crc kubenswrapper[4743]: I0122 13:46:08.819851 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 13:46:08 crc kubenswrapper[4743]: I0122 13:46:08.819928 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 13:46:08 crc kubenswrapper[4743]: I0122 13:46:08.819958 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 13:46:09 crc kubenswrapper[4743]: I0122 13:46:09.687608 4743 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 22 13:46:09 crc kubenswrapper[4743]: I0122 13:46:09.692726 4743 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 16:59:59.586835596 +0000 UTC Jan 22 13:46:09 crc kubenswrapper[4743]: I0122 13:46:09.821115 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 13:46:09 crc kubenswrapper[4743]: I0122 13:46:09.822478 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 13:46:09 crc kubenswrapper[4743]: I0122 13:46:09.822563 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 13:46:09 crc kubenswrapper[4743]: I0122 13:46:09.822586 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 13:46:10 crc kubenswrapper[4743]: I0122 13:46:10.106903 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Jan 22 13:46:10 crc kubenswrapper[4743]: I0122 13:46:10.211342 4743 patch_prober.go:28] interesting pod/kube-controller-manager-crc 
container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 22 13:46:10 crc kubenswrapper[4743]: I0122 13:46:10.211466 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 22 13:46:10 crc kubenswrapper[4743]: I0122 13:46:10.693115 4743 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 19:45:01.564839303 +0000 UTC Jan 22 13:46:10 crc kubenswrapper[4743]: I0122 13:46:10.823870 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 13:46:10 crc kubenswrapper[4743]: I0122 13:46:10.825343 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 13:46:10 crc kubenswrapper[4743]: I0122 13:46:10.825406 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 13:46:10 crc kubenswrapper[4743]: I0122 13:46:10.825452 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 13:46:11 crc kubenswrapper[4743]: I0122 13:46:11.693946 4743 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 18:15:21.225574423 +0000 UTC Jan 22 13:46:12 crc kubenswrapper[4743]: I0122 13:46:12.026508 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 22 13:46:12 crc kubenswrapper[4743]: I0122 13:46:12.026716 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 13:46:12 crc kubenswrapper[4743]: I0122 13:46:12.028054 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 13:46:12 crc kubenswrapper[4743]: I0122 13:46:12.028125 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 13:46:12 crc kubenswrapper[4743]: I0122 13:46:12.028143 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 13:46:12 crc kubenswrapper[4743]: I0122 13:46:12.694172 4743 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 20:51:43.525466582 +0000 UTC Jan 22 13:46:13 crc kubenswrapper[4743]: I0122 13:46:13.694699 4743 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 15:18:23.677034331 +0000 UTC Jan 22 13:46:13 crc kubenswrapper[4743]: E0122 13:46:13.834340 4743 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Jan 22 13:46:14 crc kubenswrapper[4743]: I0122 13:46:14.132897 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 13:46:14 crc kubenswrapper[4743]: I0122 13:46:14.133108 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 13:46:14 crc kubenswrapper[4743]: I0122 13:46:14.134462 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 13:46:14 crc kubenswrapper[4743]: I0122 13:46:14.134514 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 13:46:14 crc kubenswrapper[4743]: I0122 13:46:14.134525 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 13:46:14 crc kubenswrapper[4743]: I0122 13:46:14.695615 4743 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 20:55:02.89995122 +0000 UTC Jan 22 13:46:15 crc kubenswrapper[4743]: I0122 13:46:15.115368 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 22 13:46:15 crc kubenswrapper[4743]: I0122 13:46:15.115959 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 13:46:15 crc kubenswrapper[4743]: I0122 13:46:15.118144 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 13:46:15 crc kubenswrapper[4743]: I0122 13:46:15.118208 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 13:46:15 crc kubenswrapper[4743]: I0122 13:46:15.118224 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 13:46:15 crc kubenswrapper[4743]: I0122 13:46:15.125680 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 22 13:46:15 crc kubenswrapper[4743]: I0122 13:46:15.161766 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 22 13:46:15 crc kubenswrapper[4743]: I0122 13:46:15.688104 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Jan 22 13:46:15 crc kubenswrapper[4743]: I0122 13:46:15.696645 4743 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 07:29:50.527667732 +0000 UTC Jan 22 13:46:15 crc kubenswrapper[4743]: I0122 13:46:15.836680 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 13:46:15 crc kubenswrapper[4743]: I0122 13:46:15.837713 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 13:46:15 crc kubenswrapper[4743]: I0122 13:46:15.837753 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 13:46:15 crc kubenswrapper[4743]: I0122 13:46:15.837766 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 13:46:15 crc kubenswrapper[4743]: I0122 13:46:15.840363 4743 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 22 13:46:15 crc kubenswrapper[4743]: I0122 13:46:15.969510 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Jan 22 13:46:15 crc kubenswrapper[4743]: I0122 13:46:15.969849 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 13:46:15 crc kubenswrapper[4743]: I0122 13:46:15.971539 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 13:46:15 crc kubenswrapper[4743]: I0122 13:46:15.971593 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 13:46:15 crc kubenswrapper[4743]: I0122 13:46:15.971616 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 13:46:16 crc kubenswrapper[4743]: I0122 13:46:16.148460 4743 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Jan 22 13:46:16 crc kubenswrapper[4743]: I0122 13:46:16.148538 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Jan 22 13:46:16 crc kubenswrapper[4743]: I0122 13:46:16.156953 4743 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Jan 22 13:46:16 crc kubenswrapper[4743]: I0122 13:46:16.157099 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Jan 22 13:46:17 crc kubenswrapper[4743]: I0122 13:46:17.078629 4743 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 22:34:28.280835862 +0000 UTC Jan 22 13:46:17 crc kubenswrapper[4743]: I0122 13:46:17.080908 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 13:46:17 crc kubenswrapper[4743]: I0122 13:46:17.082125 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 13:46:17 crc kubenswrapper[4743]: I0122 13:46:17.082174 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 13:46:17 crc kubenswrapper[4743]: I0122 13:46:17.082193 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 13:46:17 crc kubenswrapper[4743]: I0122 13:46:17.231993 4743 patch_prober.go:28] 
interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Jan 22 13:46:17 crc kubenswrapper[4743]: [+]log ok Jan 22 13:46:17 crc kubenswrapper[4743]: [+]etcd ok Jan 22 13:46:17 crc kubenswrapper[4743]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Jan 22 13:46:17 crc kubenswrapper[4743]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Jan 22 13:46:17 crc kubenswrapper[4743]: [+]poststarthook/start-apiserver-admission-initializer ok Jan 22 13:46:17 crc kubenswrapper[4743]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Jan 22 13:46:17 crc kubenswrapper[4743]: [+]poststarthook/openshift.io-api-request-count-filter ok Jan 22 13:46:17 crc kubenswrapper[4743]: [+]poststarthook/openshift.io-startkubeinformers ok Jan 22 13:46:17 crc kubenswrapper[4743]: [+]poststarthook/generic-apiserver-start-informers ok Jan 22 13:46:17 crc kubenswrapper[4743]: [+]poststarthook/priority-and-fairness-config-consumer ok Jan 22 13:46:17 crc kubenswrapper[4743]: [+]poststarthook/priority-and-fairness-filter ok Jan 22 13:46:17 crc kubenswrapper[4743]: [+]poststarthook/storage-object-count-tracker-hook ok Jan 22 13:46:17 crc kubenswrapper[4743]: [+]poststarthook/start-apiextensions-informers ok Jan 22 13:46:17 crc kubenswrapper[4743]: [-]poststarthook/start-apiextensions-controllers failed: reason withheld Jan 22 13:46:17 crc kubenswrapper[4743]: [+]poststarthook/crd-informer-synced ok Jan 22 13:46:17 crc kubenswrapper[4743]: [+]poststarthook/start-system-namespaces-controller ok Jan 22 13:46:17 crc kubenswrapper[4743]: [+]poststarthook/start-cluster-authentication-info-controller ok Jan 22 13:46:17 crc kubenswrapper[4743]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Jan 22 13:46:17 crc kubenswrapper[4743]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Jan 22 13:46:17 crc kubenswrapper[4743]: [+]poststarthook/start-legacy-token-tracking-controller ok Jan 22 13:46:17 crc kubenswrapper[4743]: [+]poststarthook/start-service-ip-repair-controllers ok Jan 22 13:46:17 crc kubenswrapper[4743]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Jan 22 13:46:17 crc kubenswrapper[4743]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok Jan 22 13:46:17 crc kubenswrapper[4743]: [+]poststarthook/priority-and-fairness-config-producer ok Jan 22 13:46:17 crc kubenswrapper[4743]: [+]poststarthook/bootstrap-controller ok Jan 22 13:46:17 crc kubenswrapper[4743]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Jan 22 13:46:17 crc kubenswrapper[4743]: [+]poststarthook/start-kube-aggregator-informers ok Jan 22 13:46:17 crc kubenswrapper[4743]: [+]poststarthook/apiservice-status-local-available-controller ok Jan 22 13:46:17 crc kubenswrapper[4743]: [+]poststarthook/apiservice-status-remote-available-controller ok Jan 22 13:46:17 crc kubenswrapper[4743]: [+]poststarthook/apiservice-registration-controller ok Jan 22 13:46:17 crc kubenswrapper[4743]: [+]poststarthook/apiservice-wait-for-first-sync ok Jan 22 13:46:17 crc kubenswrapper[4743]: [+]poststarthook/apiservice-discovery-controller ok Jan 22 13:46:17 crc kubenswrapper[4743]: [+]poststarthook/kube-apiserver-autoregistration ok Jan 22 13:46:17 crc kubenswrapper[4743]: [+]autoregister-completion ok Jan 22 13:46:17 crc kubenswrapper[4743]: [+]poststarthook/apiservice-openapi-controller ok Jan 22 13:46:17 crc kubenswrapper[4743]: 
[+]poststarthook/apiservice-openapiv3-controller ok Jan 22 13:46:17 crc kubenswrapper[4743]: livez check failed Jan 22 13:46:17 crc kubenswrapper[4743]: I0122 13:46:17.232057 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 22 13:46:18 crc kubenswrapper[4743]: I0122 13:46:18.079638 4743 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 21:08:44.171465798 +0000 UTC Jan 22 13:46:18 crc kubenswrapper[4743]: I0122 13:46:18.084198 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 13:46:18 crc kubenswrapper[4743]: I0122 13:46:18.085981 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 13:46:18 crc kubenswrapper[4743]: I0122 13:46:18.086040 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 13:46:18 crc kubenswrapper[4743]: I0122 13:46:18.086059 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 13:46:18 crc kubenswrapper[4743]: I0122 13:46:18.838329 4743 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Jan 22 13:46:18 crc kubenswrapper[4743]: I0122 13:46:18.838426 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Jan 22 13:46:19 crc kubenswrapper[4743]: I0122 13:46:19.080711 4743 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 09:52:51.450889255 +0000 UTC Jan 22 13:46:20 crc kubenswrapper[4743]: I0122 13:46:20.081650 4743 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 04:53:46.998935479 +0000 UTC Jan 22 13:46:20 crc kubenswrapper[4743]: I0122 13:46:20.211278 4743 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 22 13:46:20 crc kubenswrapper[4743]: I0122 13:46:20.211374 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 22 13:46:21 crc kubenswrapper[4743]: I0122 13:46:21.082551 4743 
certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 08:59:36.392270944 +0000 UTC Jan 22 13:46:21 crc kubenswrapper[4743]: E0122 13:46:21.136620 4743 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="3.2s" Jan 22 13:46:21 crc kubenswrapper[4743]: I0122 13:46:21.140971 4743 trace.go:236] Trace[967920401]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (22-Jan-2026 13:46:06.842) (total time: 14298ms): Jan 22 13:46:21 crc kubenswrapper[4743]: Trace[967920401]: ---"Objects listed" error: 14298ms (13:46:21.140) Jan 22 13:46:21 crc kubenswrapper[4743]: Trace[967920401]: [14.298222082s] [14.298222082s] END Jan 22 13:46:21 crc kubenswrapper[4743]: I0122 13:46:21.141013 4743 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Jan 22 13:46:21 crc kubenswrapper[4743]: I0122 13:46:21.141326 4743 trace.go:236] Trace[651511605]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (22-Jan-2026 13:46:07.564) (total time: 13577ms): Jan 22 13:46:21 crc kubenswrapper[4743]: Trace[651511605]: ---"Objects listed" error: 13577ms (13:46:21.141) Jan 22 13:46:21 crc kubenswrapper[4743]: Trace[651511605]: [13.577183455s] [13.577183455s] END Jan 22 13:46:21 crc kubenswrapper[4743]: I0122 13:46:21.141370 4743 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Jan 22 13:46:21 crc kubenswrapper[4743]: I0122 13:46:21.142092 4743 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Jan 22 13:46:21 crc kubenswrapper[4743]: I0122 13:46:21.142578 4743 trace.go:236] Trace[77155575]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (22-Jan-2026 13:46:06.719) (total time: 14423ms): Jan 22 13:46:21 crc kubenswrapper[4743]: Trace[77155575]: ---"Objects listed" error: 14423ms (13:46:21.142) Jan 22 13:46:21 crc kubenswrapper[4743]: Trace[77155575]: [14.423228026s] [14.423228026s] END Jan 22 13:46:21 crc kubenswrapper[4743]: I0122 13:46:21.142611 4743 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Jan 22 13:46:21 crc kubenswrapper[4743]: E0122 13:46:21.142784 4743 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Jan 22 13:46:21 crc kubenswrapper[4743]: I0122 13:46:21.142978 4743 trace.go:236] Trace[941174723]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (22-Jan-2026 13:46:06.866) (total time: 14276ms): Jan 22 13:46:21 crc kubenswrapper[4743]: Trace[941174723]: ---"Objects listed" error: 14276ms (13:46:21.142) Jan 22 13:46:21 crc kubenswrapper[4743]: Trace[941174723]: [14.276349306s] [14.276349306s] END Jan 22 13:46:21 crc kubenswrapper[4743]: I0122 13:46:21.142998 4743 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Jan 22 13:46:21 crc kubenswrapper[4743]: I0122 13:46:21.156695 4743 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.083086 4743 apiserver.go:52] "Watching apiserver" Jan 22 13:46:22 crc 
kubenswrapper[4743]: I0122 13:46:22.083116 4743 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 05:05:50.390396886 +0000 UTC Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.088072 4743 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.088700 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf"] Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.089275 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.089392 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 13:46:22 crc kubenswrapper[4743]: E0122 13:46:22.089720 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.090258 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 13:46:22 crc kubenswrapper[4743]: E0122 13:46:22.090346 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.090417 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.090669 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.090672 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 22 13:46:22 crc kubenswrapper[4743]: E0122 13:46:22.090848 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.093624 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.094132 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.094474 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.094634 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.097098 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.097386 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.098898 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.099434 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.101432 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.101908 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.106287 4743 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="dbf3b047cd21b7972e3e78bc51c2907693d461a6825dec6dac248b62200e63b9" exitCode=255 Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.106338 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"dbf3b047cd21b7972e3e78bc51c2907693d461a6825dec6dac248b62200e63b9"} Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.122542 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.123291 4743 scope.go:117] "RemoveContainer" containerID="dbf3b047cd21b7972e3e78bc51c2907693d461a6825dec6dac248b62200e63b9" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.134129 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.150778 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.150984 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.151082 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 22 13:46:22 crc 
kubenswrapper[4743]: I0122 13:46:22.151119 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.151145 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.151178 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.151205 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.151226 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.151260 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.151291 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.151319 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.151344 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.151915 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.152261 4743 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.152763 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.154358 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.157958 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.158159 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 22 13:46:22 crc kubenswrapper[4743]: E0122 13:46:22.165287 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 22 13:46:22 crc kubenswrapper[4743]: E0122 13:46:22.165316 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 22 13:46:22 crc kubenswrapper[4743]: E0122 13:46:22.165331 4743 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 13:46:22 crc kubenswrapper[4743]: E0122 13:46:22.165392 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-22 13:46:22.66537069 +0000 UTC m=+19.220413853 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.167422 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.167676 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.168620 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 22 13:46:22 crc kubenswrapper[4743]: E0122 13:46:22.170775 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 22 13:46:22 crc kubenswrapper[4743]: E0122 13:46:22.170918 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 22 13:46:22 crc kubenswrapper[4743]: E0122 13:46:22.171030 4743 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 13:46:22 crc kubenswrapper[4743]: E0122 13:46:22.171160 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-22 13:46:22.671142998 +0000 UTC m=+19.226186161 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.171301 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.175851 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.182411 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01f4ccb0-f73c-4886-ba33-0e37b40563fa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver 
kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3d207c8f7c47106e794aa561ec19fd9bba5216c729f0ee0211f124b368c4e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T13:46:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f164abc35c1a84289433b80aeba8eb6edfd0a0a4d2801a0336c01a1fd2ba5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T13:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79f0fd53d4d043cada398136d26691a12b62cc2765bef4f42db7af827bafaa9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T13:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf3b047cd21b7972e3e78bc51c2907693d461a6825dec6dac248b62200e63b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbf3b047cd21b7972e3e78bc51c2907693d461a6825dec6dac248b62200e63b9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-2
2T13:46:21Z\\\",\\\"message\\\":\\\"le observer\\\\nW0122 13:46:21.161890 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0122 13:46:21.162004 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0122 13:46:21.164820 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1639856425/tls.crt::/tmp/serving-cert-1639856425/tls.key\\\\\\\"\\\\nI0122 13:46:21.552122 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0122 13:46:21.554385 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0122 13:46:21.554446 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0122 13:46:21.554486 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0122 13:46:21.554535 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0122 13:46:21.558446 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0122 13:46:21.558474 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 13:46:21.558480 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 13:46:21.558484 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0122 13:46:21.558487 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0122 13:46:21.558506 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0122 13:46:21.558509 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0122 13:46:21.558559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0122 13:46:21.560263 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T13:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3307aad48db4a49584f0fc9767fd8852d5902aa05bb0c6e3b58eee2bc9b7ab3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T13:46:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b518be682055721b8f28ad1a53f3ab76ea8e2b5385525d395669facbb2e6322\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b518be682055721b8f28ad1a53f3ab76ea8e2b5385525d395669facbb2e6322\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T13:46:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T13:46:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T13:46:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.190973 4743 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.194137 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.211242 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.222622 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.232730 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.237534 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.247995 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.252031 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.252078 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.252109 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.252133 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.252158 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.252187 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.252210 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.252232 4743 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.252252 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.252275 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.252296 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.252318 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.252344 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.252370 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.252394 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.252425 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.252448 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 22 13:46:22 crc kubenswrapper[4743]: 
I0122 13:46:22.252473 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.252499 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.252552 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.252598 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.252635 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.252658 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.252686 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.252713 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.252736 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.252751 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod 
"5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.252758 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.252831 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.252857 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.252879 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.252902 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.252922 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.252945 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.252972 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.252985 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.252996 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.253045 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.253070 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.253095 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.253217 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.253241 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.253264 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.253289 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.253312 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.253361 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: 
\"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.253390 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.253416 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.253447 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.253468 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.253467 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.253491 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.253573 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.253599 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.253623 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.253641 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.253659 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.253680 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.253701 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.253721 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.253742 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.253763 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.253783 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.253817 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.254023 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.254056 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.254084 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.254110 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.254136 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.254154 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.254174 4743 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.254192 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.254212 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.254230 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.254246 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.254266 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.254290 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.254314 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.254331 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.254350 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 22 
13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.254367 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.254382 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.254400 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.254419 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.254435 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.254451 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.254467 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.254482 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.254499 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.254516 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 
13:46:22.254534 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.254552 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.254570 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.254587 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.254605 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.254620 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.254638 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.254655 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.254671 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.254688 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: 
\"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.254705 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.254723 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.254741 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.254756 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.254771 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.254815 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.254834 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.254851 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.254868 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.254886 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod 
\"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.254903 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.254919 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.254937 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.254954 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.254969 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.254985 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.255000 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.255020 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.255040 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.255057 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.255073 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.255089 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.255107 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.255123 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.255142 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.255160 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.255178 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.255194 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.255211 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.255227 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.255245 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.255261 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.255279 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.255297 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.255313 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.255329 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.255344 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.255359 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.255416 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.255437 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.255456 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.255474 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.255498 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.255528 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.255551 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.255572 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.255590 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.255607 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.255627 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.255644 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.255660 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.255677 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.255694 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.255710 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.255726 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.255742 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.255759 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.255778 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.256082 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.256115 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" 
(UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.256132 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.256148 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.256165 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.256184 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.256202 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.256219 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.256239 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.256255 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.256272 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.256289 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.256305 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.256323 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.256344 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.256361 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.256390 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.256408 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.256424 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.256443 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.256460 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.256481 4743 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.256500 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.256517 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.256534 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.256561 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.256581 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.256598 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.256616 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.256633 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.256650 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 
13:46:22.256669 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.256686 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.256703 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.256721 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.256738 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.256755 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.256775 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.256810 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.256828 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.256870 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.256945 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.257034 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.257117 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.257246 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.257268 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.257288 4743 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.253679 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.254047 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.254221 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.254427 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.254411 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.254520 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.254710 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.254805 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.255039 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.255149 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.255455 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.255634 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.255774 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.255804 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.256999 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.257087 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.257336 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.257381 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.257396 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.257616 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.258053 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.258161 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.258181 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.258183 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.258248 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.258754 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). 
InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.258894 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.263549 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.263771 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.263891 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.263917 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.263970 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.264199 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.264240 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.264227 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01f4ccb0-f73c-4886-ba33-0e37b40563fa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3d207c8f7c47106e794aa561ec19fd9bba5216c729f0ee0211f124b368c4e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T13:46:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f164abc35c1a84289433b80aeba8eb6edfd0a0a4d2801a0336c01a1fd2ba5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T13:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{
\\\"containerID\\\":\\\"cri-o://79f0fd53d4d043cada398136d26691a12b62cc2765bef4f42db7af827bafaa9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T13:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf3b047cd21b7972e3e78bc51c2907693d461a6825dec6dac248b62200e63b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbf3b047cd21b7972e3e78bc51c2907693d461a6825dec6dac248b62200e63b9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-22T13:46:21Z\\\",\\\"message\\\":\\\"le observer\\\\nW0122 13:46:21.161890 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0122 13:46:21.162004 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0122 13:46:21.164820 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1639856425/tls.crt::/tmp/serving-cert-1639856425/tls.key\\\\\\\"\\\\nI0122 13:46:21.552122 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0122 13:46:21.554385 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0122 13:46:21.554446 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0122 13:46:21.554486 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0122 13:46:21.554535 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0122 13:46:21.558446 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0122 13:46:21.558474 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 13:46:21.558480 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 13:46:21.558484 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0122 13:46:21.558487 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0122 13:46:21.558506 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0122 13:46:21.558509 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0122 13:46:21.558559 1 
genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0122 13:46:21.560263 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T13:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3307aad48db4a49584f0fc9767fd8852d5902aa05bb0c6e3b58eee2bc9b7ab3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T13:46:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b518be682055721b8f28ad1a53f3ab76ea8e2b5385525d395669facbb2e6322\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b518be682055721b8f28ad1a53f3ab76ea8e2b5385525d395669facbb2e6322\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T13:46:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T13:46:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T13:46:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.264513 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.264645 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.264917 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.265131 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.259063 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.259084 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.259117 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.259183 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.259213 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.259262 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.259355 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.259383 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.265282 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.259556 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.259667 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: E0122 13:46:22.259879 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 13:46:22.759776518 +0000 UTC m=+19.314819771 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.259937 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.259673 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.260198 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.260274 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.260331 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.260760 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.260820 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.260864 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.261491 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.261617 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.261616 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.261659 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.265380 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.265402 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.265701 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.265747 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.266248 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.266401 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.266570 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.266733 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.266761 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: E0122 13:46:22.267174 4743 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.267566 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.267665 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.267922 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.268259 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.268490 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.268836 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.268902 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.269129 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.269220 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: E0122 13:46:22.269427 4743 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.269511 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.269621 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.269870 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: E0122 13:46:22.270178 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-22 13:46:22.770148173 +0000 UTC m=+19.325191396 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 22 13:46:22 crc kubenswrapper[4743]: E0122 13:46:22.270251 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-22 13:46:22.770240255 +0000 UTC m=+19.325283538 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.270239 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.270601 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.270763 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.270853 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.273260 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.273490 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.273720 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.273982 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.274002 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.274018 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.274456 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.275023 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.275510 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.275976 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.276002 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.276236 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.276349 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.276544 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.277075 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.277087 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.277230 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.277579 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.277635 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.277709 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.278068 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.278157 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.278178 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.278194 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.278269 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.278433 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.278454 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.278466 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.278867 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.278834 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.279509 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.279548 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.279546 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.280475 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.280830 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.281024 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.283964 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.284695 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.285636 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.285684 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.286671 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.287037 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.287504 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). 
InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.292305 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.292340 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.293213 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.293511 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.293733 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.294285 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.295120 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.295470 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). 
InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.296044 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.296288 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.296283 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.296555 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.296814 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.297089 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.297159 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.297165 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.297411 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.297435 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.297531 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.297595 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.297664 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.297975 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.297990 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.298028 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.298671 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.298710 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.298874 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.299040 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.299089 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.299154 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.299201 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.299258 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.299305 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.299330 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.299395 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.300059 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.300518 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.300545 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.300698 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.300718 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.300777 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.301262 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.301560 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.301590 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.301807 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.301848 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.301925 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.301937 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.301963 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.302271 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.302349 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.302625 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.302667 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.302892 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.302906 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.303094 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.303230 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.303354 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.303708 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.303833 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.303893 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.304124 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.307054 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.318960 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.319981 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.326078 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.328264 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.330626 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.331706 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.357758 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.357817 4743 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.357830 4743 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.357842 4743 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.357857 4743 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.357867 4743 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.357878 4743 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.357890 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.357900 4743 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.357910 4743 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.357921 4743 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.357931 4743 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.357941 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" 
DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.357952 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.357963 4743 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.357973 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.357987 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.357998 4743 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358009 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358020 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358030 4743 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358041 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358052 4743 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358063 4743 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358074 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358084 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 
13:46:22.358095 4743 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358106 4743 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358117 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358127 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358138 4743 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358149 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358171 4743 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358181 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358193 4743 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358204 4743 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358215 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358226 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358237 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: 
I0122 13:46:22.358249 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358263 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358274 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358286 4743 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358297 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358309 4743 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358321 4743 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358331 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358342 4743 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358352 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358363 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358375 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358389 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 
13:46:22.358399 4743 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358409 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358420 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358430 4743 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358440 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358452 4743 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358462 4743 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358473 4743 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358484 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358495 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358505 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358516 4743 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358527 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: 
I0122 13:46:22.358537 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358549 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358559 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358571 4743 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358583 4743 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358594 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358604 4743 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358615 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358626 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358637 4743 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358648 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358658 4743 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358689 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358702 4743 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358713 4743 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358726 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358738 4743 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358751 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358763 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358774 4743 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358785 4743 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358817 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358828 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358839 4743 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358854 4743 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358865 4743 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358876 4743 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358887 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358901 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358912 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358924 4743 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358934 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358944 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358955 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358965 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358976 4743 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358988 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.358998 4743 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359008 4743 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: 
I0122 13:46:22.359020 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359031 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359041 4743 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359052 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359063 4743 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359073 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359083 4743 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359093 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359103 4743 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359114 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359125 4743 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359136 4743 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359146 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 
crc kubenswrapper[4743]: I0122 13:46:22.359157 4743 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359167 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359179 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359189 4743 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359200 4743 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359211 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359221 4743 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359231 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359243 4743 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359255 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359267 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359278 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359289 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: 
\"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359300 4743 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359310 4743 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359320 4743 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359331 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359341 4743 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359351 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359362 4743 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359372 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359384 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359394 4743 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359405 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359417 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359427 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" 
DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359438 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359448 4743 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359460 4743 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359471 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359482 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359522 4743 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359533 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359544 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359556 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359568 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359579 4743 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359589 4743 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359599 4743 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359610 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359621 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359631 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359641 4743 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359652 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359663 4743 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359673 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359687 4743 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359698 4743 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359709 4743 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359719 4743 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359729 4743 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359740 4743 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359750 4743 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359760 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359770 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359780 4743 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359808 4743 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359818 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359830 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359841 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359852 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359864 4743 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359873 4743 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359883 4743 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359895 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359906 4743 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359917 4743 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359928 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359938 4743 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359948 4743 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359960 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359969 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359979 4743 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.359991 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.412309 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.421181 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 22 13:46:22 crc kubenswrapper[4743]: W0122 13:46:22.422904 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-46d940225cc21fd53fafea4ce0eaf09118e0719775ea576cc817d869a3f7d811 WatchSource:0}: Error finding container 46d940225cc21fd53fafea4ce0eaf09118e0719775ea576cc817d869a3f7d811: Status 404 returned error can't find the container with id 46d940225cc21fd53fafea4ce0eaf09118e0719775ea576cc817d869a3f7d811 Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.429708 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 22 13:46:22 crc kubenswrapper[4743]: W0122 13:46:22.443458 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-29c208a010fe992b76180e955a16b2f02ef0c8b344c52f514358c75450bd50c7 WatchSource:0}: Error finding container 29c208a010fe992b76180e955a16b2f02ef0c8b344c52f514358c75450bd50c7: Status 404 returned error can't find the container with id 29c208a010fe992b76180e955a16b2f02ef0c8b344c52f514358c75450bd50c7 Jan 22 13:46:22 crc kubenswrapper[4743]: W0122 13:46:22.450415 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-14c9a4839937f42d47ec9fa822c5f0d993c9aec8cc81138390ba1d7245b0867b WatchSource:0}: Error finding container 14c9a4839937f42d47ec9fa822c5f0d993c9aec8cc81138390ba1d7245b0867b: Status 404 returned error can't find the container with id 14c9a4839937f42d47ec9fa822c5f0d993c9aec8cc81138390ba1d7245b0867b Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.763859 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.763954 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.763978 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 13:46:22 crc kubenswrapper[4743]: E0122 13:46:22.764068 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 13:46:23.764040692 +0000 UTC m=+20.319083855 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:22 crc kubenswrapper[4743]: E0122 13:46:22.764100 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 22 13:46:22 crc kubenswrapper[4743]: E0122 13:46:22.764116 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 22 13:46:22 crc kubenswrapper[4743]: E0122 13:46:22.764135 4743 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 13:46:22 crc kubenswrapper[4743]: E0122 13:46:22.764180 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-22 13:46:23.764164546 +0000 UTC m=+20.319207709 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 13:46:22 crc kubenswrapper[4743]: E0122 13:46:22.764190 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 22 13:46:22 crc kubenswrapper[4743]: E0122 13:46:22.764228 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 22 13:46:22 crc kubenswrapper[4743]: E0122 13:46:22.764239 4743 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 13:46:22 crc kubenswrapper[4743]: E0122 13:46:22.764273 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-22 13:46:23.764265568 +0000 UTC m=+20.319308731 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.865478 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 13:46:22 crc kubenswrapper[4743]: I0122 13:46:22.865585 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 13:46:22 crc kubenswrapper[4743]: E0122 13:46:22.865705 4743 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 22 13:46:22 crc kubenswrapper[4743]: E0122 13:46:22.865720 4743 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 22 13:46:22 crc kubenswrapper[4743]: E0122 13:46:22.865857 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-22 13:46:23.865837173 +0000 UTC m=+20.420880336 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 22 13:46:22 crc kubenswrapper[4743]: E0122 13:46:22.865960 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-22 13:46:23.865931515 +0000 UTC m=+20.420974718 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.084282 4743 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 16:37:52.768828155 +0000 UTC Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.111188 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"14c9a4839937f42d47ec9fa822c5f0d993c9aec8cc81138390ba1d7245b0867b"} Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.114193 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"0990d3dc2747f44c5b465272affd010b7d86c27eece3c5d5a088b76cc10f352f"} Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.114311 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"0b784653ecd4a8e7710705aa4fbc1f2f46cd41e6eb32d5b22477ed93bb9c19ad"} Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.114361 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"29c208a010fe992b76180e955a16b2f02ef0c8b344c52f514358c75450bd50c7"} Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.116863 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"8bf73356b342fcdceb0770b6413492824391f5f769b7a662a5f2b3c906d95151"} Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.116987 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"46d940225cc21fd53fafea4ce0eaf09118e0719775ea576cc817d869a3f7d811"} Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.119246 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.121539 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9fa58feeb3650411bb7ae470ddb3767a9f055924ec26d29029c7284e7a590954"} Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.122405 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.127769 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 
13:46:23.136472 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01f4ccb0-f73c-4886-ba33-0e37b40563fa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3d207c8f7c47106e794aa561ec19fd9bba5216c729f0ee0211f124b368c4e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T13:46:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f164abc35c1a84289433b80aeba8eb6edfd0a0a4d2801a0336c01a1fd2ba5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T13:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79f0fd53d4d043cada398136d26691a12b62cc2765bef4f42db7af827bafaa9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-01-22T13:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf3b047cd21b7972e3e78bc51c2907693d461a6825dec6dac248b62200e63b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbf3b047cd21b7972e3e78bc51c2907693d461a6825dec6dac248b62200e63b9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-22T13:46:21Z\\\",\\\"message\\\":\\\"le observer\\\\nW0122 13:46:21.161890 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0122 13:46:21.162004 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0122 13:46:21.164820 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1639856425/tls.crt::/tmp/serving-cert-1639856425/tls.key\\\\\\\"\\\\nI0122 13:46:21.552122 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0122 13:46:21.554385 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0122 13:46:21.554446 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0122 13:46:21.554486 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0122 13:46:21.554535 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0122 13:46:21.558446 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0122 13:46:21.558474 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 13:46:21.558480 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 13:46:21.558484 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0122 13:46:21.558487 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0122 13:46:21.558506 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0122 13:46:21.558509 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0122 13:46:21.558559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0122 13:46:21.560263 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T13:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3307aad48db4a49584f0fc9767fd8852d5902aa05bb0c6e3b58eee2bc9b7ab3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T13:46:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b518be682055721b8f28ad1a53f3ab76ea8e2b5385525d395669facbb2e6322\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b518be682055721b8f28ad1a53f3ab76ea8e2b5385525d395669facbb2e6322\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T13:46:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T13:46:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T13:46:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T13:46:23Z is after 2025-08-24T17:21:41Z" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.155651 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0990d3dc2747f44c5b465272affd010b7d86c27eece3c5d5a088b76cc10f352f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T13:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b784653ecd4a8e7710705aa4fbc1f2f46cd41e6eb32d5b22477ed93bb9c19ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T13:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T13:46:23Z is after 2025-08-24T17:21:41Z" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.171018 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T13:46:23Z is after 2025-08-24T17:21:41Z" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.187806 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T13:46:23Z is after 2025-08-24T17:21:41Z" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.202515 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T13:46:23Z is after 2025-08-24T17:21:41Z" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.218142 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T13:46:23Z is after 2025-08-24T17:21:41Z" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.237732 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T13:46:23Z is after 2025-08-24T17:21:41Z" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.284302 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T13:46:23Z is after 2025-08-24T17:21:41Z" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.305974 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01f4ccb0-f73c-4886-ba33-0e37b40563fa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3d207c8f7c47106e794aa561ec19fd9bba5216c729f0ee0211f124b368c4e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T13:46:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f164abc35c1a84289433b80aeba8eb6edfd0a0a4d2801a0336c01a1fd2ba5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T13:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79f0fd53d4d043cada398136d26691a12b62cc2765bef4f42db7af827bafaa9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T13:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa58feeb3650411bb7ae470ddb3767a9f055924ec26d29029c7284e7a590954\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbf3b047cd21b7972e3e78bc51c2907693d461a6825dec6dac248b62200e63b9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-22T13:46:21Z\\\",\\\"message\\\":\\\"le observer\\\\nW0122 13:46:21.161890 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0122 13:46:21.162004 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0122 13:46:21.164820 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1639856425/tls.crt::/tmp/serving-cert-1639856425/tls.key\\\\\\\"\\\\nI0122 13:46:21.552122 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0122 13:46:21.554385 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0122 13:46:21.554446 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0122 13:46:21.554486 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0122 13:46:21.554535 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0122 13:46:21.558446 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0122 13:46:21.558474 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 13:46:21.558480 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 13:46:21.558484 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0122 13:46:21.558487 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0122 13:46:21.558506 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0122 13:46:21.558509 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0122 13:46:21.558559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0122 13:46:21.560263 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T13:46:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T13:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3307aad48db4a49584f0fc9767fd8852d5902aa05bb0c6e3b58eee2bc9b7ab3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T13:46:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b518be682055721b8f28ad1a53f3ab76ea8e2b5385525d395669facbb2e6322\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b518be682055721b8f28ad1a53f3ab76ea8e2b5385525d395669facbb2e6322\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T13:46:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T13:46:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T13:46:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T13:46:23Z is after 2025-08-24T17:21:41Z" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.318783 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0990d3dc2747f44c5b465272affd010b7d86c27eece3c5d5a088b76cc10f352f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T13:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b784653ecd4a8e7710705aa4fbc1f2f46cd41e6eb32d5b22477ed93bb9c19ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T13:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T13:46:23Z is after 2025-08-24T17:21:41Z" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.329083 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bf73356b342fcdceb0770b6413492824391f5f769b7a662a5f2b3c906d95151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T13:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T13:46:23Z is after 2025-08-24T17:21:41Z" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.340612 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T13:46:23Z is after 2025-08-24T17:21:41Z" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.350603 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T13:46:23Z is after 2025-08-24T17:21:41Z" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.362596 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T13:46:23Z is after 2025-08-24T17:21:41Z" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.747195 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 13:46:23 crc kubenswrapper[4743]: E0122 13:46:23.747328 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.747651 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 13:46:23 crc kubenswrapper[4743]: E0122 13:46:23.747718 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.747859 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 13:46:23 crc kubenswrapper[4743]: E0122 13:46:23.747933 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.751437 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.752456 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.753238 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.753965 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.754604 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.755387 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.757373 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.758059 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.759286 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.759971 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.760495 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.761557 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.762076 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.763008 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.763538 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.767105 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.767958 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.768353 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.769460 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.770068 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.770932 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.771527 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.772021 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.772901 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bf73356b342fcdceb0770b6413492824391f5f769b7a662a5f2b3c906d95151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T13:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T13:46:23Z is after 2025-08-24T17:21:41Z" Jan 22 13:46:23 crc kubenswrapper[4743]: E0122 13:46:23.772995 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 13:46:25.77297919 +0000 UTC m=+22.328022353 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.772933 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.773182 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.773184 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 13:46:23 crc kubenswrapper[4743]: E0122 13:46:23.773287 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.773297 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 13:46:23 crc kubenswrapper[4743]: E0122 13:46:23.773307 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 22 13:46:23 crc kubenswrapper[4743]: E0122 13:46:23.773322 4743 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 13:46:23 crc kubenswrapper[4743]: E0122 13:46:23.773367 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-22 13:46:25.773352181 +0000 UTC m=+22.328395354 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 13:46:23 crc kubenswrapper[4743]: E0122 13:46:23.773408 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 22 13:46:23 crc kubenswrapper[4743]: E0122 13:46:23.773421 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 22 13:46:23 crc kubenswrapper[4743]: E0122 13:46:23.773431 4743 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 13:46:23 crc kubenswrapper[4743]: E0122 13:46:23.773467 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-22 13:46:25.773458033 +0000 UTC m=+22.328501196 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.773676 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.774842 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.775567 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.776472 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.777125 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.777989 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" 
path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.778442 4743 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.778541 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.780228 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.781070 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.781450 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.783322 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.784367 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.785004 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.786196 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.787046 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.788180 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.788742 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.788918 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T13:46:23Z is after 2025-08-24T17:21:41Z" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.789651 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.790257 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.791096 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.791624 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.792476 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.793177 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.793975 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.794393 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.795208 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.795687 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.796241 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.797145 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.801390 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T13:46:23Z is after 2025-08-24T17:21:41Z" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.813719 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T13:46:23Z is after 2025-08-24T17:21:41Z" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.835966 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01f4ccb0-f73c-4886-ba33-0e37b40563fa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3d207c8f7c47106e794aa561ec19fd9bba5216c729f0ee0211f124b368c4e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T13:46:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f164abc35c1a84289433b80aeba8eb6edfd0a0a4d2801a0336c01a1fd2ba5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T13:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79f0fd53d4d043cada398136d26691a12b62cc2765bef4f42db7af827bafaa9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T13:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa58feeb3650411bb7ae470ddb3767a9f055924ec26d29029c7284e7a590954\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbf3b047cd21b7972e3e78bc51c2907693d461a6825dec6dac248b62200e63b9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-22T13:46:21Z\\\",\\\"message\\\":\\\"le observer\\\\nW0122 13:46:21.161890 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0122 13:46:21.162004 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0122 13:46:21.164820 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1639856425/tls.crt::/tmp/serving-cert-1639856425/tls.key\\\\\\\"\\\\nI0122 13:46:21.552122 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0122 13:46:21.554385 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0122 13:46:21.554446 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0122 13:46:21.554486 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0122 13:46:21.554535 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0122 13:46:21.558446 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0122 13:46:21.558474 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 13:46:21.558480 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 13:46:21.558484 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0122 13:46:21.558487 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0122 13:46:21.558506 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0122 13:46:21.558509 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0122 13:46:21.558559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0122 13:46:21.560263 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T13:46:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T13:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3307aad48db4a49584f0fc9767fd8852d5902aa05bb0c6e3b58eee2bc9b7ab3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T13:46:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b518be682055721b8f28ad1a53f3ab76ea8e2b5385525d395669facbb2e6322\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b518be682055721b8f28ad1a53f3ab76ea8e2b5385525d395669facbb2e6322\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T13:46:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T13:46:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T13:46:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T13:46:23Z is after 2025-08-24T17:21:41Z" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.852370 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0990d3dc2747f44c5b465272affd010b7d86c27eece3c5d5a088b76cc10f352f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T13:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b784653ecd4a8e7710705aa4fbc1f2f46cd41e6eb32d5b22477ed93bb9c19ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T13:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T13:46:23Z is after 2025-08-24T17:21:41Z" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.868117 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T13:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T13:46:23Z is after 2025-08-24T17:21:41Z" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.874451 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 13:46:23 crc kubenswrapper[4743]: I0122 13:46:23.874518 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 13:46:23 crc kubenswrapper[4743]: E0122 13:46:23.874621 4743 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 22 13:46:23 crc kubenswrapper[4743]: E0122 13:46:23.874631 4743 secret.go:188] Couldn't get secret 
openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 22 13:46:23 crc kubenswrapper[4743]: E0122 13:46:23.874721 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-22 13:46:25.874676688 +0000 UTC m=+22.429719861 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 22 13:46:23 crc kubenswrapper[4743]: E0122 13:46:23.874749 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-22 13:46:25.87473642 +0000 UTC m=+22.429779603 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 22 13:46:24 crc kubenswrapper[4743]: I0122 13:46:24.085334 4743 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 22:04:29.456668642 +0000 UTC Jan 22 13:46:24 crc kubenswrapper[4743]: I0122 13:46:24.343502 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 13:46:24 crc kubenswrapper[4743]: I0122 13:46:24.345182 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 13:46:24 crc kubenswrapper[4743]: I0122 13:46:24.345219 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 13:46:24 crc kubenswrapper[4743]: I0122 13:46:24.345228 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 13:46:24 crc kubenswrapper[4743]: I0122 13:46:24.345285 4743 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 22 13:46:24 crc kubenswrapper[4743]: I0122 13:46:24.351305 4743 kubelet_node_status.go:115] "Node was previously registered" node="crc" Jan 22 13:46:24 crc kubenswrapper[4743]: I0122 13:46:24.351548 4743 kubelet_node_status.go:79] "Successfully registered node" node="crc" Jan 22 13:46:24 crc kubenswrapper[4743]: I0122 13:46:24.352894 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 13:46:24 crc kubenswrapper[4743]: I0122 13:46:24.352944 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 13:46:24 crc kubenswrapper[4743]: I0122 13:46:24.352957 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 13:46:24 crc kubenswrapper[4743]: I0122 13:46:24.352974 4743 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeNotReady" Jan 22 13:46:24 crc kubenswrapper[4743]: I0122 13:46:24.352987 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T13:46:24Z","lastTransitionTime":"2026-01-22T13:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 13:46:24 crc kubenswrapper[4743]: I0122 13:46:24.380572 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 13:46:24 crc kubenswrapper[4743]: I0122 13:46:24.380642 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 13:46:24 crc kubenswrapper[4743]: I0122 13:46:24.380663 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 13:46:24 crc kubenswrapper[4743]: I0122 13:46:24.380691 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 13:46:24 crc kubenswrapper[4743]: I0122 13:46:24.380709 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T13:46:24Z","lastTransitionTime":"2026-01-22T13:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 13:46:25 crc kubenswrapper[4743]: I0122 13:46:25.085551 4743 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 22:33:39.403097214 +0000 UTC Jan 22 13:46:25 crc kubenswrapper[4743]: I0122 13:46:25.085628 4743 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Jan 22 13:46:25 crc kubenswrapper[4743]: I0122 13:46:25.093323 4743 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Jan 22 13:46:25 crc kubenswrapper[4743]: I0122 13:46:25.127688 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"621856cf6eb99c10895577584e89e98930d155249dd4d434331f17a4ac7391df"} Jan 22 13:46:25 crc kubenswrapper[4743]: I0122 13:46:25.256390 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=3.256373596 podStartE2EDuration="3.256373596s" podCreationTimestamp="2026-01-22 13:46:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:46:25.255682817 +0000 UTC m=+21.810725990" watchObservedRunningTime="2026-01-22 13:46:25.256373596 +0000 UTC m=+21.811416759" Jan 22 13:46:25 crc kubenswrapper[4743]: I0122 13:46:25.746573 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 13:46:25 crc kubenswrapper[4743]: I0122 13:46:25.746608 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 13:46:25 crc kubenswrapper[4743]: I0122 13:46:25.746680 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 13:46:25 crc kubenswrapper[4743]: E0122 13:46:25.746748 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 22 13:46:25 crc kubenswrapper[4743]: E0122 13:46:25.746921 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 22 13:46:25 crc kubenswrapper[4743]: E0122 13:46:25.747056 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 22 13:46:25 crc kubenswrapper[4743]: I0122 13:46:25.792257 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 13:46:25 crc kubenswrapper[4743]: I0122 13:46:25.792340 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 13:46:25 crc kubenswrapper[4743]: I0122 13:46:25.792367 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 13:46:25 crc kubenswrapper[4743]: E0122 13:46:25.792552 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 22 13:46:25 crc kubenswrapper[4743]: E0122 13:46:25.792575 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 22 13:46:25 crc kubenswrapper[4743]: E0122 13:46:25.792588 4743 
projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 13:46:25 crc kubenswrapper[4743]: E0122 13:46:25.792621 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 22 13:46:25 crc kubenswrapper[4743]: E0122 13:46:25.792651 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 22 13:46:25 crc kubenswrapper[4743]: E0122 13:46:25.792666 4743 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 13:46:25 crc kubenswrapper[4743]: E0122 13:46:25.792644 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 13:46:29.792590106 +0000 UTC m=+26.347633309 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:25 crc kubenswrapper[4743]: E0122 13:46:25.792720 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-22 13:46:29.792703639 +0000 UTC m=+26.347746802 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 13:46:25 crc kubenswrapper[4743]: E0122 13:46:25.792739 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-22 13:46:29.79273277 +0000 UTC m=+26.347775933 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 13:46:25 crc kubenswrapper[4743]: I0122 13:46:25.894146 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 13:46:25 crc kubenswrapper[4743]: I0122 13:46:25.894247 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 13:46:25 crc kubenswrapper[4743]: E0122 13:46:25.894290 4743 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 22 13:46:25 crc kubenswrapper[4743]: E0122 13:46:25.894384 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-22 13:46:29.894363516 +0000 UTC m=+26.449406679 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 22 13:46:25 crc kubenswrapper[4743]: E0122 13:46:25.894443 4743 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 22 13:46:25 crc kubenswrapper[4743]: E0122 13:46:25.894538 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-22 13:46:29.89450932 +0000 UTC m=+26.449552523 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 22 13:46:25 crc kubenswrapper[4743]: I0122 13:46:25.996200 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Jan 22 13:46:26 crc kubenswrapper[4743]: I0122 13:46:26.010620 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Jan 22 13:46:26 crc kubenswrapper[4743]: I0122 13:46:26.012840 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Jan 22 13:46:27 crc kubenswrapper[4743]: I0122 13:46:27.215948 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 22 13:46:27 crc kubenswrapper[4743]: I0122 13:46:27.220340 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 22 13:46:27 crc kubenswrapper[4743]: I0122 13:46:27.227761 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Jan 22 13:46:27 crc kubenswrapper[4743]: I0122 13:46:27.253278 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=1.253257278 podStartE2EDuration="1.253257278s" podCreationTimestamp="2026-01-22 13:46:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:46:27.252461476 +0000 UTC m=+23.807504649" watchObservedRunningTime="2026-01-22 13:46:27.253257278 +0000 UTC m=+23.808300471" Jan 22 13:46:27 crc kubenswrapper[4743]: I0122 13:46:27.270775 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=0.270754688 podStartE2EDuration="270.754688ms" podCreationTimestamp="2026-01-22 13:46:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:46:27.270605543 +0000 UTC m=+23.825648726" watchObservedRunningTime="2026-01-22 13:46:27.270754688 +0000 UTC m=+23.825797851" Jan 22 13:46:27 crc kubenswrapper[4743]: I0122 13:46:27.744912 4743 csr.go:261] certificate signing request csr-wqsfl is approved, waiting to be issued Jan 22 13:46:27 crc kubenswrapper[4743]: I0122 13:46:27.746364 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-2kvgp"] Jan 22 13:46:27 crc kubenswrapper[4743]: I0122 13:46:27.746921 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-2kvgp" Jan 22 13:46:27 crc kubenswrapper[4743]: I0122 13:46:27.748715 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 22 13:46:27 crc kubenswrapper[4743]: I0122 13:46:27.749375 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 22 13:46:27 crc kubenswrapper[4743]: I0122 13:46:27.750903 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 13:46:27 crc kubenswrapper[4743]: I0122 13:46:27.750925 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 13:46:27 crc kubenswrapper[4743]: I0122 13:46:27.750935 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 13:46:27 crc kubenswrapper[4743]: E0122 13:46:27.751019 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 22 13:46:27 crc kubenswrapper[4743]: E0122 13:46:27.751106 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 22 13:46:27 crc kubenswrapper[4743]: E0122 13:46:27.751199 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 22 13:46:27 crc kubenswrapper[4743]: I0122 13:46:27.751922 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 22 13:46:27 crc kubenswrapper[4743]: I0122 13:46:27.777351 4743 csr.go:257] certificate signing request csr-wqsfl is issued Jan 22 13:46:27 crc kubenswrapper[4743]: I0122 13:46:27.867922 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-hp5b2"] Jan 22 13:46:27 crc kubenswrapper[4743]: I0122 13:46:27.868175 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-hp5b2" Jan 22 13:46:27 crc kubenswrapper[4743]: W0122 13:46:27.869966 4743 reflector.go:561] object-"openshift-image-registry"/"node-ca-dockercfg-4777p": failed to list *v1.Secret: secrets "node-ca-dockercfg-4777p" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-image-registry": no relationship found between node 'crc' and this object Jan 22 13:46:27 crc kubenswrapper[4743]: E0122 13:46:27.870012 4743 reflector.go:158] "Unhandled Error" err="object-\"openshift-image-registry\"/\"node-ca-dockercfg-4777p\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"node-ca-dockercfg-4777p\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-image-registry\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 22 13:46:27 crc kubenswrapper[4743]: W0122 13:46:27.870492 4743 reflector.go:561] object-"openshift-image-registry"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-image-registry": no relationship found between node 'crc' and this object Jan 22 13:46:27 crc kubenswrapper[4743]: E0122 13:46:27.870521 4743 reflector.go:158] "Unhandled Error" err="object-\"openshift-image-registry\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-image-registry\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 22 13:46:27 crc kubenswrapper[4743]: W0122 13:46:27.871587 4743 reflector.go:561] object-"openshift-image-registry"/"image-registry-certificates": failed to list *v1.ConfigMap: configmaps "image-registry-certificates" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-image-registry": no relationship found between node 'crc' and this object Jan 22 13:46:27 crc kubenswrapper[4743]: E0122 13:46:27.871627 4743 reflector.go:158] "Unhandled Error" err="object-\"openshift-image-registry\"/\"image-registry-certificates\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"image-registry-certificates\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-image-registry\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 22 13:46:27 crc kubenswrapper[4743]: W0122 13:46:27.872054 4743 reflector.go:561] object-"openshift-image-registry"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-image-registry": no relationship found between node 'crc' and this object Jan 22 13:46:27 crc kubenswrapper[4743]: E0122 13:46:27.872082 4743 reflector.go:158] "Unhandled Error" err="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-image-registry\": 
no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 22 13:46:27 crc kubenswrapper[4743]: I0122 13:46:27.912084 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccpl6\" (UniqueName: \"kubernetes.io/projected/5908c22b-d26e-4d00-a28c-07ae003f4ff2-kube-api-access-ccpl6\") pod \"node-resolver-2kvgp\" (UID: \"5908c22b-d26e-4d00-a28c-07ae003f4ff2\") " pod="openshift-dns/node-resolver-2kvgp" Jan 22 13:46:27 crc kubenswrapper[4743]: I0122 13:46:27.912173 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5908c22b-d26e-4d00-a28c-07ae003f4ff2-hosts-file\") pod \"node-resolver-2kvgp\" (UID: \"5908c22b-d26e-4d00-a28c-07ae003f4ff2\") " pod="openshift-dns/node-resolver-2kvgp" Jan 22 13:46:27 crc kubenswrapper[4743]: I0122 13:46:27.944527 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-4zbb8"] Jan 22 13:46:27 crc kubenswrapper[4743]: I0122 13:46:27.944911 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-4zbb8" Jan 22 13:46:27 crc kubenswrapper[4743]: I0122 13:46:27.946638 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 22 13:46:27 crc kubenswrapper[4743]: I0122 13:46:27.947317 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 22 13:46:27 crc kubenswrapper[4743]: I0122 13:46:27.947385 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 22 13:46:27 crc kubenswrapper[4743]: I0122 13:46:27.947635 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jan 22 13:46:27 crc kubenswrapper[4743]: I0122 13:46:27.948155 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 22 13:46:27 crc kubenswrapper[4743]: I0122 13:46:27.968731 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-hqgk7"] Jan 22 13:46:27 crc kubenswrapper[4743]: I0122 13:46:27.969181 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" Jan 22 13:46:27 crc kubenswrapper[4743]: I0122 13:46:27.971578 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 22 13:46:27 crc kubenswrapper[4743]: I0122 13:46:27.971633 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 22 13:46:27 crc kubenswrapper[4743]: I0122 13:46:27.971880 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 22 13:46:27 crc kubenswrapper[4743]: I0122 13:46:27.972014 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 22 13:46:27 crc kubenswrapper[4743]: I0122 13:46:27.974710 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 22 13:46:27 crc kubenswrapper[4743]: I0122 13:46:27.983201 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-4ddzn"] Jan 22 13:46:27 crc kubenswrapper[4743]: I0122 13:46:27.983804 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4ddzn" Jan 22 13:46:27 crc kubenswrapper[4743]: I0122 13:46:27.985622 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 22 13:46:27 crc kubenswrapper[4743]: I0122 13:46:27.985629 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.013025 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/eb38e50b-7f75-4a79-9d18-673d43c87f84-serviceca\") pod \"node-ca-hp5b2\" (UID: \"eb38e50b-7f75-4a79-9d18-673d43c87f84\") " pod="openshift-image-registry/node-ca-hp5b2" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.013064 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5908c22b-d26e-4d00-a28c-07ae003f4ff2-hosts-file\") pod \"node-resolver-2kvgp\" (UID: \"5908c22b-d26e-4d00-a28c-07ae003f4ff2\") " pod="openshift-dns/node-resolver-2kvgp" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.013103 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccpl6\" (UniqueName: \"kubernetes.io/projected/5908c22b-d26e-4d00-a28c-07ae003f4ff2-kube-api-access-ccpl6\") pod \"node-resolver-2kvgp\" (UID: \"5908c22b-d26e-4d00-a28c-07ae003f4ff2\") " pod="openshift-dns/node-resolver-2kvgp" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.013120 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfbcr\" (UniqueName: \"kubernetes.io/projected/eb38e50b-7f75-4a79-9d18-673d43c87f84-kube-api-access-zfbcr\") pod \"node-ca-hp5b2\" (UID: \"eb38e50b-7f75-4a79-9d18-673d43c87f84\") " pod="openshift-image-registry/node-ca-hp5b2" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.013144 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/eb38e50b-7f75-4a79-9d18-673d43c87f84-host\") pod \"node-ca-hp5b2\" (UID: \"eb38e50b-7f75-4a79-9d18-673d43c87f84\") " pod="openshift-image-registry/node-ca-hp5b2" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.013352 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5908c22b-d26e-4d00-a28c-07ae003f4ff2-hosts-file\") pod \"node-resolver-2kvgp\" (UID: \"5908c22b-d26e-4d00-a28c-07ae003f4ff2\") " pod="openshift-dns/node-resolver-2kvgp" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.021239 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-sljgz"] Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.021589 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sljgz" Jan 22 13:46:28 crc kubenswrapper[4743]: E0122 13:46:28.021644 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sljgz" podUID="ae9ad2e0-8b28-4352-8ced-7133f8b1c88d" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.031476 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccpl6\" (UniqueName: \"kubernetes.io/projected/5908c22b-d26e-4d00-a28c-07ae003f4ff2-kube-api-access-ccpl6\") pod \"node-resolver-2kvgp\" (UID: \"5908c22b-d26e-4d00-a28c-07ae003f4ff2\") " pod="openshift-dns/node-resolver-2kvgp" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.057822 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-gcj8q"] Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.058622 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.060131 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-2kvgp" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.060689 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.060936 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.060981 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.061756 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.064263 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.064454 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.064707 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.077611 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-sk9dw"] Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.078496 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sk9dw" Jan 22 13:46:28 crc kubenswrapper[4743]: W0122 13:46:28.086950 4743 reflector.go:561] object-"openshift-cluster-version"/"default-dockercfg-gxtc4": failed to list *v1.Secret: secrets "default-dockercfg-gxtc4" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-cluster-version": no relationship found between node 'crc' and this object Jan 22 13:46:28 crc kubenswrapper[4743]: E0122 13:46:28.087001 4743 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-version\"/\"default-dockercfg-gxtc4\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"default-dockercfg-gxtc4\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-cluster-version\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 22 13:46:28 crc kubenswrapper[4743]: W0122 13:46:28.086962 4743 reflector.go:561] object-"openshift-cluster-version"/"cluster-version-operator-serving-cert": failed to list *v1.Secret: secrets "cluster-version-operator-serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-cluster-version": no relationship found between node 'crc' and this object Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.087032 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 22 13:46:28 crc kubenswrapper[4743]: E0122 13:46:28.087038 4743 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-version\"/\"cluster-version-operator-serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"cluster-version-operator-serving-cert\" is forbidden: User 
\"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-cluster-version\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.087131 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.113763 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-852sl\" (UniqueName: \"kubernetes.io/projected/ae9ad2e0-8b28-4352-8ced-7133f8b1c88d-kube-api-access-852sl\") pod \"network-metrics-daemon-sljgz\" (UID: \"ae9ad2e0-8b28-4352-8ced-7133f8b1c88d\") " pod="openshift-multus/network-metrics-daemon-sljgz" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.113819 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1a63ac85-9a00-4381-aa80-3da86d5483aa-multus-daemon-config\") pod \"multus-4zbb8\" (UID: \"1a63ac85-9a00-4381-aa80-3da86d5483aa\") " pod="openshift-multus/multus-4zbb8" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.113840 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3aeba9ba-3a5a-4885-8540-d295aadb311b-mcd-auth-proxy-config\") pod \"machine-config-daemon-hqgk7\" (UID: \"3aeba9ba-3a5a-4885-8540-d295aadb311b\") " pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.113857 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzpst\" (UniqueName: \"kubernetes.io/projected/3aeba9ba-3a5a-4885-8540-d295aadb311b-kube-api-access-xzpst\") pod \"machine-config-daemon-hqgk7\" (UID: \"3aeba9ba-3a5a-4885-8540-d295aadb311b\") " pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.113876 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1a63ac85-9a00-4381-aa80-3da86d5483aa-multus-socket-dir-parent\") pod \"multus-4zbb8\" (UID: \"1a63ac85-9a00-4381-aa80-3da86d5483aa\") " pod="openshift-multus/multus-4zbb8" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.113890 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1a63ac85-9a00-4381-aa80-3da86d5483aa-host-run-k8s-cni-cncf-io\") pod \"multus-4zbb8\" (UID: \"1a63ac85-9a00-4381-aa80-3da86d5483aa\") " pod="openshift-multus/multus-4zbb8" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.113908 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7a8cdd6d-befe-47b1-b26f-965fbc647be0-cnibin\") pod \"multus-additional-cni-plugins-4ddzn\" (UID: \"7a8cdd6d-befe-47b1-b26f-965fbc647be0\") " pod="openshift-multus/multus-additional-cni-plugins-4ddzn" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.113922 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/7a8cdd6d-befe-47b1-b26f-965fbc647be0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4ddzn\" (UID: \"7a8cdd6d-befe-47b1-b26f-965fbc647be0\") " pod="openshift-multus/multus-additional-cni-plugins-4ddzn" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.113938 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1a63ac85-9a00-4381-aa80-3da86d5483aa-os-release\") pod \"multus-4zbb8\" (UID: \"1a63ac85-9a00-4381-aa80-3da86d5483aa\") " pod="openshift-multus/multus-4zbb8" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.113952 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7a8cdd6d-befe-47b1-b26f-965fbc647be0-system-cni-dir\") pod \"multus-additional-cni-plugins-4ddzn\" (UID: \"7a8cdd6d-befe-47b1-b26f-965fbc647be0\") " pod="openshift-multus/multus-additional-cni-plugins-4ddzn" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.113968 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vkvb\" (UniqueName: \"kubernetes.io/projected/7a8cdd6d-befe-47b1-b26f-965fbc647be0-kube-api-access-9vkvb\") pod \"multus-additional-cni-plugins-4ddzn\" (UID: \"7a8cdd6d-befe-47b1-b26f-965fbc647be0\") " pod="openshift-multus/multus-additional-cni-plugins-4ddzn" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.114019 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1a63ac85-9a00-4381-aa80-3da86d5483aa-cnibin\") pod \"multus-4zbb8\" (UID: \"1a63ac85-9a00-4381-aa80-3da86d5483aa\") " pod="openshift-multus/multus-4zbb8" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.114034 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1a63ac85-9a00-4381-aa80-3da86d5483aa-host-var-lib-kubelet\") pod \"multus-4zbb8\" (UID: \"1a63ac85-9a00-4381-aa80-3da86d5483aa\") " pod="openshift-multus/multus-4zbb8" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.114048 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1a63ac85-9a00-4381-aa80-3da86d5483aa-system-cni-dir\") pod \"multus-4zbb8\" (UID: \"1a63ac85-9a00-4381-aa80-3da86d5483aa\") " pod="openshift-multus/multus-4zbb8" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.114062 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1a63ac85-9a00-4381-aa80-3da86d5483aa-cni-binary-copy\") pod \"multus-4zbb8\" (UID: \"1a63ac85-9a00-4381-aa80-3da86d5483aa\") " pod="openshift-multus/multus-4zbb8" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.114082 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ae9ad2e0-8b28-4352-8ced-7133f8b1c88d-metrics-certs\") pod \"network-metrics-daemon-sljgz\" (UID: \"ae9ad2e0-8b28-4352-8ced-7133f8b1c88d\") " pod="openshift-multus/network-metrics-daemon-sljgz" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.114097 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1a63ac85-9a00-4381-aa80-3da86d5483aa-host-var-lib-cni-multus\") pod \"multus-4zbb8\" (UID: \"1a63ac85-9a00-4381-aa80-3da86d5483aa\") " pod="openshift-multus/multus-4zbb8" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.114110 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1a63ac85-9a00-4381-aa80-3da86d5483aa-etc-kubernetes\") pod \"multus-4zbb8\" (UID: \"1a63ac85-9a00-4381-aa80-3da86d5483aa\") " pod="openshift-multus/multus-4zbb8" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.114125 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1a63ac85-9a00-4381-aa80-3da86d5483aa-host-var-lib-cni-bin\") pod \"multus-4zbb8\" (UID: \"1a63ac85-9a00-4381-aa80-3da86d5483aa\") " pod="openshift-multus/multus-4zbb8" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.114139 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xkkn\" (UniqueName: \"kubernetes.io/projected/1a63ac85-9a00-4381-aa80-3da86d5483aa-kube-api-access-6xkkn\") pod \"multus-4zbb8\" (UID: \"1a63ac85-9a00-4381-aa80-3da86d5483aa\") " pod="openshift-multus/multus-4zbb8" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.114154 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/3aeba9ba-3a5a-4885-8540-d295aadb311b-rootfs\") pod \"machine-config-daemon-hqgk7\" (UID: \"3aeba9ba-3a5a-4885-8540-d295aadb311b\") " pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.114210 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfbcr\" (UniqueName: \"kubernetes.io/projected/eb38e50b-7f75-4a79-9d18-673d43c87f84-kube-api-access-zfbcr\") pod \"node-ca-hp5b2\" (UID: \"eb38e50b-7f75-4a79-9d18-673d43c87f84\") " pod="openshift-image-registry/node-ca-hp5b2" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.114266 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7a8cdd6d-befe-47b1-b26f-965fbc647be0-cni-binary-copy\") pod \"multus-additional-cni-plugins-4ddzn\" (UID: \"7a8cdd6d-befe-47b1-b26f-965fbc647be0\") " pod="openshift-multus/multus-additional-cni-plugins-4ddzn" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.114297 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/eb38e50b-7f75-4a79-9d18-673d43c87f84-host\") pod \"node-ca-hp5b2\" (UID: \"eb38e50b-7f75-4a79-9d18-673d43c87f84\") " pod="openshift-image-registry/node-ca-hp5b2" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.114314 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3aeba9ba-3a5a-4885-8540-d295aadb311b-proxy-tls\") pod \"machine-config-daemon-hqgk7\" (UID: \"3aeba9ba-3a5a-4885-8540-d295aadb311b\") " pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.114328 4743 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1a63ac85-9a00-4381-aa80-3da86d5483aa-multus-cni-dir\") pod \"multus-4zbb8\" (UID: \"1a63ac85-9a00-4381-aa80-3da86d5483aa\") " pod="openshift-multus/multus-4zbb8" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.114341 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1a63ac85-9a00-4381-aa80-3da86d5483aa-host-run-netns\") pod \"multus-4zbb8\" (UID: \"1a63ac85-9a00-4381-aa80-3da86d5483aa\") " pod="openshift-multus/multus-4zbb8" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.114353 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1a63ac85-9a00-4381-aa80-3da86d5483aa-hostroot\") pod \"multus-4zbb8\" (UID: \"1a63ac85-9a00-4381-aa80-3da86d5483aa\") " pod="openshift-multus/multus-4zbb8" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.114365 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1a63ac85-9a00-4381-aa80-3da86d5483aa-host-run-multus-certs\") pod \"multus-4zbb8\" (UID: \"1a63ac85-9a00-4381-aa80-3da86d5483aa\") " pod="openshift-multus/multus-4zbb8" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.114382 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/eb38e50b-7f75-4a79-9d18-673d43c87f84-serviceca\") pod \"node-ca-hp5b2\" (UID: \"eb38e50b-7f75-4a79-9d18-673d43c87f84\") " pod="openshift-image-registry/node-ca-hp5b2" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.114404 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7a8cdd6d-befe-47b1-b26f-965fbc647be0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4ddzn\" (UID: \"7a8cdd6d-befe-47b1-b26f-965fbc647be0\") " pod="openshift-multus/multus-additional-cni-plugins-4ddzn" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.114418 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1a63ac85-9a00-4381-aa80-3da86d5483aa-multus-conf-dir\") pod \"multus-4zbb8\" (UID: \"1a63ac85-9a00-4381-aa80-3da86d5483aa\") " pod="openshift-multus/multus-4zbb8" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.114435 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7a8cdd6d-befe-47b1-b26f-965fbc647be0-os-release\") pod \"multus-additional-cni-plugins-4ddzn\" (UID: \"7a8cdd6d-befe-47b1-b26f-965fbc647be0\") " pod="openshift-multus/multus-additional-cni-plugins-4ddzn" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.114634 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/eb38e50b-7f75-4a79-9d18-673d43c87f84-host\") pod \"node-ca-hp5b2\" (UID: \"eb38e50b-7f75-4a79-9d18-673d43c87f84\") " pod="openshift-image-registry/node-ca-hp5b2" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.142701 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns/node-resolver-2kvgp" event={"ID":"5908c22b-d26e-4d00-a28c-07ae003f4ff2","Type":"ContainerStarted","Data":"3b184320520f7b0eae269de482005d49e618eaee0a4b10de53b47bcdb5734bec"} Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.215233 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1a63ac85-9a00-4381-aa80-3da86d5483aa-multus-daemon-config\") pod \"multus-4zbb8\" (UID: \"1a63ac85-9a00-4381-aa80-3da86d5483aa\") " pod="openshift-multus/multus-4zbb8" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.215270 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-852sl\" (UniqueName: \"kubernetes.io/projected/ae9ad2e0-8b28-4352-8ced-7133f8b1c88d-kube-api-access-852sl\") pod \"network-metrics-daemon-sljgz\" (UID: \"ae9ad2e0-8b28-4352-8ced-7133f8b1c88d\") " pod="openshift-multus/network-metrics-daemon-sljgz" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.215294 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-etc-openvswitch\") pod \"ovnkube-node-gcj8q\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.215309 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-run-openvswitch\") pod \"ovnkube-node-gcj8q\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.215325 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1504d62a-81aa-4a1d-8fda-ef01376adcaa-env-overrides\") pod \"ovnkube-node-gcj8q\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.215343 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzpst\" (UniqueName: \"kubernetes.io/projected/3aeba9ba-3a5a-4885-8540-d295aadb311b-kube-api-access-xzpst\") pod \"machine-config-daemon-hqgk7\" (UID: \"3aeba9ba-3a5a-4885-8540-d295aadb311b\") " pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.215358 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-node-log\") pod \"ovnkube-node-gcj8q\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.215375 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-log-socket\") pod \"ovnkube-node-gcj8q\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.215562 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/cee8391d-ed52-4e21-b9e9-b55b77407c85-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-sk9dw\" (UID: \"cee8391d-ed52-4e21-b9e9-b55b77407c85\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sk9dw" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.216224 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3aeba9ba-3a5a-4885-8540-d295aadb311b-mcd-auth-proxy-config\") pod \"machine-config-daemon-hqgk7\" (UID: \"3aeba9ba-3a5a-4885-8540-d295aadb311b\") " pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.216345 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1a63ac85-9a00-4381-aa80-3da86d5483aa-multus-daemon-config\") pod \"multus-4zbb8\" (UID: \"1a63ac85-9a00-4381-aa80-3da86d5483aa\") " pod="openshift-multus/multus-4zbb8" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.216326 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1a63ac85-9a00-4381-aa80-3da86d5483aa-multus-socket-dir-parent\") pod \"multus-4zbb8\" (UID: \"1a63ac85-9a00-4381-aa80-3da86d5483aa\") " pod="openshift-multus/multus-4zbb8" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.216432 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1a63ac85-9a00-4381-aa80-3da86d5483aa-host-run-k8s-cni-cncf-io\") pod \"multus-4zbb8\" (UID: \"1a63ac85-9a00-4381-aa80-3da86d5483aa\") " pod="openshift-multus/multus-4zbb8" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.216457 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7a8cdd6d-befe-47b1-b26f-965fbc647be0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4ddzn\" (UID: \"7a8cdd6d-befe-47b1-b26f-965fbc647be0\") " pod="openshift-multus/multus-additional-cni-plugins-4ddzn" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.216467 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1a63ac85-9a00-4381-aa80-3da86d5483aa-multus-socket-dir-parent\") pod \"multus-4zbb8\" (UID: \"1a63ac85-9a00-4381-aa80-3da86d5483aa\") " pod="openshift-multus/multus-4zbb8" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.216474 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1a63ac85-9a00-4381-aa80-3da86d5483aa-os-release\") pod \"multus-4zbb8\" (UID: \"1a63ac85-9a00-4381-aa80-3da86d5483aa\") " pod="openshift-multus/multus-4zbb8" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.216606 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-host-cni-netd\") pod \"ovnkube-node-gcj8q\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.216530 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1a63ac85-9a00-4381-aa80-3da86d5483aa-host-run-k8s-cni-cncf-io\") pod \"multus-4zbb8\" (UID: \"1a63ac85-9a00-4381-aa80-3da86d5483aa\") " pod="openshift-multus/multus-4zbb8" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.216527 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1a63ac85-9a00-4381-aa80-3da86d5483aa-os-release\") pod \"multus-4zbb8\" (UID: \"1a63ac85-9a00-4381-aa80-3da86d5483aa\") " pod="openshift-multus/multus-4zbb8" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.216715 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7a8cdd6d-befe-47b1-b26f-965fbc647be0-cnibin\") pod \"multus-additional-cni-plugins-4ddzn\" (UID: \"7a8cdd6d-befe-47b1-b26f-965fbc647be0\") " pod="openshift-multus/multus-additional-cni-plugins-4ddzn" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.216781 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7a8cdd6d-befe-47b1-b26f-965fbc647be0-cnibin\") pod \"multus-additional-cni-plugins-4ddzn\" (UID: \"7a8cdd6d-befe-47b1-b26f-965fbc647be0\") " pod="openshift-multus/multus-additional-cni-plugins-4ddzn" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.216820 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7a8cdd6d-befe-47b1-b26f-965fbc647be0-system-cni-dir\") pod \"multus-additional-cni-plugins-4ddzn\" (UID: \"7a8cdd6d-befe-47b1-b26f-965fbc647be0\") " pod="openshift-multus/multus-additional-cni-plugins-4ddzn" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.216869 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vkvb\" (UniqueName: \"kubernetes.io/projected/7a8cdd6d-befe-47b1-b26f-965fbc647be0-kube-api-access-9vkvb\") pod \"multus-additional-cni-plugins-4ddzn\" (UID: \"7a8cdd6d-befe-47b1-b26f-965fbc647be0\") " pod="openshift-multus/multus-additional-cni-plugins-4ddzn" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.216880 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7a8cdd6d-befe-47b1-b26f-965fbc647be0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4ddzn\" (UID: \"7a8cdd6d-befe-47b1-b26f-965fbc647be0\") " pod="openshift-multus/multus-additional-cni-plugins-4ddzn" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.216881 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7a8cdd6d-befe-47b1-b26f-965fbc647be0-system-cni-dir\") pod \"multus-additional-cni-plugins-4ddzn\" (UID: \"7a8cdd6d-befe-47b1-b26f-965fbc647be0\") " pod="openshift-multus/multus-additional-cni-plugins-4ddzn" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.216908 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1a63ac85-9a00-4381-aa80-3da86d5483aa-cnibin\") pod \"multus-4zbb8\" (UID: \"1a63ac85-9a00-4381-aa80-3da86d5483aa\") " pod="openshift-multus/multus-4zbb8" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.216936 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/1a63ac85-9a00-4381-aa80-3da86d5483aa-host-var-lib-kubelet\") pod \"multus-4zbb8\" (UID: \"1a63ac85-9a00-4381-aa80-3da86d5483aa\") " pod="openshift-multus/multus-4zbb8" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.216940 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3aeba9ba-3a5a-4885-8540-d295aadb311b-mcd-auth-proxy-config\") pod \"machine-config-daemon-hqgk7\" (UID: \"3aeba9ba-3a5a-4885-8540-d295aadb311b\") " pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.216955 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-run-ovn\") pod \"ovnkube-node-gcj8q\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.216952 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1a63ac85-9a00-4381-aa80-3da86d5483aa-cnibin\") pod \"multus-4zbb8\" (UID: \"1a63ac85-9a00-4381-aa80-3da86d5483aa\") " pod="openshift-multus/multus-4zbb8" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.216975 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gcj8q\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.217023 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cee8391d-ed52-4e21-b9e9-b55b77407c85-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-sk9dw\" (UID: \"cee8391d-ed52-4e21-b9e9-b55b77407c85\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sk9dw" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.217063 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-host-slash\") pod \"ovnkube-node-gcj8q\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.216976 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1a63ac85-9a00-4381-aa80-3da86d5483aa-host-var-lib-kubelet\") pod \"multus-4zbb8\" (UID: \"1a63ac85-9a00-4381-aa80-3da86d5483aa\") " pod="openshift-multus/multus-4zbb8" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.217081 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1504d62a-81aa-4a1d-8fda-ef01376adcaa-ovn-node-metrics-cert\") pod \"ovnkube-node-gcj8q\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.217200 4743 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1a63ac85-9a00-4381-aa80-3da86d5483aa-system-cni-dir\") pod \"multus-4zbb8\" (UID: \"1a63ac85-9a00-4381-aa80-3da86d5483aa\") " pod="openshift-multus/multus-4zbb8" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.217235 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-host-kubelet\") pod \"ovnkube-node-gcj8q\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.217263 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1a63ac85-9a00-4381-aa80-3da86d5483aa-cni-binary-copy\") pod \"multus-4zbb8\" (UID: \"1a63ac85-9a00-4381-aa80-3da86d5483aa\") " pod="openshift-multus/multus-4zbb8" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.217287 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-host-run-netns\") pod \"ovnkube-node-gcj8q\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.217313 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1504d62a-81aa-4a1d-8fda-ef01376adcaa-ovnkube-config\") pod \"ovnkube-node-gcj8q\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.217362 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ae9ad2e0-8b28-4352-8ced-7133f8b1c88d-metrics-certs\") pod \"network-metrics-daemon-sljgz\" (UID: \"ae9ad2e0-8b28-4352-8ced-7133f8b1c88d\") " pod="openshift-multus/network-metrics-daemon-sljgz" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.217382 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1a63ac85-9a00-4381-aa80-3da86d5483aa-system-cni-dir\") pod \"multus-4zbb8\" (UID: \"1a63ac85-9a00-4381-aa80-3da86d5483aa\") " pod="openshift-multus/multus-4zbb8" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.217393 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44vjn\" (UniqueName: \"kubernetes.io/projected/1504d62a-81aa-4a1d-8fda-ef01376adcaa-kube-api-access-44vjn\") pod \"ovnkube-node-gcj8q\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.217439 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1a63ac85-9a00-4381-aa80-3da86d5483aa-etc-kubernetes\") pod \"multus-4zbb8\" (UID: \"1a63ac85-9a00-4381-aa80-3da86d5483aa\") " pod="openshift-multus/multus-4zbb8" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.217466 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/1a63ac85-9a00-4381-aa80-3da86d5483aa-host-var-lib-cni-multus\") pod \"multus-4zbb8\" (UID: \"1a63ac85-9a00-4381-aa80-3da86d5483aa\") " pod="openshift-multus/multus-4zbb8" Jan 22 13:46:28 crc kubenswrapper[4743]: E0122 13:46:28.217483 4743 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.217498 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/3aeba9ba-3a5a-4885-8540-d295aadb311b-rootfs\") pod \"machine-config-daemon-hqgk7\" (UID: \"3aeba9ba-3a5a-4885-8540-d295aadb311b\") " pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.217526 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/3aeba9ba-3a5a-4885-8540-d295aadb311b-rootfs\") pod \"machine-config-daemon-hqgk7\" (UID: \"3aeba9ba-3a5a-4885-8540-d295aadb311b\") " pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.217532 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1a63ac85-9a00-4381-aa80-3da86d5483aa-host-var-lib-cni-multus\") pod \"multus-4zbb8\" (UID: \"1a63ac85-9a00-4381-aa80-3da86d5483aa\") " pod="openshift-multus/multus-4zbb8" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.217497 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1a63ac85-9a00-4381-aa80-3da86d5483aa-etc-kubernetes\") pod \"multus-4zbb8\" (UID: \"1a63ac85-9a00-4381-aa80-3da86d5483aa\") " pod="openshift-multus/multus-4zbb8" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.217528 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-host-cni-bin\") pod \"ovnkube-node-gcj8q\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" Jan 22 13:46:28 crc kubenswrapper[4743]: E0122 13:46:28.217564 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae9ad2e0-8b28-4352-8ced-7133f8b1c88d-metrics-certs podName:ae9ad2e0-8b28-4352-8ced-7133f8b1c88d nodeName:}" failed. No retries permitted until 2026-01-22 13:46:28.717537723 +0000 UTC m=+25.272580896 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ae9ad2e0-8b28-4352-8ced-7133f8b1c88d-metrics-certs") pod "network-metrics-daemon-sljgz" (UID: "ae9ad2e0-8b28-4352-8ced-7133f8b1c88d") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.217591 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/cee8391d-ed52-4e21-b9e9-b55b77407c85-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-sk9dw\" (UID: \"cee8391d-ed52-4e21-b9e9-b55b77407c85\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sk9dw" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.217626 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1a63ac85-9a00-4381-aa80-3da86d5483aa-host-var-lib-cni-bin\") pod \"multus-4zbb8\" (UID: \"1a63ac85-9a00-4381-aa80-3da86d5483aa\") " pod="openshift-multus/multus-4zbb8" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.217651 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xkkn\" (UniqueName: \"kubernetes.io/projected/1a63ac85-9a00-4381-aa80-3da86d5483aa-kube-api-access-6xkkn\") pod \"multus-4zbb8\" (UID: \"1a63ac85-9a00-4381-aa80-3da86d5483aa\") " pod="openshift-multus/multus-4zbb8" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.217673 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1a63ac85-9a00-4381-aa80-3da86d5483aa-host-var-lib-cni-bin\") pod \"multus-4zbb8\" (UID: \"1a63ac85-9a00-4381-aa80-3da86d5483aa\") " pod="openshift-multus/multus-4zbb8" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.217683 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7a8cdd6d-befe-47b1-b26f-965fbc647be0-cni-binary-copy\") pod \"multus-additional-cni-plugins-4ddzn\" (UID: \"7a8cdd6d-befe-47b1-b26f-965fbc647be0\") " pod="openshift-multus/multus-additional-cni-plugins-4ddzn" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.217687 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1a63ac85-9a00-4381-aa80-3da86d5483aa-cni-binary-copy\") pod \"multus-4zbb8\" (UID: \"1a63ac85-9a00-4381-aa80-3da86d5483aa\") " pod="openshift-multus/multus-4zbb8" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.217709 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cee8391d-ed52-4e21-b9e9-b55b77407c85-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-sk9dw\" (UID: \"cee8391d-ed52-4e21-b9e9-b55b77407c85\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sk9dw" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.217754 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3aeba9ba-3a5a-4885-8540-d295aadb311b-proxy-tls\") pod \"machine-config-daemon-hqgk7\" (UID: \"3aeba9ba-3a5a-4885-8540-d295aadb311b\") " pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 
13:46:28.217782 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1504d62a-81aa-4a1d-8fda-ef01376adcaa-ovnkube-script-lib\") pod \"ovnkube-node-gcj8q\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.217955 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1a63ac85-9a00-4381-aa80-3da86d5483aa-multus-cni-dir\") pod \"multus-4zbb8\" (UID: \"1a63ac85-9a00-4381-aa80-3da86d5483aa\") " pod="openshift-multus/multus-4zbb8" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.217985 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1a63ac85-9a00-4381-aa80-3da86d5483aa-host-run-netns\") pod \"multus-4zbb8\" (UID: \"1a63ac85-9a00-4381-aa80-3da86d5483aa\") " pod="openshift-multus/multus-4zbb8" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.218009 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1a63ac85-9a00-4381-aa80-3da86d5483aa-hostroot\") pod \"multus-4zbb8\" (UID: \"1a63ac85-9a00-4381-aa80-3da86d5483aa\") " pod="openshift-multus/multus-4zbb8" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.218030 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1a63ac85-9a00-4381-aa80-3da86d5483aa-host-run-multus-certs\") pod \"multus-4zbb8\" (UID: \"1a63ac85-9a00-4381-aa80-3da86d5483aa\") " pod="openshift-multus/multus-4zbb8" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.218052 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-run-systemd\") pod \"ovnkube-node-gcj8q\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.218074 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-var-lib-openvswitch\") pod \"ovnkube-node-gcj8q\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.218098 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7a8cdd6d-befe-47b1-b26f-965fbc647be0-os-release\") pod \"multus-additional-cni-plugins-4ddzn\" (UID: \"7a8cdd6d-befe-47b1-b26f-965fbc647be0\") " pod="openshift-multus/multus-additional-cni-plugins-4ddzn" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.218157 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7a8cdd6d-befe-47b1-b26f-965fbc647be0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4ddzn\" (UID: \"7a8cdd6d-befe-47b1-b26f-965fbc647be0\") " pod="openshift-multus/multus-additional-cni-plugins-4ddzn" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.218192 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1a63ac85-9a00-4381-aa80-3da86d5483aa-multus-conf-dir\") pod \"multus-4zbb8\" (UID: \"1a63ac85-9a00-4381-aa80-3da86d5483aa\") " pod="openshift-multus/multus-4zbb8" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.218212 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-systemd-units\") pod \"ovnkube-node-gcj8q\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.218233 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-host-run-ovn-kubernetes\") pod \"ovnkube-node-gcj8q\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.218250 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7a8cdd6d-befe-47b1-b26f-965fbc647be0-cni-binary-copy\") pod \"multus-additional-cni-plugins-4ddzn\" (UID: \"7a8cdd6d-befe-47b1-b26f-965fbc647be0\") " pod="openshift-multus/multus-additional-cni-plugins-4ddzn" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.218259 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cee8391d-ed52-4e21-b9e9-b55b77407c85-service-ca\") pod \"cluster-version-operator-5c965bbfc6-sk9dw\" (UID: \"cee8391d-ed52-4e21-b9e9-b55b77407c85\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sk9dw" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.218331 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1a63ac85-9a00-4381-aa80-3da86d5483aa-multus-conf-dir\") pod \"multus-4zbb8\" (UID: \"1a63ac85-9a00-4381-aa80-3da86d5483aa\") " pod="openshift-multus/multus-4zbb8" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.218355 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1a63ac85-9a00-4381-aa80-3da86d5483aa-hostroot\") pod \"multus-4zbb8\" (UID: \"1a63ac85-9a00-4381-aa80-3da86d5483aa\") " pod="openshift-multus/multus-4zbb8" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.218361 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1a63ac85-9a00-4381-aa80-3da86d5483aa-host-run-netns\") pod \"multus-4zbb8\" (UID: \"1a63ac85-9a00-4381-aa80-3da86d5483aa\") " pod="openshift-multus/multus-4zbb8" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.218406 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7a8cdd6d-befe-47b1-b26f-965fbc647be0-os-release\") pod \"multus-additional-cni-plugins-4ddzn\" (UID: \"7a8cdd6d-befe-47b1-b26f-965fbc647be0\") " pod="openshift-multus/multus-additional-cni-plugins-4ddzn" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.218412 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" 
(UniqueName: \"kubernetes.io/host-path/1a63ac85-9a00-4381-aa80-3da86d5483aa-host-run-multus-certs\") pod \"multus-4zbb8\" (UID: \"1a63ac85-9a00-4381-aa80-3da86d5483aa\") " pod="openshift-multus/multus-4zbb8" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.218572 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1a63ac85-9a00-4381-aa80-3da86d5483aa-multus-cni-dir\") pod \"multus-4zbb8\" (UID: \"1a63ac85-9a00-4381-aa80-3da86d5483aa\") " pod="openshift-multus/multus-4zbb8" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.218850 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7a8cdd6d-befe-47b1-b26f-965fbc647be0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4ddzn\" (UID: \"7a8cdd6d-befe-47b1-b26f-965fbc647be0\") " pod="openshift-multus/multus-additional-cni-plugins-4ddzn" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.222081 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3aeba9ba-3a5a-4885-8540-d295aadb311b-proxy-tls\") pod \"machine-config-daemon-hqgk7\" (UID: \"3aeba9ba-3a5a-4885-8540-d295aadb311b\") " pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.231547 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzpst\" (UniqueName: \"kubernetes.io/projected/3aeba9ba-3a5a-4885-8540-d295aadb311b-kube-api-access-xzpst\") pod \"machine-config-daemon-hqgk7\" (UID: \"3aeba9ba-3a5a-4885-8540-d295aadb311b\") " pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.232318 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xkkn\" (UniqueName: \"kubernetes.io/projected/1a63ac85-9a00-4381-aa80-3da86d5483aa-kube-api-access-6xkkn\") pod \"multus-4zbb8\" (UID: \"1a63ac85-9a00-4381-aa80-3da86d5483aa\") " pod="openshift-multus/multus-4zbb8" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.236517 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-852sl\" (UniqueName: \"kubernetes.io/projected/ae9ad2e0-8b28-4352-8ced-7133f8b1c88d-kube-api-access-852sl\") pod \"network-metrics-daemon-sljgz\" (UID: \"ae9ad2e0-8b28-4352-8ced-7133f8b1c88d\") " pod="openshift-multus/network-metrics-daemon-sljgz" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.237260 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vkvb\" (UniqueName: \"kubernetes.io/projected/7a8cdd6d-befe-47b1-b26f-965fbc647be0-kube-api-access-9vkvb\") pod \"multus-additional-cni-plugins-4ddzn\" (UID: \"7a8cdd6d-befe-47b1-b26f-965fbc647be0\") " pod="openshift-multus/multus-additional-cni-plugins-4ddzn" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.255824 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-4zbb8" Jan 22 13:46:28 crc kubenswrapper[4743]: W0122 13:46:28.267757 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a63ac85_9a00_4381_aa80_3da86d5483aa.slice/crio-3f9c74ce2756a86de9baffba9a5daa9410ee80eefcc0d5fb1ad55bd44f5e9656 WatchSource:0}: Error finding container 3f9c74ce2756a86de9baffba9a5daa9410ee80eefcc0d5fb1ad55bd44f5e9656: Status 404 returned error can't find the container with id 3f9c74ce2756a86de9baffba9a5daa9410ee80eefcc0d5fb1ad55bd44f5e9656 Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.279643 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.292920 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4ddzn" Jan 22 13:46:28 crc kubenswrapper[4743]: W0122 13:46:28.296958 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3aeba9ba_3a5a_4885_8540_d295aadb311b.slice/crio-d9103a8306cce9e535711339fbf6b907c618782d872e9931d31fd9701bbc683e WatchSource:0}: Error finding container d9103a8306cce9e535711339fbf6b907c618782d872e9931d31fd9701bbc683e: Status 404 returned error can't find the container with id d9103a8306cce9e535711339fbf6b907c618782d872e9931d31fd9701bbc683e Jan 22 13:46:28 crc kubenswrapper[4743]: W0122 13:46:28.308253 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a8cdd6d_befe_47b1_b26f_965fbc647be0.slice/crio-80b1256564c2fc3eaa35bb636df833a52957315b9198850e23bbcdbf1961f693 WatchSource:0}: Error finding container 80b1256564c2fc3eaa35bb636df833a52957315b9198850e23bbcdbf1961f693: Status 404 returned error can't find the container with id 80b1256564c2fc3eaa35bb636df833a52957315b9198850e23bbcdbf1961f693 Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.319549 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-host-kubelet\") pod \"ovnkube-node-gcj8q\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.319583 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-host-run-netns\") pod \"ovnkube-node-gcj8q\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.319599 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1504d62a-81aa-4a1d-8fda-ef01376adcaa-ovnkube-config\") pod \"ovnkube-node-gcj8q\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.319629 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44vjn\" (UniqueName: \"kubernetes.io/projected/1504d62a-81aa-4a1d-8fda-ef01376adcaa-kube-api-access-44vjn\") pod \"ovnkube-node-gcj8q\" (UID: 
\"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.319650 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-host-cni-bin\") pod \"ovnkube-node-gcj8q\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.319669 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/cee8391d-ed52-4e21-b9e9-b55b77407c85-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-sk9dw\" (UID: \"cee8391d-ed52-4e21-b9e9-b55b77407c85\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sk9dw" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.319694 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cee8391d-ed52-4e21-b9e9-b55b77407c85-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-sk9dw\" (UID: \"cee8391d-ed52-4e21-b9e9-b55b77407c85\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sk9dw" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.319695 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-host-kubelet\") pod \"ovnkube-node-gcj8q\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.319713 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-host-run-netns\") pod \"ovnkube-node-gcj8q\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.319717 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1504d62a-81aa-4a1d-8fda-ef01376adcaa-ovnkube-script-lib\") pod \"ovnkube-node-gcj8q\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.319796 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-run-systemd\") pod \"ovnkube-node-gcj8q\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.319822 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-var-lib-openvswitch\") pod \"ovnkube-node-gcj8q\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.319845 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-systemd-units\") pod \"ovnkube-node-gcj8q\" (UID: 
\"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.319860 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-host-run-ovn-kubernetes\") pod \"ovnkube-node-gcj8q\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.319876 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cee8391d-ed52-4e21-b9e9-b55b77407c85-service-ca\") pod \"cluster-version-operator-5c965bbfc6-sk9dw\" (UID: \"cee8391d-ed52-4e21-b9e9-b55b77407c85\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sk9dw" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.319898 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-etc-openvswitch\") pod \"ovnkube-node-gcj8q\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.319914 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-run-openvswitch\") pod \"ovnkube-node-gcj8q\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.319932 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1504d62a-81aa-4a1d-8fda-ef01376adcaa-env-overrides\") pod \"ovnkube-node-gcj8q\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.319948 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-node-log\") pod \"ovnkube-node-gcj8q\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.319963 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-log-socket\") pod \"ovnkube-node-gcj8q\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.319976 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/cee8391d-ed52-4e21-b9e9-b55b77407c85-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-sk9dw\" (UID: \"cee8391d-ed52-4e21-b9e9-b55b77407c85\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sk9dw" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.319991 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-host-cni-bin\") pod \"ovnkube-node-gcj8q\" (UID: 
\"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.319997 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-host-cni-netd\") pod \"ovnkube-node-gcj8q\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.320023 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/cee8391d-ed52-4e21-b9e9-b55b77407c85-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-sk9dw\" (UID: \"cee8391d-ed52-4e21-b9e9-b55b77407c85\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sk9dw" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.320044 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-run-ovn\") pod \"ovnkube-node-gcj8q\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.320059 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gcj8q\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.320078 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cee8391d-ed52-4e21-b9e9-b55b77407c85-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-sk9dw\" (UID: \"cee8391d-ed52-4e21-b9e9-b55b77407c85\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sk9dw" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.320094 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-run-openvswitch\") pod \"ovnkube-node-gcj8q\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.320096 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-host-slash\") pod \"ovnkube-node-gcj8q\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.320118 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-host-slash\") pod \"ovnkube-node-gcj8q\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.320123 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1504d62a-81aa-4a1d-8fda-ef01376adcaa-ovn-node-metrics-cert\") pod 
\"ovnkube-node-gcj8q\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.320418 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-systemd-units\") pod \"ovnkube-node-gcj8q\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.320458 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-run-systemd\") pod \"ovnkube-node-gcj8q\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.320486 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-var-lib-openvswitch\") pod \"ovnkube-node-gcj8q\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.320492 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1504d62a-81aa-4a1d-8fda-ef01376adcaa-ovnkube-config\") pod \"ovnkube-node-gcj8q\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.320510 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-etc-openvswitch\") pod \"ovnkube-node-gcj8q\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.320528 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1504d62a-81aa-4a1d-8fda-ef01376adcaa-ovnkube-script-lib\") pod \"ovnkube-node-gcj8q\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.320546 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-node-log\") pod \"ovnkube-node-gcj8q\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.320552 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-host-cni-netd\") pod \"ovnkube-node-gcj8q\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.320570 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-log-socket\") pod \"ovnkube-node-gcj8q\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 
13:46:28.320586 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1504d62a-81aa-4a1d-8fda-ef01376adcaa-env-overrides\") pod \"ovnkube-node-gcj8q\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.320596 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/cee8391d-ed52-4e21-b9e9-b55b77407c85-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-sk9dw\" (UID: \"cee8391d-ed52-4e21-b9e9-b55b77407c85\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sk9dw" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.320588 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gcj8q\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.320607 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-run-ovn\") pod \"ovnkube-node-gcj8q\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.320629 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-host-run-ovn-kubernetes\") pod \"ovnkube-node-gcj8q\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.321041 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cee8391d-ed52-4e21-b9e9-b55b77407c85-service-ca\") pod \"cluster-version-operator-5c965bbfc6-sk9dw\" (UID: \"cee8391d-ed52-4e21-b9e9-b55b77407c85\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sk9dw" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.324468 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1504d62a-81aa-4a1d-8fda-ef01376adcaa-ovn-node-metrics-cert\") pod \"ovnkube-node-gcj8q\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.338853 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cee8391d-ed52-4e21-b9e9-b55b77407c85-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-sk9dw\" (UID: \"cee8391d-ed52-4e21-b9e9-b55b77407c85\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sk9dw" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.343453 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44vjn\" (UniqueName: \"kubernetes.io/projected/1504d62a-81aa-4a1d-8fda-ef01376adcaa-kube-api-access-44vjn\") pod \"ovnkube-node-gcj8q\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" Jan 
22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.385099 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.552374 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zq9qj"] Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.552782 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zq9qj" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.556655 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.561297 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.721612 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.724308 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ae9ad2e0-8b28-4352-8ced-7133f8b1c88d-metrics-certs\") pod \"network-metrics-daemon-sljgz\" (UID: \"ae9ad2e0-8b28-4352-8ced-7133f8b1c88d\") " pod="openshift-multus/network-metrics-daemon-sljgz" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.724355 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/98408bea-babc-486a-bfd2-aaf2067eae2f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-zq9qj\" (UID: \"98408bea-babc-486a-bfd2-aaf2067eae2f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zq9qj" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.724396 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcfxg\" (UniqueName: \"kubernetes.io/projected/98408bea-babc-486a-bfd2-aaf2067eae2f-kube-api-access-dcfxg\") pod \"ovnkube-control-plane-749d76644c-zq9qj\" (UID: \"98408bea-babc-486a-bfd2-aaf2067eae2f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zq9qj" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.724425 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/98408bea-babc-486a-bfd2-aaf2067eae2f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-zq9qj\" (UID: \"98408bea-babc-486a-bfd2-aaf2067eae2f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zq9qj" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.724442 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/98408bea-babc-486a-bfd2-aaf2067eae2f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-zq9qj\" (UID: \"98408bea-babc-486a-bfd2-aaf2067eae2f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zq9qj" Jan 22 13:46:28 crc kubenswrapper[4743]: E0122 13:46:28.724549 4743 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" 
not registered Jan 22 13:46:28 crc kubenswrapper[4743]: E0122 13:46:28.724591 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae9ad2e0-8b28-4352-8ced-7133f8b1c88d-metrics-certs podName:ae9ad2e0-8b28-4352-8ced-7133f8b1c88d nodeName:}" failed. No retries permitted until 2026-01-22 13:46:29.724575163 +0000 UTC m=+26.279618326 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ae9ad2e0-8b28-4352-8ced-7133f8b1c88d-metrics-certs") pod "network-metrics-daemon-sljgz" (UID: "ae9ad2e0-8b28-4352-8ced-7133f8b1c88d") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.778240 4743 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-01-22 13:41:27 +0000 UTC, rotation deadline is 2026-11-03 13:34:11.304069919 +0000 UTC Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.778324 4743 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6839h47m42.525749503s for next certificate rotation Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.825775 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/98408bea-babc-486a-bfd2-aaf2067eae2f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-zq9qj\" (UID: \"98408bea-babc-486a-bfd2-aaf2067eae2f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zq9qj" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.825893 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/98408bea-babc-486a-bfd2-aaf2067eae2f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-zq9qj\" (UID: \"98408bea-babc-486a-bfd2-aaf2067eae2f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zq9qj" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.825939 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcfxg\" (UniqueName: \"kubernetes.io/projected/98408bea-babc-486a-bfd2-aaf2067eae2f-kube-api-access-dcfxg\") pod \"ovnkube-control-plane-749d76644c-zq9qj\" (UID: \"98408bea-babc-486a-bfd2-aaf2067eae2f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zq9qj" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.825993 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/98408bea-babc-486a-bfd2-aaf2067eae2f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-zq9qj\" (UID: \"98408bea-babc-486a-bfd2-aaf2067eae2f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zq9qj" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.826578 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/98408bea-babc-486a-bfd2-aaf2067eae2f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-zq9qj\" (UID: \"98408bea-babc-486a-bfd2-aaf2067eae2f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zq9qj" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.826736 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/98408bea-babc-486a-bfd2-aaf2067eae2f-ovnkube-config\") pod 
\"ovnkube-control-plane-749d76644c-zq9qj\" (UID: \"98408bea-babc-486a-bfd2-aaf2067eae2f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zq9qj" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.830089 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/98408bea-babc-486a-bfd2-aaf2067eae2f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-zq9qj\" (UID: \"98408bea-babc-486a-bfd2-aaf2067eae2f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zq9qj" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.831927 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.836150 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/eb38e50b-7f75-4a79-9d18-673d43c87f84-serviceca\") pod \"node-ca-hp5b2\" (UID: \"eb38e50b-7f75-4a79-9d18-673d43c87f84\") " pod="openshift-image-registry/node-ca-hp5b2" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.842311 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcfxg\" (UniqueName: \"kubernetes.io/projected/98408bea-babc-486a-bfd2-aaf2067eae2f-kube-api-access-dcfxg\") pod \"ovnkube-control-plane-749d76644c-zq9qj\" (UID: \"98408bea-babc-486a-bfd2-aaf2067eae2f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zq9qj" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.859008 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.908999 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zq9qj" Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.915379 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Jan 22 13:46:28 crc kubenswrapper[4743]: W0122 13:46:28.920387 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98408bea_babc_486a_bfd2_aaf2067eae2f.slice/crio-823442615d3c859304e71af1394c84896cb5195dc4eee13d83ebe9289b5ccddd WatchSource:0}: Error finding container 823442615d3c859304e71af1394c84896cb5195dc4eee13d83ebe9289b5ccddd: Status 404 returned error can't find the container with id 823442615d3c859304e71af1394c84896cb5195dc4eee13d83ebe9289b5ccddd Jan 22 13:46:28 crc kubenswrapper[4743]: I0122 13:46:28.923378 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cee8391d-ed52-4e21-b9e9-b55b77407c85-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-sk9dw\" (UID: \"cee8391d-ed52-4e21-b9e9-b55b77407c85\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sk9dw" Jan 22 13:46:29 crc kubenswrapper[4743]: I0122 13:46:29.059348 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 22 13:46:29 crc kubenswrapper[4743]: I0122 13:46:29.073398 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfbcr\" (UniqueName: \"kubernetes.io/projected/eb38e50b-7f75-4a79-9d18-673d43c87f84-kube-api-access-zfbcr\") pod \"node-ca-hp5b2\" (UID: \"eb38e50b-7f75-4a79-9d18-673d43c87f84\") " pod="openshift-image-registry/node-ca-hp5b2" Jan 22 13:46:29 crc kubenswrapper[4743]: I0122 13:46:29.080104 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-hp5b2" Jan 22 13:46:29 crc kubenswrapper[4743]: W0122 13:46:29.090710 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb38e50b_7f75_4a79_9d18_673d43c87f84.slice/crio-451a3134d1705968f8d8f5a8b8d93793ce13e9c3917383ddbd67489c8b8b9b26 WatchSource:0}: Error finding container 451a3134d1705968f8d8f5a8b8d93793ce13e9c3917383ddbd67489c8b8b9b26: Status 404 returned error can't find the container with id 451a3134d1705968f8d8f5a8b8d93793ce13e9c3917383ddbd67489c8b8b9b26 Jan 22 13:46:29 crc kubenswrapper[4743]: I0122 13:46:29.150928 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jan 22 13:46:29 crc kubenswrapper[4743]: I0122 13:46:29.155322 4743 generic.go:334] "Generic (PLEG): container finished" podID="7a8cdd6d-befe-47b1-b26f-965fbc647be0" containerID="571202e14924575a2748d0c1ce62ecce456e04d65c53c979766c580b4620ac7f" exitCode=0 Jan 22 13:46:29 crc kubenswrapper[4743]: I0122 13:46:29.155375 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4ddzn" event={"ID":"7a8cdd6d-befe-47b1-b26f-965fbc647be0","Type":"ContainerDied","Data":"571202e14924575a2748d0c1ce62ecce456e04d65c53c979766c580b4620ac7f"} Jan 22 13:46:29 crc kubenswrapper[4743]: I0122 13:46:29.155400 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4ddzn" event={"ID":"7a8cdd6d-befe-47b1-b26f-965fbc647be0","Type":"ContainerStarted","Data":"80b1256564c2fc3eaa35bb636df833a52957315b9198850e23bbcdbf1961f693"} Jan 22 13:46:29 crc kubenswrapper[4743]: I0122 13:46:29.157087 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sk9dw" Jan 22 13:46:29 crc kubenswrapper[4743]: I0122 13:46:29.160681 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-2kvgp" event={"ID":"5908c22b-d26e-4d00-a28c-07ae003f4ff2","Type":"ContainerStarted","Data":"eb6c216ace7f5aa1f009065941ae0e34a08d8c91d69396447b063f6950f77867"} Jan 22 13:46:29 crc kubenswrapper[4743]: I0122 13:46:29.166673 4743 generic.go:334] "Generic (PLEG): container finished" podID="1504d62a-81aa-4a1d-8fda-ef01376adcaa" containerID="a53ea049b61685f1ee1b8026b5f81690e8b464b57330474a3c2fc12b808c2f99" exitCode=0 Jan 22 13:46:29 crc kubenswrapper[4743]: I0122 13:46:29.166717 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" event={"ID":"1504d62a-81aa-4a1d-8fda-ef01376adcaa","Type":"ContainerDied","Data":"a53ea049b61685f1ee1b8026b5f81690e8b464b57330474a3c2fc12b808c2f99"} Jan 22 13:46:29 crc kubenswrapper[4743]: I0122 13:46:29.166735 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" event={"ID":"1504d62a-81aa-4a1d-8fda-ef01376adcaa","Type":"ContainerStarted","Data":"ad0595f9ab7c6ea02a7417522599a4911280efff04f57fbd2a351a5626af8010"} Jan 22 13:46:29 crc kubenswrapper[4743]: I0122 13:46:29.168457 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" event={"ID":"3aeba9ba-3a5a-4885-8540-d295aadb311b","Type":"ContainerStarted","Data":"9e61502183ec871f5e9f6160f663bfb0593ea74e0ae5148a23efc9f353685283"} Jan 22 13:46:29 crc kubenswrapper[4743]: I0122 13:46:29.168500 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" event={"ID":"3aeba9ba-3a5a-4885-8540-d295aadb311b","Type":"ContainerStarted","Data":"0eb4f008bbd0d78e0714bf887f00c966ce6e2b4e9accca387b4a31abb51cd001"} Jan 22 13:46:29 crc kubenswrapper[4743]: I0122 13:46:29.168509 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" event={"ID":"3aeba9ba-3a5a-4885-8540-d295aadb311b","Type":"ContainerStarted","Data":"d9103a8306cce9e535711339fbf6b907c618782d872e9931d31fd9701bbc683e"} Jan 22 13:46:29 crc kubenswrapper[4743]: I0122 13:46:29.191609 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4zbb8" event={"ID":"1a63ac85-9a00-4381-aa80-3da86d5483aa","Type":"ContainerStarted","Data":"7dd46882286eccccfdb5bf23792e79b22eeb5cdfb9ff66abb5c4990b365a1822"} Jan 22 13:46:29 crc kubenswrapper[4743]: I0122 13:46:29.191650 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4zbb8" event={"ID":"1a63ac85-9a00-4381-aa80-3da86d5483aa","Type":"ContainerStarted","Data":"3f9c74ce2756a86de9baffba9a5daa9410ee80eefcc0d5fb1ad55bd44f5e9656"} Jan 22 13:46:29 crc kubenswrapper[4743]: I0122 13:46:29.192459 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-hp5b2" event={"ID":"eb38e50b-7f75-4a79-9d18-673d43c87f84","Type":"ContainerStarted","Data":"451a3134d1705968f8d8f5a8b8d93793ce13e9c3917383ddbd67489c8b8b9b26"} Jan 22 13:46:29 crc kubenswrapper[4743]: I0122 13:46:29.193472 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zq9qj" 
event={"ID":"98408bea-babc-486a-bfd2-aaf2067eae2f","Type":"ContainerStarted","Data":"823442615d3c859304e71af1394c84896cb5195dc4eee13d83ebe9289b5ccddd"} Jan 22 13:46:29 crc kubenswrapper[4743]: W0122 13:46:29.196358 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcee8391d_ed52_4e21_b9e9_b55b77407c85.slice/crio-1eeeaab6692c67a310f67dd1827e538e618dd0664825298b98cd7dfda1944d38 WatchSource:0}: Error finding container 1eeeaab6692c67a310f67dd1827e538e618dd0664825298b98cd7dfda1944d38: Status 404 returned error can't find the container with id 1eeeaab6692c67a310f67dd1827e538e618dd0664825298b98cd7dfda1944d38 Jan 22 13:46:29 crc kubenswrapper[4743]: I0122 13:46:29.211640 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podStartSLOduration=2.211622785 podStartE2EDuration="2.211622785s" podCreationTimestamp="2026-01-22 13:46:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:46:29.20088066 +0000 UTC m=+25.755923823" watchObservedRunningTime="2026-01-22 13:46:29.211622785 +0000 UTC m=+25.766665938" Jan 22 13:46:29 crc kubenswrapper[4743]: I0122 13:46:29.239555 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-2kvgp" podStartSLOduration=2.2395371 podStartE2EDuration="2.2395371s" podCreationTimestamp="2026-01-22 13:46:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:46:29.212472548 +0000 UTC m=+25.767515721" watchObservedRunningTime="2026-01-22 13:46:29.2395371 +0000 UTC m=+25.794580263" Jan 22 13:46:29 crc kubenswrapper[4743]: I0122 13:46:29.252762 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-4zbb8" podStartSLOduration=2.252743402 podStartE2EDuration="2.252743402s" podCreationTimestamp="2026-01-22 13:46:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:46:29.251239771 +0000 UTC m=+25.806282954" watchObservedRunningTime="2026-01-22 13:46:29.252743402 +0000 UTC m=+25.807786565" Jan 22 13:46:29 crc kubenswrapper[4743]: I0122 13:46:29.737430 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ae9ad2e0-8b28-4352-8ced-7133f8b1c88d-metrics-certs\") pod \"network-metrics-daemon-sljgz\" (UID: \"ae9ad2e0-8b28-4352-8ced-7133f8b1c88d\") " pod="openshift-multus/network-metrics-daemon-sljgz" Jan 22 13:46:29 crc kubenswrapper[4743]: E0122 13:46:29.737775 4743 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 22 13:46:29 crc kubenswrapper[4743]: E0122 13:46:29.737893 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae9ad2e0-8b28-4352-8ced-7133f8b1c88d-metrics-certs podName:ae9ad2e0-8b28-4352-8ced-7133f8b1c88d nodeName:}" failed. No retries permitted until 2026-01-22 13:46:31.737870511 +0000 UTC m=+28.292913734 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ae9ad2e0-8b28-4352-8ced-7133f8b1c88d-metrics-certs") pod "network-metrics-daemon-sljgz" (UID: "ae9ad2e0-8b28-4352-8ced-7133f8b1c88d") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 22 13:46:29 crc kubenswrapper[4743]: I0122 13:46:29.746998 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 13:46:29 crc kubenswrapper[4743]: I0122 13:46:29.746998 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 13:46:29 crc kubenswrapper[4743]: I0122 13:46:29.747106 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 13:46:29 crc kubenswrapper[4743]: I0122 13:46:29.747218 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sljgz" Jan 22 13:46:29 crc kubenswrapper[4743]: E0122 13:46:29.747250 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 22 13:46:29 crc kubenswrapper[4743]: E0122 13:46:29.747293 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 22 13:46:29 crc kubenswrapper[4743]: E0122 13:46:29.747386 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 22 13:46:29 crc kubenswrapper[4743]: E0122 13:46:29.747459 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sljgz" podUID="ae9ad2e0-8b28-4352-8ced-7133f8b1c88d" Jan 22 13:46:29 crc kubenswrapper[4743]: I0122 13:46:29.838194 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 13:46:29 crc kubenswrapper[4743]: E0122 13:46:29.838301 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 13:46:37.838283344 +0000 UTC m=+34.393326507 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:29 crc kubenswrapper[4743]: I0122 13:46:29.838329 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 13:46:29 crc kubenswrapper[4743]: I0122 13:46:29.838381 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 13:46:29 crc kubenswrapper[4743]: E0122 13:46:29.838484 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 22 13:46:29 crc kubenswrapper[4743]: E0122 13:46:29.838486 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 22 13:46:29 crc kubenswrapper[4743]: E0122 13:46:29.838498 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 22 13:46:29 crc kubenswrapper[4743]: E0122 13:46:29.838506 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 22 13:46:29 crc kubenswrapper[4743]: E0122 13:46:29.838508 4743 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 13:46:29 crc 
kubenswrapper[4743]: E0122 13:46:29.838516 4743 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 13:46:29 crc kubenswrapper[4743]: E0122 13:46:29.838547 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-22 13:46:37.838539871 +0000 UTC m=+34.393583034 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 13:46:29 crc kubenswrapper[4743]: E0122 13:46:29.838559 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-22 13:46:37.838553571 +0000 UTC m=+34.393596734 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 13:46:29 crc kubenswrapper[4743]: I0122 13:46:29.938843 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 13:46:29 crc kubenswrapper[4743]: I0122 13:46:29.938886 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 13:46:29 crc kubenswrapper[4743]: E0122 13:46:29.939035 4743 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 22 13:46:29 crc kubenswrapper[4743]: E0122 13:46:29.939119 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-22 13:46:37.939096258 +0000 UTC m=+34.494139421 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 22 13:46:29 crc kubenswrapper[4743]: E0122 13:46:29.939166 4743 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 22 13:46:29 crc kubenswrapper[4743]: E0122 13:46:29.939218 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-22 13:46:37.939199191 +0000 UTC m=+34.494242424 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 22 13:46:30 crc kubenswrapper[4743]: I0122 13:46:30.198091 4743 generic.go:334] "Generic (PLEG): container finished" podID="7a8cdd6d-befe-47b1-b26f-965fbc647be0" containerID="ae3f83a518a9e0485afc3bd4cc525d818bb2c825cc9149608c0cab27bc4588ae" exitCode=0 Jan 22 13:46:30 crc kubenswrapper[4743]: I0122 13:46:30.198169 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4ddzn" event={"ID":"7a8cdd6d-befe-47b1-b26f-965fbc647be0","Type":"ContainerDied","Data":"ae3f83a518a9e0485afc3bd4cc525d818bb2c825cc9149608c0cab27bc4588ae"} Jan 22 13:46:30 crc kubenswrapper[4743]: I0122 13:46:30.200248 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sk9dw" event={"ID":"cee8391d-ed52-4e21-b9e9-b55b77407c85","Type":"ContainerStarted","Data":"9c3aedc3c2a988083b5905124a5029346d4cfa16441008adf2c8d7f87eb1a50b"} Jan 22 13:46:30 crc kubenswrapper[4743]: I0122 13:46:30.200277 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sk9dw" event={"ID":"cee8391d-ed52-4e21-b9e9-b55b77407c85","Type":"ContainerStarted","Data":"1eeeaab6692c67a310f67dd1827e538e618dd0664825298b98cd7dfda1944d38"} Jan 22 13:46:30 crc kubenswrapper[4743]: I0122 13:46:30.203628 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" event={"ID":"1504d62a-81aa-4a1d-8fda-ef01376adcaa","Type":"ContainerStarted","Data":"494c239f2078673913ba0c50c50da35d43c19b0b5ac2dbc1009d58e12517a489"} Jan 22 13:46:30 crc kubenswrapper[4743]: I0122 13:46:30.203675 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" event={"ID":"1504d62a-81aa-4a1d-8fda-ef01376adcaa","Type":"ContainerStarted","Data":"c06826420179055b6073b3e2c057cc03f5bb5fe835cf28bdaceff76b2e27dd91"} Jan 22 13:46:30 crc kubenswrapper[4743]: I0122 13:46:30.203688 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" event={"ID":"1504d62a-81aa-4a1d-8fda-ef01376adcaa","Type":"ContainerStarted","Data":"33045af36991196fc7ef8653955e8c7bf85c1952f56de2c82539d65f0adba74f"} Jan 22 
13:46:30 crc kubenswrapper[4743]: I0122 13:46:30.203702 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" event={"ID":"1504d62a-81aa-4a1d-8fda-ef01376adcaa","Type":"ContainerStarted","Data":"746556e3d4d56ede14ab5d860938e7a65a200f86fc80ff35b0fff11172c8d328"} Jan 22 13:46:30 crc kubenswrapper[4743]: I0122 13:46:30.203715 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" event={"ID":"1504d62a-81aa-4a1d-8fda-ef01376adcaa","Type":"ContainerStarted","Data":"0e41c9ee9b261451ae9312a5ad5a028818acd69b9c85b7eece4f3c06d2e58551"} Jan 22 13:46:30 crc kubenswrapper[4743]: I0122 13:46:30.203727 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" event={"ID":"1504d62a-81aa-4a1d-8fda-ef01376adcaa","Type":"ContainerStarted","Data":"e031fff834aec6b8b2c7ed47784111b062e0b9601d637eba12860e79e4c3f6fa"} Jan 22 13:46:30 crc kubenswrapper[4743]: I0122 13:46:30.205507 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zq9qj" event={"ID":"98408bea-babc-486a-bfd2-aaf2067eae2f","Type":"ContainerStarted","Data":"3da4edd1691e1c26443adb8389c8e14f2d07c3ccd34362804bd145c192d6f3d0"} Jan 22 13:46:30 crc kubenswrapper[4743]: I0122 13:46:30.205546 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zq9qj" event={"ID":"98408bea-babc-486a-bfd2-aaf2067eae2f","Type":"ContainerStarted","Data":"3a399857d7fd1d81ed454c541705b2f6cba2e5827d5ca87523f36bb4f32701f3"} Jan 22 13:46:30 crc kubenswrapper[4743]: I0122 13:46:30.207338 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-hp5b2" event={"ID":"eb38e50b-7f75-4a79-9d18-673d43c87f84","Type":"ContainerStarted","Data":"605e4bbf46e81181425205bfc5384d2404213cbf4010c90ce2b2c739b368da6d"} Jan 22 13:46:30 crc kubenswrapper[4743]: I0122 13:46:30.234932 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-hp5b2" podStartSLOduration=3.234908207 podStartE2EDuration="3.234908207s" podCreationTimestamp="2026-01-22 13:46:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:46:30.234248889 +0000 UTC m=+26.789292062" watchObservedRunningTime="2026-01-22 13:46:30.234908207 +0000 UTC m=+26.789951400" Jan 22 13:46:30 crc kubenswrapper[4743]: I0122 13:46:30.250491 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sk9dw" podStartSLOduration=3.250474354 podStartE2EDuration="3.250474354s" podCreationTimestamp="2026-01-22 13:46:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:46:30.249349653 +0000 UTC m=+26.804392816" watchObservedRunningTime="2026-01-22 13:46:30.250474354 +0000 UTC m=+26.805517517" Jan 22 13:46:30 crc kubenswrapper[4743]: I0122 13:46:30.265510 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zq9qj" podStartSLOduration=2.265488415 podStartE2EDuration="2.265488415s" podCreationTimestamp="2026-01-22 13:46:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-01-22 13:46:30.263699426 +0000 UTC m=+26.818742589" watchObservedRunningTime="2026-01-22 13:46:30.265488415 +0000 UTC m=+26.820531578" Jan 22 13:46:31 crc kubenswrapper[4743]: I0122 13:46:31.212450 4743 generic.go:334] "Generic (PLEG): container finished" podID="7a8cdd6d-befe-47b1-b26f-965fbc647be0" containerID="22ce7a8619fe45d73016667b2b40a95b424493a800c364ce8f5d9c5ee2bacc49" exitCode=0 Jan 22 13:46:31 crc kubenswrapper[4743]: I0122 13:46:31.212519 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4ddzn" event={"ID":"7a8cdd6d-befe-47b1-b26f-965fbc647be0","Type":"ContainerDied","Data":"22ce7a8619fe45d73016667b2b40a95b424493a800c364ce8f5d9c5ee2bacc49"} Jan 22 13:46:31 crc kubenswrapper[4743]: I0122 13:46:31.746505 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 13:46:31 crc kubenswrapper[4743]: I0122 13:46:31.746584 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 13:46:31 crc kubenswrapper[4743]: I0122 13:46:31.746594 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sljgz" Jan 22 13:46:31 crc kubenswrapper[4743]: I0122 13:46:31.746522 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 13:46:31 crc kubenswrapper[4743]: E0122 13:46:31.746663 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 22 13:46:31 crc kubenswrapper[4743]: E0122 13:46:31.746707 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 22 13:46:31 crc kubenswrapper[4743]: E0122 13:46:31.746840 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sljgz" podUID="ae9ad2e0-8b28-4352-8ced-7133f8b1c88d" Jan 22 13:46:31 crc kubenswrapper[4743]: E0122 13:46:31.746921 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 22 13:46:31 crc kubenswrapper[4743]: I0122 13:46:31.758629 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ae9ad2e0-8b28-4352-8ced-7133f8b1c88d-metrics-certs\") pod \"network-metrics-daemon-sljgz\" (UID: \"ae9ad2e0-8b28-4352-8ced-7133f8b1c88d\") " pod="openshift-multus/network-metrics-daemon-sljgz" Jan 22 13:46:31 crc kubenswrapper[4743]: E0122 13:46:31.758831 4743 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 22 13:46:31 crc kubenswrapper[4743]: E0122 13:46:31.758909 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae9ad2e0-8b28-4352-8ced-7133f8b1c88d-metrics-certs podName:ae9ad2e0-8b28-4352-8ced-7133f8b1c88d nodeName:}" failed. No retries permitted until 2026-01-22 13:46:35.758889045 +0000 UTC m=+32.313932208 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ae9ad2e0-8b28-4352-8ced-7133f8b1c88d-metrics-certs") pod "network-metrics-daemon-sljgz" (UID: "ae9ad2e0-8b28-4352-8ced-7133f8b1c88d") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 22 13:46:32 crc kubenswrapper[4743]: I0122 13:46:32.218120 4743 generic.go:334] "Generic (PLEG): container finished" podID="7a8cdd6d-befe-47b1-b26f-965fbc647be0" containerID="62ca247566d2ba58b86a42e6982ec864864747428f135098f91abd55aa6db4d7" exitCode=0 Jan 22 13:46:32 crc kubenswrapper[4743]: I0122 13:46:32.218160 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4ddzn" event={"ID":"7a8cdd6d-befe-47b1-b26f-965fbc647be0","Type":"ContainerDied","Data":"62ca247566d2ba58b86a42e6982ec864864747428f135098f91abd55aa6db4d7"} Jan 22 13:46:33 crc kubenswrapper[4743]: I0122 13:46:33.223673 4743 generic.go:334] "Generic (PLEG): container finished" podID="7a8cdd6d-befe-47b1-b26f-965fbc647be0" containerID="be4e9edaa02351793d05a3d68ba58bae5a3be8e7420dd275ba8739464cf4e45f" exitCode=0 Jan 22 13:46:33 crc kubenswrapper[4743]: I0122 13:46:33.223753 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4ddzn" event={"ID":"7a8cdd6d-befe-47b1-b26f-965fbc647be0","Type":"ContainerDied","Data":"be4e9edaa02351793d05a3d68ba58bae5a3be8e7420dd275ba8739464cf4e45f"} Jan 22 13:46:33 crc kubenswrapper[4743]: I0122 13:46:33.227154 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" event={"ID":"1504d62a-81aa-4a1d-8fda-ef01376adcaa","Type":"ContainerStarted","Data":"5f88ce0bb57df24e738ba0d53bf159b512454030e7215342e0e8179b20e25a53"} Jan 22 13:46:33 crc kubenswrapper[4743]: I0122 13:46:33.597144 4743 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Jan 22 13:46:33 crc kubenswrapper[4743]: I0122 13:46:33.747069 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sljgz" Jan 22 13:46:33 crc kubenswrapper[4743]: I0122 13:46:33.747110 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 13:46:33 crc kubenswrapper[4743]: E0122 13:46:33.747184 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sljgz" podUID="ae9ad2e0-8b28-4352-8ced-7133f8b1c88d" Jan 22 13:46:33 crc kubenswrapper[4743]: I0122 13:46:33.747195 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 13:46:33 crc kubenswrapper[4743]: I0122 13:46:33.747075 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 13:46:33 crc kubenswrapper[4743]: E0122 13:46:33.747276 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 22 13:46:33 crc kubenswrapper[4743]: E0122 13:46:33.747893 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 22 13:46:33 crc kubenswrapper[4743]: E0122 13:46:33.753047 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 22 13:46:34 crc kubenswrapper[4743]: I0122 13:46:34.138751 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 13:46:34 crc kubenswrapper[4743]: I0122 13:46:34.232946 4743 generic.go:334] "Generic (PLEG): container finished" podID="7a8cdd6d-befe-47b1-b26f-965fbc647be0" containerID="91ac2c0ec6c6f7bb791964db1eee7d9b02b840c5666e180b219f4118cb49719e" exitCode=0 Jan 22 13:46:34 crc kubenswrapper[4743]: I0122 13:46:34.232982 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4ddzn" event={"ID":"7a8cdd6d-befe-47b1-b26f-965fbc647be0","Type":"ContainerDied","Data":"91ac2c0ec6c6f7bb791964db1eee7d9b02b840c5666e180b219f4118cb49719e"} Jan 22 13:46:35 crc kubenswrapper[4743]: I0122 13:46:35.241778 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4ddzn" event={"ID":"7a8cdd6d-befe-47b1-b26f-965fbc647be0","Type":"ContainerStarted","Data":"37166bc16b9e9860cfd8005097718676a5ce096afe9d337c9fe0758ddc9f782d"} Jan 22 13:46:35 crc kubenswrapper[4743]: I0122 13:46:35.747281 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 13:46:35 crc kubenswrapper[4743]: I0122 13:46:35.747391 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 13:46:35 crc kubenswrapper[4743]: I0122 13:46:35.747467 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sljgz" Jan 22 13:46:35 crc kubenswrapper[4743]: I0122 13:46:35.747542 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 13:46:35 crc kubenswrapper[4743]: E0122 13:46:35.747777 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 22 13:46:35 crc kubenswrapper[4743]: E0122 13:46:35.748607 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 22 13:46:35 crc kubenswrapper[4743]: E0122 13:46:35.748735 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sljgz" podUID="ae9ad2e0-8b28-4352-8ced-7133f8b1c88d" Jan 22 13:46:35 crc kubenswrapper[4743]: E0122 13:46:35.748894 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 22 13:46:35 crc kubenswrapper[4743]: I0122 13:46:35.801392 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ae9ad2e0-8b28-4352-8ced-7133f8b1c88d-metrics-certs\") pod \"network-metrics-daemon-sljgz\" (UID: \"ae9ad2e0-8b28-4352-8ced-7133f8b1c88d\") " pod="openshift-multus/network-metrics-daemon-sljgz" Jan 22 13:46:35 crc kubenswrapper[4743]: E0122 13:46:35.801554 4743 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 22 13:46:35 crc kubenswrapper[4743]: E0122 13:46:35.801649 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae9ad2e0-8b28-4352-8ced-7133f8b1c88d-metrics-certs podName:ae9ad2e0-8b28-4352-8ced-7133f8b1c88d nodeName:}" failed. No retries permitted until 2026-01-22 13:46:43.801626852 +0000 UTC m=+40.356670015 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ae9ad2e0-8b28-4352-8ced-7133f8b1c88d-metrics-certs") pod "network-metrics-daemon-sljgz" (UID: "ae9ad2e0-8b28-4352-8ced-7133f8b1c88d") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 22 13:46:36 crc kubenswrapper[4743]: I0122 13:46:36.251009 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" event={"ID":"1504d62a-81aa-4a1d-8fda-ef01376adcaa","Type":"ContainerStarted","Data":"6703ca8c5812f8cf4e820c640dc3da6b96d5035162f1bb4c0563efcd122edd2a"} Jan 22 13:46:36 crc kubenswrapper[4743]: I0122 13:46:36.251316 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" Jan 22 13:46:36 crc kubenswrapper[4743]: I0122 13:46:36.251360 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" Jan 22 13:46:36 crc kubenswrapper[4743]: I0122 13:46:36.275959 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-4ddzn" podStartSLOduration=9.275939874 podStartE2EDuration="9.275939874s" podCreationTimestamp="2026-01-22 13:46:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:46:36.275545364 +0000 UTC m=+32.830588537" watchObservedRunningTime="2026-01-22 13:46:36.275939874 +0000 UTC m=+32.830983047" Jan 22 13:46:36 crc kubenswrapper[4743]: I0122 13:46:36.283269 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" Jan 22 13:46:36 crc kubenswrapper[4743]: I0122 13:46:36.284296 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" Jan 22 13:46:36 crc kubenswrapper[4743]: I0122 13:46:36.301057 4743 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" podStartSLOduration=8.301030192 podStartE2EDuration="8.301030192s" podCreationTimestamp="2026-01-22 13:46:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:46:36.29986211 +0000 UTC m=+32.854905283" watchObservedRunningTime="2026-01-22 13:46:36.301030192 +0000 UTC m=+32.856073355" Jan 22 13:46:37 crc kubenswrapper[4743]: I0122 13:46:37.253275 4743 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 22 13:46:37 crc kubenswrapper[4743]: I0122 13:46:37.342233 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-sljgz"] Jan 22 13:46:37 crc kubenswrapper[4743]: I0122 13:46:37.342392 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sljgz" Jan 22 13:46:37 crc kubenswrapper[4743]: E0122 13:46:37.342504 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sljgz" podUID="ae9ad2e0-8b28-4352-8ced-7133f8b1c88d" Jan 22 13:46:37 crc kubenswrapper[4743]: I0122 13:46:37.746734 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 13:46:37 crc kubenswrapper[4743]: I0122 13:46:37.746741 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 13:46:37 crc kubenswrapper[4743]: I0122 13:46:37.746826 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 13:46:37 crc kubenswrapper[4743]: E0122 13:46:37.746882 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 22 13:46:37 crc kubenswrapper[4743]: E0122 13:46:37.746953 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 22 13:46:37 crc kubenswrapper[4743]: E0122 13:46:37.747219 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 22 13:46:37 crc kubenswrapper[4743]: I0122 13:46:37.920459 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 13:46:37 crc kubenswrapper[4743]: I0122 13:46:37.920650 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 13:46:37 crc kubenswrapper[4743]: I0122 13:46:37.920699 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 13:46:37 crc kubenswrapper[4743]: E0122 13:46:37.920881 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 22 13:46:37 crc kubenswrapper[4743]: E0122 13:46:37.920906 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 22 13:46:37 crc kubenswrapper[4743]: E0122 13:46:37.920925 4743 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 13:46:37 crc kubenswrapper[4743]: E0122 13:46:37.920979 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-22 13:46:53.92096148 +0000 UTC m=+50.476004653 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 13:46:37 crc kubenswrapper[4743]: E0122 13:46:37.921054 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 13:46:53.921012282 +0000 UTC m=+50.476055445 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:37 crc kubenswrapper[4743]: E0122 13:46:37.921064 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 22 13:46:37 crc kubenswrapper[4743]: E0122 13:46:37.921108 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 22 13:46:37 crc kubenswrapper[4743]: E0122 13:46:37.921123 4743 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 13:46:37 crc kubenswrapper[4743]: E0122 13:46:37.921176 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-22 13:46:53.921168676 +0000 UTC m=+50.476211839 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 13:46:38 crc kubenswrapper[4743]: I0122 13:46:38.021687 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 13:46:38 crc kubenswrapper[4743]: I0122 13:46:38.021754 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 13:46:38 crc kubenswrapper[4743]: E0122 13:46:38.021848 4743 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 22 13:46:38 crc kubenswrapper[4743]: E0122 13:46:38.021933 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-01-22 13:46:54.021912028 +0000 UTC m=+50.576955271 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 22 13:46:38 crc kubenswrapper[4743]: E0122 13:46:38.021961 4743 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 22 13:46:38 crc kubenswrapper[4743]: E0122 13:46:38.022035 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-22 13:46:54.02201339 +0000 UTC m=+50.577056623 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 22 13:46:38 crc kubenswrapper[4743]: I0122 13:46:38.256836 4743 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 22 13:46:38 crc kubenswrapper[4743]: I0122 13:46:38.746929 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sljgz" Jan 22 13:46:38 crc kubenswrapper[4743]: E0122 13:46:38.747149 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sljgz" podUID="ae9ad2e0-8b28-4352-8ced-7133f8b1c88d" Jan 22 13:46:39 crc kubenswrapper[4743]: I0122 13:46:39.747115 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 13:46:39 crc kubenswrapper[4743]: I0122 13:46:39.747214 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 13:46:39 crc kubenswrapper[4743]: I0122 13:46:39.747161 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 13:46:39 crc kubenswrapper[4743]: E0122 13:46:39.747365 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 22 13:46:39 crc kubenswrapper[4743]: E0122 13:46:39.747636 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 22 13:46:39 crc kubenswrapper[4743]: E0122 13:46:39.747844 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.220516 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.220834 4743 kubelet_node_status.go:538] "Fast updating node status as it just became ready" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.258875 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-tff5x"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.259978 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-tff5x" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.260076 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6j5s8"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.260669 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6j5s8" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.262329 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-z5dt9"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.262883 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-z5dt9" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.263865 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-x7q7v"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.265039 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-g2ptk"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.265593 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-g2ptk" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.265054 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-x7q7v" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.268169 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fww5f"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.268440 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fww5f" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.283829 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-c65ml"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.284730 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c65ml" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.294299 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.295196 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.308747 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.309422 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.309524 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.309564 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.309650 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.309663 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.309740 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.309811 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.309745 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.310178 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.310608 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.311062 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.311183 
4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.311240 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.311291 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.311372 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.311412 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.311470 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.311554 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.311571 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.311704 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.311837 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.311965 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.312046 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.312198 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.312274 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.312514 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.311182 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.313140 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.313453 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.313612 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 
13:46:40.313951 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.314111 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-ln28w"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.314516 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.314627 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.314929 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-tdlw6"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.315520 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tdlw6" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.316132 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-ln28w" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.317070 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.317542 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.317679 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.317780 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.317990 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.319621 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.319850 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.320951 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-lpxk6"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.321581 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lpxk6" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.321624 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-hqfjq"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.322244 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-hqfjq" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.323165 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-bghq9"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.323546 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-bghq9" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.324461 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.326982 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-ksn26"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.327622 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-ksn26" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.328441 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-nkmcr"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.329552 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-nkmcr" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.330962 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.331301 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.331592 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z8gz4"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.331687 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.332128 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z8gz4" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.332158 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.332488 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.332616 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.332727 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.332749 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.333145 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.333162 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.333184 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.333292 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.333426 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.333292 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.333551 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.334062 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.334174 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.334266 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.334359 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.334857 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.336192 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-94xwn"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.337315 4743 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.337963 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.338086 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.338209 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.338672 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.339547 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.339743 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.339752 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.341369 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.350270 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.351266 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.352093 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.367885 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zcnlm"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.383996 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.384369 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.384711 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a11f3169-f731-464a-a7d4-9dea61d28398-trusted-ca-bundle\") pod \"console-f9d7485db-ln28w\" (UID: \"a11f3169-f731-464a-a7d4-9dea61d28398\") " pod="openshift-console/console-f9d7485db-ln28w" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.384807 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a11f3169-f731-464a-a7d4-9dea61d28398-oauth-serving-cert\") pod \"console-f9d7485db-ln28w\" (UID: \"a11f3169-f731-464a-a7d4-9dea61d28398\") " pod="openshift-console/console-f9d7485db-ln28w" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.385034 
4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a11f3169-f731-464a-a7d4-9dea61d28398-console-config\") pod \"console-f9d7485db-ln28w\" (UID: \"a11f3169-f731-464a-a7d4-9dea61d28398\") " pod="openshift-console/console-f9d7485db-ln28w" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.385121 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a11f3169-f731-464a-a7d4-9dea61d28398-service-ca\") pod \"console-f9d7485db-ln28w\" (UID: \"a11f3169-f731-464a-a7d4-9dea61d28398\") " pod="openshift-console/console-f9d7485db-ln28w" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.385159 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a11f3169-f731-464a-a7d4-9dea61d28398-console-oauth-config\") pod \"console-f9d7485db-ln28w\" (UID: \"a11f3169-f731-464a-a7d4-9dea61d28398\") " pod="openshift-console/console-f9d7485db-ln28w" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.385177 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2fmd\" (UniqueName: \"kubernetes.io/projected/a11f3169-f731-464a-a7d4-9dea61d28398-kube-api-access-m2fmd\") pod \"console-f9d7485db-ln28w\" (UID: \"a11f3169-f731-464a-a7d4-9dea61d28398\") " pod="openshift-console/console-f9d7485db-ln28w" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.385206 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a11f3169-f731-464a-a7d4-9dea61d28398-console-serving-cert\") pod \"console-f9d7485db-ln28w\" (UID: \"a11f3169-f731-464a-a7d4-9dea61d28398\") " pod="openshift-console/console-f9d7485db-ln28w" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.387431 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.390716 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.391035 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.391386 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.391469 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.391517 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.391611 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-tff5x"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.391649 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-qgc45"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.391653 4743 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.391536 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.392111 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-qgc45" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.392423 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zcnlm" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.393090 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.393223 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.393878 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.394046 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.394501 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-9lmv4"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.394972 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-5kvkh"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.395347 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-s9lg4"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.395588 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-9lmv4" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.395692 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-s9lg4" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.396136 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5kvkh" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.396996 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jltxm"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.397599 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jltxm" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.404387 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.405026 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.405697 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.405843 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.406040 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-9dz5n"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.406638 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-lxmrq"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.407133 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lxmrq" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.407574 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-9dz5n" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.408339 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-r6d62"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.408676 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-r6d62" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.410834 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-25vhb"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.411345 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kvlk2"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.411688 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kvlk2" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.412573 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c9kq2"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.413310 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c9kq2" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.414159 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-25vhb" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.415439 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.416014 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-4qhbk"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.423561 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-bxlcj"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.424552 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bxlcj" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.424932 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4qhbk" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.428508 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.429084 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.429275 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.429926 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.435693 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ljzlp"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.436332 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-88brd"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.436586 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-z5dt9"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.436695 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-88brd" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.437119 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ljzlp" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.437338 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.469090 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.478396 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.499120 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.499312 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.499756 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a11f3169-f731-464a-a7d4-9dea61d28398-service-ca\") pod \"console-f9d7485db-ln28w\" (UID: \"a11f3169-f731-464a-a7d4-9dea61d28398\") " pod="openshift-console/console-f9d7485db-ln28w" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.500376 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/037eda14-3c3c-4b24-bb18-dea65e3e4548-client-ca\") pod \"controller-manager-879f6c89f-z5dt9\" (UID: \"037eda14-3c3c-4b24-bb18-dea65e3e4548\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z5dt9" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.500487 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rshf5\" (UniqueName: \"kubernetes.io/projected/fd9ae554-93cb-478a-8d60-d701586b31b9-kube-api-access-rshf5\") pod \"openshift-apiserver-operator-796bbdcf4f-fww5f\" (UID: \"fd9ae554-93cb-478a-8d60-d701586b31b9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fww5f" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.500538 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a11f3169-f731-464a-a7d4-9dea61d28398-console-oauth-config\") pod \"console-f9d7485db-ln28w\" (UID: \"a11f3169-f731-464a-a7d4-9dea61d28398\") " pod="openshift-console/console-f9d7485db-ln28w" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.500556 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2n9ts\" (UniqueName: \"kubernetes.io/projected/037eda14-3c3c-4b24-bb18-dea65e3e4548-kube-api-access-2n9ts\") pod \"controller-manager-879f6c89f-z5dt9\" (UID: \"037eda14-3c3c-4b24-bb18-dea65e3e4548\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z5dt9" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.500575 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wx6fg\" (UniqueName: \"kubernetes.io/projected/b46225f4-dd80-45ae-9ffa-310527d770fc-kube-api-access-wx6fg\") pod \"machine-api-operator-5694c8668f-g2ptk\" (UID: \"b46225f4-dd80-45ae-9ffa-310527d770fc\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-g2ptk" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.500601 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd9ae554-93cb-478a-8d60-d701586b31b9-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-fww5f\" (UID: \"fd9ae554-93cb-478a-8d60-d701586b31b9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fww5f" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.500623 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/b46225f4-dd80-45ae-9ffa-310527d770fc-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-g2ptk\" (UID: \"b46225f4-dd80-45ae-9ffa-310527d770fc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-g2ptk" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.500642 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a11f3169-f731-464a-a7d4-9dea61d28398-console-serving-cert\") pod \"console-f9d7485db-ln28w\" (UID: \"a11f3169-f731-464a-a7d4-9dea61d28398\") " pod="openshift-console/console-f9d7485db-ln28w" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.500659 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4kr6\" (UniqueName: \"kubernetes.io/projected/5fd90cbe-98a9-450d-b9e8-83bca7304155-kube-api-access-v4kr6\") pod \"cluster-image-registry-operator-dc59b4c8b-z8gz4\" (UID: \"5fd90cbe-98a9-450d-b9e8-83bca7304155\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z8gz4" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.500689 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wkps\" (UniqueName: \"kubernetes.io/projected/9c60ce0a-ae39-43c2-a305-3c83391df7cc-kube-api-access-6wkps\") pod \"console-operator-58897d9998-nkmcr\" (UID: \"9c60ce0a-ae39-43c2-a305-3c83391df7cc\") " pod="openshift-console-operator/console-operator-58897d9998-nkmcr" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.500717 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/037eda14-3c3c-4b24-bb18-dea65e3e4548-serving-cert\") pod \"controller-manager-879f6c89f-z5dt9\" (UID: \"037eda14-3c3c-4b24-bb18-dea65e3e4548\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z5dt9" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.500736 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/037eda14-3c3c-4b24-bb18-dea65e3e4548-config\") pod \"controller-manager-879f6c89f-z5dt9\" (UID: \"037eda14-3c3c-4b24-bb18-dea65e3e4548\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z5dt9" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.500751 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7cb3dbbd-43f2-49c7-9729-5acf3f400598-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-x7q7v\" (UID: \"7cb3dbbd-43f2-49c7-9729-5acf3f400598\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-x7q7v" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.500771 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b46225f4-dd80-45ae-9ffa-310527d770fc-config\") pod \"machine-api-operator-5694c8668f-g2ptk\" (UID: \"b46225f4-dd80-45ae-9ffa-310527d770fc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-g2ptk" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.500816 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/7a78c10f-79c4-417b-ae2b-3618d0f6c6cd-available-featuregates\") pod \"openshift-config-operator-7777fb866f-lpxk6\" (UID: \"7a78c10f-79c4-417b-ae2b-3618d0f6c6cd\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lpxk6" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.500837 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtn7p\" (UniqueName: \"kubernetes.io/projected/7cb3dbbd-43f2-49c7-9729-5acf3f400598-kube-api-access-qtn7p\") pod \"cluster-samples-operator-665b6dd947-x7q7v\" (UID: \"7cb3dbbd-43f2-49c7-9729-5acf3f400598\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-x7q7v" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.500862 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd9ae554-93cb-478a-8d60-d701586b31b9-config\") pod \"openshift-apiserver-operator-796bbdcf4f-fww5f\" (UID: \"fd9ae554-93cb-478a-8d60-d701586b31b9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fww5f" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.500886 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/5fd90cbe-98a9-450d-b9e8-83bca7304155-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-z8gz4\" (UID: \"5fd90cbe-98a9-450d-b9e8-83bca7304155\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z8gz4" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.500908 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c60ce0a-ae39-43c2-a305-3c83391df7cc-config\") pod \"console-operator-58897d9998-nkmcr\" (UID: \"9c60ce0a-ae39-43c2-a305-3c83391df7cc\") " pod="openshift-console-operator/console-operator-58897d9998-nkmcr" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.500929 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2fmd\" (UniqueName: \"kubernetes.io/projected/a11f3169-f731-464a-a7d4-9dea61d28398-kube-api-access-m2fmd\") pod \"console-f9d7485db-ln28w\" (UID: \"a11f3169-f731-464a-a7d4-9dea61d28398\") " pod="openshift-console/console-f9d7485db-ln28w" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.500950 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5fd90cbe-98a9-450d-b9e8-83bca7304155-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-z8gz4\" (UID: 
\"5fd90cbe-98a9-450d-b9e8-83bca7304155\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z8gz4" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.500965 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppmnl\" (UniqueName: \"kubernetes.io/projected/8134e970-25c9-4d3e-9cff-48a8b520b0da-kube-api-access-ppmnl\") pod \"downloads-7954f5f757-ksn26\" (UID: \"8134e970-25c9-4d3e-9cff-48a8b520b0da\") " pod="openshift-console/downloads-7954f5f757-ksn26" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.500983 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9c60ce0a-ae39-43c2-a305-3c83391df7cc-trusted-ca\") pod \"console-operator-58897d9998-nkmcr\" (UID: \"9c60ce0a-ae39-43c2-a305-3c83391df7cc\") " pod="openshift-console-operator/console-operator-58897d9998-nkmcr" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.500986 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a11f3169-f731-464a-a7d4-9dea61d28398-service-ca\") pod \"console-f9d7485db-ln28w\" (UID: \"a11f3169-f731-464a-a7d4-9dea61d28398\") " pod="openshift-console/console-f9d7485db-ln28w" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.501014 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5fd90cbe-98a9-450d-b9e8-83bca7304155-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-z8gz4\" (UID: \"5fd90cbe-98a9-450d-b9e8-83bca7304155\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z8gz4" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.501033 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a78c10f-79c4-417b-ae2b-3618d0f6c6cd-serving-cert\") pod \"openshift-config-operator-7777fb866f-lpxk6\" (UID: \"7a78c10f-79c4-417b-ae2b-3618d0f6c6cd\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lpxk6" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.501050 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b46225f4-dd80-45ae-9ffa-310527d770fc-images\") pod \"machine-api-operator-5694c8668f-g2ptk\" (UID: \"b46225f4-dd80-45ae-9ffa-310527d770fc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-g2ptk" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.501080 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a11f3169-f731-464a-a7d4-9dea61d28398-trusted-ca-bundle\") pod \"console-f9d7485db-ln28w\" (UID: \"a11f3169-f731-464a-a7d4-9dea61d28398\") " pod="openshift-console/console-f9d7485db-ln28w" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.501098 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a11f3169-f731-464a-a7d4-9dea61d28398-oauth-serving-cert\") pod \"console-f9d7485db-ln28w\" (UID: \"a11f3169-f731-464a-a7d4-9dea61d28398\") " pod="openshift-console/console-f9d7485db-ln28w" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.501115 4743 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/037eda14-3c3c-4b24-bb18-dea65e3e4548-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-z5dt9\" (UID: \"037eda14-3c3c-4b24-bb18-dea65e3e4548\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z5dt9" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.501132 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a11f3169-f731-464a-a7d4-9dea61d28398-console-config\") pod \"console-f9d7485db-ln28w\" (UID: \"a11f3169-f731-464a-a7d4-9dea61d28398\") " pod="openshift-console/console-f9d7485db-ln28w" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.501149 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fcj9\" (UniqueName: \"kubernetes.io/projected/7a78c10f-79c4-417b-ae2b-3618d0f6c6cd-kube-api-access-7fcj9\") pod \"openshift-config-operator-7777fb866f-lpxk6\" (UID: \"7a78c10f-79c4-417b-ae2b-3618d0f6c6cd\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lpxk6" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.501165 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c60ce0a-ae39-43c2-a305-3c83391df7cc-serving-cert\") pod \"console-operator-58897d9998-nkmcr\" (UID: \"9c60ce0a-ae39-43c2-a305-3c83391df7cc\") " pod="openshift-console-operator/console-operator-58897d9998-nkmcr" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.501294 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-fjjf5"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.501782 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-nbqnt"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.501834 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-fjjf5" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.501854 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.502260 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.502297 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-nbqnt" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.502442 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a11f3169-f731-464a-a7d4-9dea61d28398-trusted-ca-bundle\") pod \"console-f9d7485db-ln28w\" (UID: \"a11f3169-f731-464a-a7d4-9dea61d28398\") " pod="openshift-console/console-f9d7485db-ln28w" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.502992 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a11f3169-f731-464a-a7d4-9dea61d28398-oauth-serving-cert\") pod \"console-f9d7485db-ln28w\" (UID: \"a11f3169-f731-464a-a7d4-9dea61d28398\") " pod="openshift-console/console-f9d7485db-ln28w" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.503469 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29484825-rkvnc"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.503482 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a11f3169-f731-464a-a7d4-9dea61d28398-console-config\") pod \"console-f9d7485db-ln28w\" (UID: \"a11f3169-f731-464a-a7d4-9dea61d28398\") " pod="openshift-console/console-f9d7485db-ln28w" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.504076 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.504245 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.505415 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.505665 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.506555 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pgxdf"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.507197 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pgxdf" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.507439 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29484825-rkvnc" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.508661 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-knnk2"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.509345 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-bghq9"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.509444 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-knnk2" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.510072 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a11f3169-f731-464a-a7d4-9dea61d28398-console-oauth-config\") pod \"console-f9d7485db-ln28w\" (UID: \"a11f3169-f731-464a-a7d4-9dea61d28398\") " pod="openshift-console/console-f9d7485db-ln28w" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.510285 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a11f3169-f731-464a-a7d4-9dea61d28398-console-serving-cert\") pod \"console-f9d7485db-ln28w\" (UID: \"a11f3169-f731-464a-a7d4-9dea61d28398\") " pod="openshift-console/console-f9d7485db-ln28w" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.511352 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-9dz5n"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.513146 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fww5f"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.513455 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-x7q7v"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.516611 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-k57vs"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.517524 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jltxm"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.517545 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-tdlw6"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.517621 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-k57vs" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.519269 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6j5s8"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.520425 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-hhf5x"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.521142 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-hhf5x" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.523152 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.523590 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c9kq2"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.525693 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-nkmcr"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.526894 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-ln28w"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.528348 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-lpxk6"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.529725 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-25vhb"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.530763 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-qgc45"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.532424 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zcnlm"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.535414 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-x7rm8"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.536528 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-x7rm8" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.537053 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-nf29x"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.537970 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-nf29x" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.538493 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-xrxrn"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.539665 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-xrxrn" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.540032 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-g2ptk"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.541250 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-hqfjq"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.542566 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-bxlcj"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.542630 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.546020 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-lxmrq"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.548090 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-94xwn"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.550828 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-r6d62"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.556971 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-ksn26"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.558130 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-nf29x"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.559020 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-4qhbk"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.560124 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z8gz4"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.562103 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.562328 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29484825-rkvnc"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.564268 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ljzlp"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.565295 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-h26n4"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.565960 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-h26n4" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.566354 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-9lmv4"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.567406 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-88brd"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.568676 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-5kvkh"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.569754 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-knnk2"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.570843 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-nbqnt"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.572504 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kvlk2"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.574709 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-x7rm8"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.576189 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-h26n4"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.577739 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-fjjf5"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.579630 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pgxdf"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.581156 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-k57vs"] Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.582147 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.602015 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.602137 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/037eda14-3c3c-4b24-bb18-dea65e3e4548-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-z5dt9\" (UID: \"037eda14-3c3c-4b24-bb18-dea65e3e4548\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z5dt9" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.602193 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c60ce0a-ae39-43c2-a305-3c83391df7cc-serving-cert\") pod \"console-operator-58897d9998-nkmcr\" (UID: \"9c60ce0a-ae39-43c2-a305-3c83391df7cc\") " pod="openshift-console-operator/console-operator-58897d9998-nkmcr" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.602218 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fcj9\" (UniqueName: 
\"kubernetes.io/projected/7a78c10f-79c4-417b-ae2b-3618d0f6c6cd-kube-api-access-7fcj9\") pod \"openshift-config-operator-7777fb866f-lpxk6\" (UID: \"7a78c10f-79c4-417b-ae2b-3618d0f6c6cd\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lpxk6" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.602243 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/037eda14-3c3c-4b24-bb18-dea65e3e4548-client-ca\") pod \"controller-manager-879f6c89f-z5dt9\" (UID: \"037eda14-3c3c-4b24-bb18-dea65e3e4548\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z5dt9" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.602261 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rshf5\" (UniqueName: \"kubernetes.io/projected/fd9ae554-93cb-478a-8d60-d701586b31b9-kube-api-access-rshf5\") pod \"openshift-apiserver-operator-796bbdcf4f-fww5f\" (UID: \"fd9ae554-93cb-478a-8d60-d701586b31b9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fww5f" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.602300 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2n9ts\" (UniqueName: \"kubernetes.io/projected/037eda14-3c3c-4b24-bb18-dea65e3e4548-kube-api-access-2n9ts\") pod \"controller-manager-879f6c89f-z5dt9\" (UID: \"037eda14-3c3c-4b24-bb18-dea65e3e4548\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z5dt9" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.602322 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wx6fg\" (UniqueName: \"kubernetes.io/projected/b46225f4-dd80-45ae-9ffa-310527d770fc-kube-api-access-wx6fg\") pod \"machine-api-operator-5694c8668f-g2ptk\" (UID: \"b46225f4-dd80-45ae-9ffa-310527d770fc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-g2ptk" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.602348 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd9ae554-93cb-478a-8d60-d701586b31b9-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-fww5f\" (UID: \"fd9ae554-93cb-478a-8d60-d701586b31b9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fww5f" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.602373 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/b46225f4-dd80-45ae-9ffa-310527d770fc-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-g2ptk\" (UID: \"b46225f4-dd80-45ae-9ffa-310527d770fc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-g2ptk" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.602391 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4kr6\" (UniqueName: \"kubernetes.io/projected/5fd90cbe-98a9-450d-b9e8-83bca7304155-kube-api-access-v4kr6\") pod \"cluster-image-registry-operator-dc59b4c8b-z8gz4\" (UID: \"5fd90cbe-98a9-450d-b9e8-83bca7304155\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z8gz4" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.602415 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wkps\" (UniqueName: 
\"kubernetes.io/projected/9c60ce0a-ae39-43c2-a305-3c83391df7cc-kube-api-access-6wkps\") pod \"console-operator-58897d9998-nkmcr\" (UID: \"9c60ce0a-ae39-43c2-a305-3c83391df7cc\") " pod="openshift-console-operator/console-operator-58897d9998-nkmcr" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.602436 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/037eda14-3c3c-4b24-bb18-dea65e3e4548-serving-cert\") pod \"controller-manager-879f6c89f-z5dt9\" (UID: \"037eda14-3c3c-4b24-bb18-dea65e3e4548\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z5dt9" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.602457 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/037eda14-3c3c-4b24-bb18-dea65e3e4548-config\") pod \"controller-manager-879f6c89f-z5dt9\" (UID: \"037eda14-3c3c-4b24-bb18-dea65e3e4548\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z5dt9" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.602473 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7cb3dbbd-43f2-49c7-9729-5acf3f400598-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-x7q7v\" (UID: \"7cb3dbbd-43f2-49c7-9729-5acf3f400598\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-x7q7v" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.602489 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b46225f4-dd80-45ae-9ffa-310527d770fc-config\") pod \"machine-api-operator-5694c8668f-g2ptk\" (UID: \"b46225f4-dd80-45ae-9ffa-310527d770fc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-g2ptk" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.602507 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/7a78c10f-79c4-417b-ae2b-3618d0f6c6cd-available-featuregates\") pod \"openshift-config-operator-7777fb866f-lpxk6\" (UID: \"7a78c10f-79c4-417b-ae2b-3618d0f6c6cd\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lpxk6" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.602526 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtn7p\" (UniqueName: \"kubernetes.io/projected/7cb3dbbd-43f2-49c7-9729-5acf3f400598-kube-api-access-qtn7p\") pod \"cluster-samples-operator-665b6dd947-x7q7v\" (UID: \"7cb3dbbd-43f2-49c7-9729-5acf3f400598\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-x7q7v" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.602557 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd9ae554-93cb-478a-8d60-d701586b31b9-config\") pod \"openshift-apiserver-operator-796bbdcf4f-fww5f\" (UID: \"fd9ae554-93cb-478a-8d60-d701586b31b9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fww5f" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.602579 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/5fd90cbe-98a9-450d-b9e8-83bca7304155-image-registry-operator-tls\") pod 
\"cluster-image-registry-operator-dc59b4c8b-z8gz4\" (UID: \"5fd90cbe-98a9-450d-b9e8-83bca7304155\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z8gz4" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.602599 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c60ce0a-ae39-43c2-a305-3c83391df7cc-config\") pod \"console-operator-58897d9998-nkmcr\" (UID: \"9c60ce0a-ae39-43c2-a305-3c83391df7cc\") " pod="openshift-console-operator/console-operator-58897d9998-nkmcr" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.602629 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5fd90cbe-98a9-450d-b9e8-83bca7304155-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-z8gz4\" (UID: \"5fd90cbe-98a9-450d-b9e8-83bca7304155\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z8gz4" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.602647 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppmnl\" (UniqueName: \"kubernetes.io/projected/8134e970-25c9-4d3e-9cff-48a8b520b0da-kube-api-access-ppmnl\") pod \"downloads-7954f5f757-ksn26\" (UID: \"8134e970-25c9-4d3e-9cff-48a8b520b0da\") " pod="openshift-console/downloads-7954f5f757-ksn26" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.602664 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9c60ce0a-ae39-43c2-a305-3c83391df7cc-trusted-ca\") pod \"console-operator-58897d9998-nkmcr\" (UID: \"9c60ce0a-ae39-43c2-a305-3c83391df7cc\") " pod="openshift-console-operator/console-operator-58897d9998-nkmcr" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.602699 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5fd90cbe-98a9-450d-b9e8-83bca7304155-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-z8gz4\" (UID: \"5fd90cbe-98a9-450d-b9e8-83bca7304155\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z8gz4" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.603227 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/7a78c10f-79c4-417b-ae2b-3618d0f6c6cd-available-featuregates\") pod \"openshift-config-operator-7777fb866f-lpxk6\" (UID: \"7a78c10f-79c4-417b-ae2b-3618d0f6c6cd\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lpxk6" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.603470 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/037eda14-3c3c-4b24-bb18-dea65e3e4548-client-ca\") pod \"controller-manager-879f6c89f-z5dt9\" (UID: \"037eda14-3c3c-4b24-bb18-dea65e3e4548\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z5dt9" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.603520 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/037eda14-3c3c-4b24-bb18-dea65e3e4548-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-z5dt9\" (UID: \"037eda14-3c3c-4b24-bb18-dea65e3e4548\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z5dt9" Jan 22 13:46:40 crc 
kubenswrapper[4743]: I0122 13:46:40.603907 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c60ce0a-ae39-43c2-a305-3c83391df7cc-config\") pod \"console-operator-58897d9998-nkmcr\" (UID: \"9c60ce0a-ae39-43c2-a305-3c83391df7cc\") " pod="openshift-console-operator/console-operator-58897d9998-nkmcr" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.604420 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b46225f4-dd80-45ae-9ffa-310527d770fc-config\") pod \"machine-api-operator-5694c8668f-g2ptk\" (UID: \"b46225f4-dd80-45ae-9ffa-310527d770fc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-g2ptk" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.604662 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9c60ce0a-ae39-43c2-a305-3c83391df7cc-trusted-ca\") pod \"console-operator-58897d9998-nkmcr\" (UID: \"9c60ce0a-ae39-43c2-a305-3c83391df7cc\") " pod="openshift-console-operator/console-operator-58897d9998-nkmcr" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.604672 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd9ae554-93cb-478a-8d60-d701586b31b9-config\") pod \"openshift-apiserver-operator-796bbdcf4f-fww5f\" (UID: \"fd9ae554-93cb-478a-8d60-d701586b31b9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fww5f" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.602719 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b46225f4-dd80-45ae-9ffa-310527d770fc-images\") pod \"machine-api-operator-5694c8668f-g2ptk\" (UID: \"b46225f4-dd80-45ae-9ffa-310527d770fc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-g2ptk" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.605020 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/037eda14-3c3c-4b24-bb18-dea65e3e4548-config\") pod \"controller-manager-879f6c89f-z5dt9\" (UID: \"037eda14-3c3c-4b24-bb18-dea65e3e4548\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z5dt9" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.605048 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a78c10f-79c4-417b-ae2b-3618d0f6c6cd-serving-cert\") pod \"openshift-config-operator-7777fb866f-lpxk6\" (UID: \"7a78c10f-79c4-417b-ae2b-3618d0f6c6cd\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lpxk6" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.605109 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5fd90cbe-98a9-450d-b9e8-83bca7304155-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-z8gz4\" (UID: \"5fd90cbe-98a9-450d-b9e8-83bca7304155\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z8gz4" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.605286 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b46225f4-dd80-45ae-9ffa-310527d770fc-images\") pod \"machine-api-operator-5694c8668f-g2ptk\" (UID: 
\"b46225f4-dd80-45ae-9ffa-310527d770fc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-g2ptk" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.606507 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c60ce0a-ae39-43c2-a305-3c83391df7cc-serving-cert\") pod \"console-operator-58897d9998-nkmcr\" (UID: \"9c60ce0a-ae39-43c2-a305-3c83391df7cc\") " pod="openshift-console-operator/console-operator-58897d9998-nkmcr" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.607459 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/b46225f4-dd80-45ae-9ffa-310527d770fc-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-g2ptk\" (UID: \"b46225f4-dd80-45ae-9ffa-310527d770fc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-g2ptk" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.608098 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/5fd90cbe-98a9-450d-b9e8-83bca7304155-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-z8gz4\" (UID: \"5fd90cbe-98a9-450d-b9e8-83bca7304155\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z8gz4" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.608103 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/037eda14-3c3c-4b24-bb18-dea65e3e4548-serving-cert\") pod \"controller-manager-879f6c89f-z5dt9\" (UID: \"037eda14-3c3c-4b24-bb18-dea65e3e4548\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z5dt9" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.608295 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a78c10f-79c4-417b-ae2b-3618d0f6c6cd-serving-cert\") pod \"openshift-config-operator-7777fb866f-lpxk6\" (UID: \"7a78c10f-79c4-417b-ae2b-3618d0f6c6cd\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lpxk6" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.608863 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7cb3dbbd-43f2-49c7-9729-5acf3f400598-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-x7q7v\" (UID: \"7cb3dbbd-43f2-49c7-9729-5acf3f400598\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-x7q7v" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.610145 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd9ae554-93cb-478a-8d60-d701586b31b9-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-fww5f\" (UID: \"fd9ae554-93cb-478a-8d60-d701586b31b9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fww5f" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.623564 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.643200 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.662207 4743 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.682603 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.702818 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.723202 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.743388 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.746236 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sljgz" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.762602 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.782543 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.803463 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.823827 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.842385 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.863188 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.883658 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.911378 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.923349 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.943034 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.962150 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Jan 22 13:46:40 crc kubenswrapper[4743]: I0122 13:46:40.982345 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.003043 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.022018 4743 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.042509 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.063507 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.082601 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.103218 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.123610 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.143219 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.163769 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.183390 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.203646 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.224075 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.243590 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.264380 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.283168 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.303769 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.322600 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.343700 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.363474 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jan 22 13:46:41 
crc kubenswrapper[4743]: I0122 13:46:41.384013 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.403507 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.421190 4743 request.go:700] Waited for 1.006632466s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-api/secrets?fieldSelector=metadata.name%3Dcontrol-plane-machine-set-operator-tls&limit=500&resourceVersion=0 Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.422863 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.443740 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.463768 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.483585 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.503859 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.523738 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.541749 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.583002 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.603465 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.617216 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8be2cd1-1f91-434d-987e-96c980d05f50-config\") pod \"authentication-operator-69f744f599-bghq9\" (UID: \"c8be2cd1-1f91-434d-987e-96c980d05f50\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bghq9" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.617339 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-hqfjq\" (UID: \"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52\") " pod="openshift-authentication/oauth-openshift-558db77b4-hqfjq" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.617452 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/55ae1d5b-f700-4615-83ff-78263c4539d8-config\") pod \"machine-approver-56656f9798-c65ml\" (UID: \"55ae1d5b-f700-4615-83ff-78263c4539d8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c65ml" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.617536 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/771e4a20-0bdf-4115-aadf-f46c3adfa53d-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-tdlw6\" (UID: \"771e4a20-0bdf-4115-aadf-f46c3adfa53d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tdlw6" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.617614 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/771e4a20-0bdf-4115-aadf-f46c3adfa53d-encryption-config\") pod \"apiserver-7bbb656c7d-tdlw6\" (UID: \"771e4a20-0bdf-4115-aadf-f46c3adfa53d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tdlw6" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.617646 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10cbe2ea-3b69-40a7-88a4-86e1f57dbab0-config\") pod \"route-controller-manager-6576b87f9c-6j5s8\" (UID: \"10cbe2ea-3b69-40a7-88a4-86e1f57dbab0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6j5s8" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.617700 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.617723 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9da7e577-175f-451e-8ddc-fe17a70b2d2c-config\") pod \"apiserver-76f77b778f-tff5x\" (UID: \"9da7e577-175f-451e-8ddc-fe17a70b2d2c\") " pod="openshift-apiserver/apiserver-76f77b778f-tff5x" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.617739 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9da7e577-175f-451e-8ddc-fe17a70b2d2c-encryption-config\") pod \"apiserver-76f77b778f-tff5x\" (UID: \"9da7e577-175f-451e-8ddc-fe17a70b2d2c\") " pod="openshift-apiserver/apiserver-76f77b778f-tff5x" Jan 22 13:46:41 crc kubenswrapper[4743]: E0122 13:46:41.618129 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 13:46:42.118114883 +0000 UTC m=+38.673158046 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-94xwn" (UID: "7b22abd3-ecba-46ba-a310-99000f911356") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.618406 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-hqfjq\" (UID: \"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52\") " pod="openshift-authentication/oauth-openshift-558db77b4-hqfjq" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.618466 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7b22abd3-ecba-46ba-a310-99000f911356-trusted-ca\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.618490 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/10cbe2ea-3b69-40a7-88a4-86e1f57dbab0-client-ca\") pod \"route-controller-manager-6576b87f9c-6j5s8\" (UID: \"10cbe2ea-3b69-40a7-88a4-86e1f57dbab0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6j5s8" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.618519 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/55ae1d5b-f700-4615-83ff-78263c4539d8-auth-proxy-config\") pod \"machine-approver-56656f9798-c65ml\" (UID: \"55ae1d5b-f700-4615-83ff-78263c4539d8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c65ml" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.618548 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmnql\" (UniqueName: \"kubernetes.io/projected/7b22abd3-ecba-46ba-a310-99000f911356-kube-api-access-cmnql\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.618571 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/9da7e577-175f-451e-8ddc-fe17a70b2d2c-etcd-serving-ca\") pod \"apiserver-76f77b778f-tff5x\" (UID: \"9da7e577-175f-451e-8ddc-fe17a70b2d2c\") " pod="openshift-apiserver/apiserver-76f77b778f-tff5x" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.618598 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9da7e577-175f-451e-8ddc-fe17a70b2d2c-trusted-ca-bundle\") pod \"apiserver-76f77b778f-tff5x\" (UID: \"9da7e577-175f-451e-8ddc-fe17a70b2d2c\") " pod="openshift-apiserver/apiserver-76f77b778f-tff5x" Jan 22 13:46:41 crc kubenswrapper[4743]: 
I0122 13:46:41.618627 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c8be2cd1-1f91-434d-987e-96c980d05f50-serving-cert\") pod \"authentication-operator-69f744f599-bghq9\" (UID: \"c8be2cd1-1f91-434d-987e-96c980d05f50\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bghq9" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.618645 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-audit-dir\") pod \"oauth-openshift-558db77b4-hqfjq\" (UID: \"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52\") " pod="openshift-authentication/oauth-openshift-558db77b4-hqfjq" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.618664 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-hqfjq\" (UID: \"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52\") " pod="openshift-authentication/oauth-openshift-558db77b4-hqfjq" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.618681 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-hqfjq\" (UID: \"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52\") " pod="openshift-authentication/oauth-openshift-558db77b4-hqfjq" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.618700 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-hqfjq\" (UID: \"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52\") " pod="openshift-authentication/oauth-openshift-558db77b4-hqfjq" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.618743 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58p9q\" (UniqueName: \"kubernetes.io/projected/55ae1d5b-f700-4615-83ff-78263c4539d8-kube-api-access-58p9q\") pod \"machine-approver-56656f9798-c65ml\" (UID: \"55ae1d5b-f700-4615-83ff-78263c4539d8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c65ml" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.618912 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9da7e577-175f-451e-8ddc-fe17a70b2d2c-etcd-client\") pod \"apiserver-76f77b778f-tff5x\" (UID: \"9da7e577-175f-451e-8ddc-fe17a70b2d2c\") " pod="openshift-apiserver/apiserver-76f77b778f-tff5x" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.618997 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-564jp\" (UniqueName: \"kubernetes.io/projected/771e4a20-0bdf-4115-aadf-f46c3adfa53d-kube-api-access-564jp\") pod \"apiserver-7bbb656c7d-tdlw6\" (UID: \"771e4a20-0bdf-4115-aadf-f46c3adfa53d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tdlw6" Jan 22 13:46:41 crc 
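The E0122 entries from nestedpendingoperations.go:348 above explain why the image-registry pod's PVC cannot be staged yet: MountVolume.MountDevice for pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 fails because kubevirt.io.hostpath-provisioner is not yet in the kubelet's list of registered CSI drivers at this point in startup, so the operation is parked and retried after the logged durationBeforeRetry of 500ms. Below is a rough sketch of that retry-with-backoff pattern; it is an illustration only, not the kubelet's actual nestedpendingoperations code, and the doubling factor and attempt at which the driver "registers" are assumptions.

```go
// Illustrative retry loop for a mount that fails until the CSI driver registers.
package main

import (
	"fmt"
	"time"

	"k8s.io/apimachinery/pkg/util/wait"
)

func main() {
	// Stand-in for the kubelet's CSI driver registry; the hostpath provisioner
	// only appears after its node plugin registers with the kubelet.
	registered := false
	attempts := 0

	backoff := wait.Backoff{
		Duration: 500 * time.Millisecond, // matches the logged durationBeforeRetry
		Factor:   2.0,                    // assumed: grow the delay after each failure
		Steps:    8,
	}
	err := wait.ExponentialBackoff(backoff, func() (bool, error) {
		attempts++
		if attempts == 3 {
			registered = true // pretend the driver plugin has registered by now
		}
		if !registered {
			fmt.Println("MountDevice failed: driver kubevirt.io.hostpath-provisioner not registered, backing off")
			return false, nil // not done and not fatal: retry after the next delay
		}
		return true, nil
	})
	if err != nil {
		fmt.Println("gave up:", err)
		return
	}
	fmt.Printf("MountDevice succeeded after %d attempts\n", attempts)
}
```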
kubenswrapper[4743]: I0122 13:46:41.619063 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7b22abd3-ecba-46ba-a310-99000f911356-registry-certificates\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.619116 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/771e4a20-0bdf-4115-aadf-f46c3adfa53d-audit-policies\") pod \"apiserver-7bbb656c7d-tdlw6\" (UID: \"771e4a20-0bdf-4115-aadf-f46c3adfa53d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tdlw6" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.619191 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-audit-policies\") pod \"oauth-openshift-558db77b4-hqfjq\" (UID: \"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52\") " pod="openshift-authentication/oauth-openshift-558db77b4-hqfjq" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.619228 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7b22abd3-ecba-46ba-a310-99000f911356-installation-pull-secrets\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.619255 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-hqfjq\" (UID: \"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52\") " pod="openshift-authentication/oauth-openshift-558db77b4-hqfjq" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.619308 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8be2cd1-1f91-434d-987e-96c980d05f50-service-ca-bundle\") pod \"authentication-operator-69f744f599-bghq9\" (UID: \"c8be2cd1-1f91-434d-987e-96c980d05f50\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bghq9" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.619351 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9c6z\" (UniqueName: \"kubernetes.io/projected/c8be2cd1-1f91-434d-987e-96c980d05f50-kube-api-access-b9c6z\") pod \"authentication-operator-69f744f599-bghq9\" (UID: \"c8be2cd1-1f91-434d-987e-96c980d05f50\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bghq9" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.619410 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-hqfjq\" (UID: \"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-hqfjq" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.619524 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s488v\" (UniqueName: \"kubernetes.io/projected/9da7e577-175f-451e-8ddc-fe17a70b2d2c-kube-api-access-s488v\") pod \"apiserver-76f77b778f-tff5x\" (UID: \"9da7e577-175f-451e-8ddc-fe17a70b2d2c\") " pod="openshift-apiserver/apiserver-76f77b778f-tff5x" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.619571 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/9da7e577-175f-451e-8ddc-fe17a70b2d2c-image-import-ca\") pod \"apiserver-76f77b778f-tff5x\" (UID: \"9da7e577-175f-451e-8ddc-fe17a70b2d2c\") " pod="openshift-apiserver/apiserver-76f77b778f-tff5x" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.619727 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkrnb\" (UniqueName: \"kubernetes.io/projected/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-kube-api-access-mkrnb\") pod \"oauth-openshift-558db77b4-hqfjq\" (UID: \"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52\") " pod="openshift-authentication/oauth-openshift-558db77b4-hqfjq" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.619806 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7b22abd3-ecba-46ba-a310-99000f911356-registry-tls\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.619835 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/55ae1d5b-f700-4615-83ff-78263c4539d8-machine-approver-tls\") pod \"machine-approver-56656f9798-c65ml\" (UID: \"55ae1d5b-f700-4615-83ff-78263c4539d8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c65ml" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.619889 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/771e4a20-0bdf-4115-aadf-f46c3adfa53d-serving-cert\") pod \"apiserver-7bbb656c7d-tdlw6\" (UID: \"771e4a20-0bdf-4115-aadf-f46c3adfa53d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tdlw6" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.619939 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7b22abd3-ecba-46ba-a310-99000f911356-bound-sa-token\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.619975 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/9da7e577-175f-451e-8ddc-fe17a70b2d2c-node-pullsecrets\") pod \"apiserver-76f77b778f-tff5x\" (UID: \"9da7e577-175f-451e-8ddc-fe17a70b2d2c\") " pod="openshift-apiserver/apiserver-76f77b778f-tff5x" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.620012 4743 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9da7e577-175f-451e-8ddc-fe17a70b2d2c-serving-cert\") pod \"apiserver-76f77b778f-tff5x\" (UID: \"9da7e577-175f-451e-8ddc-fe17a70b2d2c\") " pod="openshift-apiserver/apiserver-76f77b778f-tff5x" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.620069 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/771e4a20-0bdf-4115-aadf-f46c3adfa53d-audit-dir\") pod \"apiserver-7bbb656c7d-tdlw6\" (UID: \"771e4a20-0bdf-4115-aadf-f46c3adfa53d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tdlw6" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.620103 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7b22abd3-ecba-46ba-a310-99000f911356-ca-trust-extracted\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.620135 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/771e4a20-0bdf-4115-aadf-f46c3adfa53d-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-tdlw6\" (UID: \"771e4a20-0bdf-4115-aadf-f46c3adfa53d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tdlw6" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.620178 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/9da7e577-175f-451e-8ddc-fe17a70b2d2c-audit\") pod \"apiserver-76f77b778f-tff5x\" (UID: \"9da7e577-175f-451e-8ddc-fe17a70b2d2c\") " pod="openshift-apiserver/apiserver-76f77b778f-tff5x" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.620211 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgqlg\" (UniqueName: \"kubernetes.io/projected/10cbe2ea-3b69-40a7-88a4-86e1f57dbab0-kube-api-access-mgqlg\") pod \"route-controller-manager-6576b87f9c-6j5s8\" (UID: \"10cbe2ea-3b69-40a7-88a4-86e1f57dbab0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6j5s8" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.620306 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-hqfjq\" (UID: \"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52\") " pod="openshift-authentication/oauth-openshift-558db77b4-hqfjq" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.620340 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-hqfjq\" (UID: \"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52\") " pod="openshift-authentication/oauth-openshift-558db77b4-hqfjq" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.620373 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10cbe2ea-3b69-40a7-88a4-86e1f57dbab0-serving-cert\") pod \"route-controller-manager-6576b87f9c-6j5s8\" (UID: \"10cbe2ea-3b69-40a7-88a4-86e1f57dbab0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6j5s8" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.620453 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/771e4a20-0bdf-4115-aadf-f46c3adfa53d-etcd-client\") pod \"apiserver-7bbb656c7d-tdlw6\" (UID: \"771e4a20-0bdf-4115-aadf-f46c3adfa53d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tdlw6" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.620528 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-hqfjq\" (UID: \"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52\") " pod="openshift-authentication/oauth-openshift-558db77b4-hqfjq" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.620549 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8be2cd1-1f91-434d-987e-96c980d05f50-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-bghq9\" (UID: \"c8be2cd1-1f91-434d-987e-96c980d05f50\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bghq9" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.620608 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-hqfjq\" (UID: \"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52\") " pod="openshift-authentication/oauth-openshift-558db77b4-hqfjq" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.620628 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9da7e577-175f-451e-8ddc-fe17a70b2d2c-audit-dir\") pod \"apiserver-76f77b778f-tff5x\" (UID: \"9da7e577-175f-451e-8ddc-fe17a70b2d2c\") " pod="openshift-apiserver/apiserver-76f77b778f-tff5x" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.623251 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.643258 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.663290 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.684411 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.702676 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Jan 22 13:46:41 crc 
kubenswrapper[4743]: I0122 13:46:41.721553 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 13:46:41 crc kubenswrapper[4743]: E0122 13:46:41.721766 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 13:46:42.221733894 +0000 UTC m=+38.776777067 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.721973 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/771e4a20-0bdf-4115-aadf-f46c3adfa53d-etcd-client\") pod \"apiserver-7bbb656c7d-tdlw6\" (UID: \"771e4a20-0bdf-4115-aadf-f46c3adfa53d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tdlw6" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.722036 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dscx7\" (UniqueName: \"kubernetes.io/projected/5ae37707-9ec5-4019-b199-8d413fefc824-kube-api-access-dscx7\") pod \"package-server-manager-789f6589d5-k57vs\" (UID: \"5ae37707-9ec5-4019-b199-8d413fefc824\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-k57vs" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.722072 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/382bfe09-6fc6-4cf4-96ba-4861716caf3d-srv-cert\") pod \"catalog-operator-68c6474976-c9kq2\" (UID: \"382bfe09-6fc6-4cf4-96ba-4861716caf3d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c9kq2" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.722125 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8be2cd1-1f91-434d-987e-96c980d05f50-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-bghq9\" (UID: \"c8be2cd1-1f91-434d-987e-96c980d05f50\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bghq9" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.722164 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c973e8f6-4b20-40b2-ad5d-b2d95cc8354f-config\") pod \"kube-controller-manager-operator-78b949d7b-ljzlp\" (UID: \"c973e8f6-4b20-40b2-ad5d-b2d95cc8354f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ljzlp" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.722199 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d7bdfa8a-4ff0-40c2-87d4-78670816efaa-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kvlk2\" (UID: \"d7bdfa8a-4ff0-40c2-87d4-78670816efaa\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kvlk2" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.722271 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/2305f385-0c21-405a-881e-e55438dae23f-signing-cabundle\") pod \"service-ca-9c57cc56f-88brd\" (UID: \"2305f385-0c21-405a-881e-e55438dae23f\") " pod="openshift-service-ca/service-ca-9c57cc56f-88brd" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.722314 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r48p2\" (UniqueName: \"kubernetes.io/projected/9d181b51-5c53-4bee-8ebd-c2414d1e9394-kube-api-access-r48p2\") pod \"migrator-59844c95c7-lxmrq\" (UID: \"9d181b51-5c53-4bee-8ebd-c2414d1e9394\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lxmrq" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.722346 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/0d3db073-fbeb-448b-9df8-20eeff19ef5b-certs\") pod \"machine-config-server-hhf5x\" (UID: \"0d3db073-fbeb-448b-9df8-20eeff19ef5b\") " pod="openshift-machine-config-operator/machine-config-server-hhf5x" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.722383 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dfkc\" (UniqueName: \"kubernetes.io/projected/a68f253c-e45c-425a-9cdd-3e216ac4b87b-kube-api-access-7dfkc\") pod \"machine-config-operator-74547568cd-4qhbk\" (UID: \"a68f253c-e45c-425a-9cdd-3e216ac4b87b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4qhbk" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.722559 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8be2cd1-1f91-434d-987e-96c980d05f50-config\") pod \"authentication-operator-69f744f599-bghq9\" (UID: \"c8be2cd1-1f91-434d-987e-96c980d05f50\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bghq9" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.722622 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/0d3db073-fbeb-448b-9df8-20eeff19ef5b-node-bootstrap-token\") pod \"machine-config-server-hhf5x\" (UID: \"0d3db073-fbeb-448b-9df8-20eeff19ef5b\") " pod="openshift-machine-config-operator/machine-config-server-hhf5x" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.722689 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-hqfjq\" (UID: \"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52\") " pod="openshift-authentication/oauth-openshift-558db77b4-hqfjq" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.722748 4743 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55ae1d5b-f700-4615-83ff-78263c4539d8-config\") pod \"machine-approver-56656f9798-c65ml\" (UID: \"55ae1d5b-f700-4615-83ff-78263c4539d8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c65ml" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.722847 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/771e4a20-0bdf-4115-aadf-f46c3adfa53d-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-tdlw6\" (UID: \"771e4a20-0bdf-4115-aadf-f46c3adfa53d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tdlw6" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.723584 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/771e4a20-0bdf-4115-aadf-f46c3adfa53d-encryption-config\") pod \"apiserver-7bbb656c7d-tdlw6\" (UID: \"771e4a20-0bdf-4115-aadf-f46c3adfa53d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tdlw6" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.723854 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10cbe2ea-3b69-40a7-88a4-86e1f57dbab0-config\") pod \"route-controller-manager-6576b87f9c-6j5s8\" (UID: \"10cbe2ea-3b69-40a7-88a4-86e1f57dbab0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6j5s8" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.724028 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7b22abd3-ecba-46ba-a310-99000f911356-trusted-ca\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.724123 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9da7e577-175f-451e-8ddc-fe17a70b2d2c-encryption-config\") pod \"apiserver-76f77b778f-tff5x\" (UID: \"9da7e577-175f-451e-8ddc-fe17a70b2d2c\") " pod="openshift-apiserver/apiserver-76f77b778f-tff5x" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.724070 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.724187 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7bdfa8a-4ff0-40c2-87d4-78670816efaa-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kvlk2\" (UID: \"d7bdfa8a-4ff0-40c2-87d4-78670816efaa\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kvlk2" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.724249 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4dc1117-9346-474f-aa1f-43f390e34d2c-cert\") pod \"ingress-canary-h26n4\" (UID: \"b4dc1117-9346-474f-aa1f-43f390e34d2c\") " pod="openshift-ingress-canary/ingress-canary-h26n4" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.724315 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/10cbe2ea-3b69-40a7-88a4-86e1f57dbab0-client-ca\") pod \"route-controller-manager-6576b87f9c-6j5s8\" (UID: \"10cbe2ea-3b69-40a7-88a4-86e1f57dbab0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6j5s8" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.724363 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/a4af8e42-cbb1-4fe9-b687-bb6456785f6e-etcd-service-ca\") pod \"etcd-operator-b45778765-9lmv4\" (UID: \"a4af8e42-cbb1-4fe9-b687-bb6456785f6e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9lmv4" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.724421 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/55ae1d5b-f700-4615-83ff-78263c4539d8-auth-proxy-config\") pod \"machine-approver-56656f9798-c65ml\" (UID: \"55ae1d5b-f700-4615-83ff-78263c4539d8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c65ml" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.724477 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jr4tf\" (UniqueName: \"kubernetes.io/projected/0d3db073-fbeb-448b-9df8-20eeff19ef5b-kube-api-access-jr4tf\") pod \"machine-config-server-hhf5x\" (UID: \"0d3db073-fbeb-448b-9df8-20eeff19ef5b\") " pod="openshift-machine-config-operator/machine-config-server-hhf5x" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.727333 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/771e4a20-0bdf-4115-aadf-f46c3adfa53d-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-tdlw6\" (UID: \"771e4a20-0bdf-4115-aadf-f46c3adfa53d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tdlw6" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.728235 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8be2cd1-1f91-434d-987e-96c980d05f50-config\") pod \"authentication-operator-69f744f599-bghq9\" (UID: \"c8be2cd1-1f91-434d-987e-96c980d05f50\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bghq9" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.728564 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-hqfjq\" (UID: \"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52\") " pod="openshift-authentication/oauth-openshift-558db77b4-hqfjq" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.728562 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/771e4a20-0bdf-4115-aadf-f46c3adfa53d-etcd-client\") pod \"apiserver-7bbb656c7d-tdlw6\" (UID: \"771e4a20-0bdf-4115-aadf-f46c3adfa53d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tdlw6" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.728650 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9da7e577-175f-451e-8ddc-fe17a70b2d2c-encryption-config\") pod \"apiserver-76f77b778f-tff5x\" (UID: \"9da7e577-175f-451e-8ddc-fe17a70b2d2c\") " 
pod="openshift-apiserver/apiserver-76f77b778f-tff5x" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.728956 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-hqfjq\" (UID: \"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52\") " pod="openshift-authentication/oauth-openshift-558db77b4-hqfjq" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.729037 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f6504ab-14a9-4c81-b8ae-556b648168db-service-ca-bundle\") pod \"router-default-5444994796-s9lg4\" (UID: \"2f6504ab-14a9-4c81-b8ae-556b648168db\") " pod="openshift-ingress/router-default-5444994796-s9lg4" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.729077 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10cbe2ea-3b69-40a7-88a4-86e1f57dbab0-config\") pod \"route-controller-manager-6576b87f9c-6j5s8\" (UID: \"10cbe2ea-3b69-40a7-88a4-86e1f57dbab0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6j5s8" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.729102 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-hqfjq\" (UID: \"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52\") " pod="openshift-authentication/oauth-openshift-558db77b4-hqfjq" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.729139 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/10cbe2ea-3b69-40a7-88a4-86e1f57dbab0-client-ca\") pod \"route-controller-manager-6576b87f9c-6j5s8\" (UID: \"10cbe2ea-3b69-40a7-88a4-86e1f57dbab0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6j5s8" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.729231 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgdkk\" (UniqueName: \"kubernetes.io/projected/ae32dda8-ea09-47e9-ba96-702d8f0747ef-kube-api-access-hgdkk\") pod \"olm-operator-6b444d44fb-pgxdf\" (UID: \"ae32dda8-ea09-47e9-ba96-702d8f0747ef\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pgxdf" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.729320 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58p9q\" (UniqueName: \"kubernetes.io/projected/55ae1d5b-f700-4615-83ff-78263c4539d8-kube-api-access-58p9q\") pod \"machine-approver-56656f9798-c65ml\" (UID: \"55ae1d5b-f700-4615-83ff-78263c4539d8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c65ml" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.729373 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55ae1d5b-f700-4615-83ff-78263c4539d8-config\") pod \"machine-approver-56656f9798-c65ml\" (UID: \"55ae1d5b-f700-4615-83ff-78263c4539d8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c65ml" Jan 22 13:46:41 crc 
kubenswrapper[4743]: I0122 13:46:41.729441 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9da7e577-175f-451e-8ddc-fe17a70b2d2c-etcd-client\") pod \"apiserver-76f77b778f-tff5x\" (UID: \"9da7e577-175f-451e-8ddc-fe17a70b2d2c\") " pod="openshift-apiserver/apiserver-76f77b778f-tff5x" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.729507 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a61563b-eef9-4282-b1a1-db7e128ef50b-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-zcnlm\" (UID: \"1a61563b-eef9-4282-b1a1-db7e128ef50b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zcnlm" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.729594 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dfeb2496-20c0-4ec2-b289-bbc8bd8aa531-serving-cert\") pod \"service-ca-operator-777779d784-nbqnt\" (UID: \"dfeb2496-20c0-4ec2-b289-bbc8bd8aa531\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nbqnt" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.729702 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-audit-policies\") pod \"oauth-openshift-558db77b4-hqfjq\" (UID: \"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52\") " pod="openshift-authentication/oauth-openshift-558db77b4-hqfjq" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.729759 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v997z\" (UniqueName: \"kubernetes.io/projected/2305f385-0c21-405a-881e-e55438dae23f-kube-api-access-v997z\") pod \"service-ca-9c57cc56f-88brd\" (UID: \"2305f385-0c21-405a-881e-e55438dae23f\") " pod="openshift-service-ca/service-ca-9c57cc56f-88brd" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.729861 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/819be2f9-96db-4fda-9460-658323fcc772-plugins-dir\") pod \"csi-hostpathplugin-nf29x\" (UID: \"819be2f9-96db-4fda-9460-658323fcc772\") " pod="hostpath-provisioner/csi-hostpathplugin-nf29x" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.729947 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7b22abd3-ecba-46ba-a310-99000f911356-installation-pull-secrets\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.730010 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/645ac481-c9c8-4d4a-a7e4-2bc78dda109d-metrics-tls\") pod \"ingress-operator-5b745b69d9-5kvkh\" (UID: \"645ac481-c9c8-4d4a-a7e4-2bc78dda109d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5kvkh" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.730055 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/55ae1d5b-f700-4615-83ff-78263c4539d8-auth-proxy-config\") pod \"machine-approver-56656f9798-c65ml\" (UID: \"55ae1d5b-f700-4615-83ff-78263c4539d8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c65ml" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.730062 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/2f6504ab-14a9-4c81-b8ae-556b648168db-stats-auth\") pod \"router-default-5444994796-s9lg4\" (UID: \"2f6504ab-14a9-4c81-b8ae-556b648168db\") " pod="openshift-ingress/router-default-5444994796-s9lg4" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.730113 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8be2cd1-1f91-434d-987e-96c980d05f50-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-bghq9\" (UID: \"c8be2cd1-1f91-434d-987e-96c980d05f50\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bghq9" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.730187 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/2887aee0-19a7-439f-8f40-ef40970ab796-ready\") pod \"cni-sysctl-allowlist-ds-xrxrn\" (UID: \"2887aee0-19a7-439f-8f40-ef40970ab796\") " pod="openshift-multus/cni-sysctl-allowlist-ds-xrxrn" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.730242 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/590063fd-38d4-4642-82cb-81e1c558ba31-apiservice-cert\") pod \"packageserver-d55dfcdfc-knnk2\" (UID: \"590063fd-38d4-4642-82cb-81e1c558ba31\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-knnk2" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.730337 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01597591-21de-4b1b-a719-c5417a0b5da0-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-jltxm\" (UID: \"01597591-21de-4b1b-a719-c5417a0b5da0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jltxm" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.730402 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79thb\" (UniqueName: \"kubernetes.io/projected/dfeb2496-20c0-4ec2-b289-bbc8bd8aa531-kube-api-access-79thb\") pod \"service-ca-operator-777779d784-nbqnt\" (UID: \"dfeb2496-20c0-4ec2-b289-bbc8bd8aa531\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nbqnt" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.730541 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a68f253c-e45c-425a-9cdd-3e216ac4b87b-images\") pod \"machine-config-operator-74547568cd-4qhbk\" (UID: \"a68f253c-e45c-425a-9cdd-3e216ac4b87b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4qhbk" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.730640 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/819be2f9-96db-4fda-9460-658323fcc772-socket-dir\") pod \"csi-hostpathplugin-nf29x\" (UID: \"819be2f9-96db-4fda-9460-658323fcc772\") " pod="hostpath-provisioner/csi-hostpathplugin-nf29x" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.730688 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfeb2496-20c0-4ec2-b289-bbc8bd8aa531-config\") pod \"service-ca-operator-777779d784-nbqnt\" (UID: \"dfeb2496-20c0-4ec2-b289-bbc8bd8aa531\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nbqnt" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.730772 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s488v\" (UniqueName: \"kubernetes.io/projected/9da7e577-175f-451e-8ddc-fe17a70b2d2c-kube-api-access-s488v\") pod \"apiserver-76f77b778f-tff5x\" (UID: \"9da7e577-175f-451e-8ddc-fe17a70b2d2c\") " pod="openshift-apiserver/apiserver-76f77b778f-tff5x" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.730857 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a68f253c-e45c-425a-9cdd-3e216ac4b87b-proxy-tls\") pod \"machine-config-operator-74547568cd-4qhbk\" (UID: \"a68f253c-e45c-425a-9cdd-3e216ac4b87b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4qhbk" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.730894 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7b22abd3-ecba-46ba-a310-99000f911356-trusted-ca\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.730959 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/9da7e577-175f-451e-8ddc-fe17a70b2d2c-image-import-ca\") pod \"apiserver-76f77b778f-tff5x\" (UID: \"9da7e577-175f-451e-8ddc-fe17a70b2d2c\") " pod="openshift-apiserver/apiserver-76f77b778f-tff5x" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.731019 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkrnb\" (UniqueName: \"kubernetes.io/projected/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-kube-api-access-mkrnb\") pod \"oauth-openshift-558db77b4-hqfjq\" (UID: \"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52\") " pod="openshift-authentication/oauth-openshift-558db77b4-hqfjq" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.731109 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqdgq\" (UniqueName: \"kubernetes.io/projected/645ac481-c9c8-4d4a-a7e4-2bc78dda109d-kube-api-access-hqdgq\") pod \"ingress-operator-5b745b69d9-5kvkh\" (UID: \"645ac481-c9c8-4d4a-a7e4-2bc78dda109d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5kvkh" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.731129 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/771e4a20-0bdf-4115-aadf-f46c3adfa53d-encryption-config\") pod \"apiserver-7bbb656c7d-tdlw6\" (UID: \"771e4a20-0bdf-4115-aadf-f46c3adfa53d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tdlw6" 
Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.731177 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7b22abd3-ecba-46ba-a310-99000f911356-bound-sa-token\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.731231 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/9da7e577-175f-451e-8ddc-fe17a70b2d2c-node-pullsecrets\") pod \"apiserver-76f77b778f-tff5x\" (UID: \"9da7e577-175f-451e-8ddc-fe17a70b2d2c\") " pod="openshift-apiserver/apiserver-76f77b778f-tff5x" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.731288 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/5ae37707-9ec5-4019-b199-8d413fefc824-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-k57vs\" (UID: \"5ae37707-9ec5-4019-b199-8d413fefc824\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-k57vs" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.731356 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9da7e577-175f-451e-8ddc-fe17a70b2d2c-serving-cert\") pod \"apiserver-76f77b778f-tff5x\" (UID: \"9da7e577-175f-451e-8ddc-fe17a70b2d2c\") " pod="openshift-apiserver/apiserver-76f77b778f-tff5x" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.731373 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/9da7e577-175f-451e-8ddc-fe17a70b2d2c-node-pullsecrets\") pod \"apiserver-76f77b778f-tff5x\" (UID: \"9da7e577-175f-451e-8ddc-fe17a70b2d2c\") " pod="openshift-apiserver/apiserver-76f77b778f-tff5x" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.731398 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-audit-policies\") pod \"oauth-openshift-558db77b4-hqfjq\" (UID: \"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52\") " pod="openshift-authentication/oauth-openshift-558db77b4-hqfjq" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.731420 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d7b7fa5f-d879-476a-830b-4775e00999a8-proxy-tls\") pod \"machine-config-controller-84d6567774-bxlcj\" (UID: \"d7b7fa5f-d879-476a-830b-4775e00999a8\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bxlcj" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.731451 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/64bceb92-68bf-42c6-98e6-94c1eea9e122-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-9dz5n\" (UID: \"64bceb92-68bf-42c6-98e6-94c1eea9e122\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-9dz5n" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.731561 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-h5frn\" (UniqueName: \"kubernetes.io/projected/2f6504ab-14a9-4c81-b8ae-556b648168db-kube-api-access-h5frn\") pod \"router-default-5444994796-s9lg4\" (UID: \"2f6504ab-14a9-4c81-b8ae-556b648168db\") " pod="openshift-ingress/router-default-5444994796-s9lg4" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.731626 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kz9mr\" (UniqueName: \"kubernetes.io/projected/a4af8e42-cbb1-4fe9-b687-bb6456785f6e-kube-api-access-kz9mr\") pod \"etcd-operator-b45778765-9lmv4\" (UID: \"a4af8e42-cbb1-4fe9-b687-bb6456785f6e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9lmv4" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.731674 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/645ac481-c9c8-4d4a-a7e4-2bc78dda109d-trusted-ca\") pod \"ingress-operator-5b745b69d9-5kvkh\" (UID: \"645ac481-c9c8-4d4a-a7e4-2bc78dda109d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5kvkh" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.731722 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hccqs\" (UniqueName: \"kubernetes.io/projected/590063fd-38d4-4642-82cb-81e1c558ba31-kube-api-access-hccqs\") pod \"packageserver-d55dfcdfc-knnk2\" (UID: \"590063fd-38d4-4642-82cb-81e1c558ba31\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-knnk2" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.731840 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/771e4a20-0bdf-4115-aadf-f46c3adfa53d-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-tdlw6\" (UID: \"771e4a20-0bdf-4115-aadf-f46c3adfa53d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tdlw6" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.731902 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0addfb32-dd21-4b73-8c78-75d5ac30c014-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-r6d62\" (UID: \"0addfb32-dd21-4b73-8c78-75d5ac30c014\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-r6d62" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.731932 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6d45h\" (UniqueName: \"kubernetes.io/projected/8a293f8d-adc5-4e55-a4a3-f729a768ab0d-kube-api-access-6d45h\") pod \"dns-default-x7rm8\" (UID: \"8a293f8d-adc5-4e55-a4a3-f729a768ab0d\") " pod="openshift-dns/dns-default-x7rm8" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.731956 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/645ac481-c9c8-4d4a-a7e4-2bc78dda109d-bound-sa-token\") pod \"ingress-operator-5b745b69d9-5kvkh\" (UID: \"645ac481-c9c8-4d4a-a7e4-2bc78dda109d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5kvkh" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.732015 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: 
\"kubernetes.io/configmap/9da7e577-175f-451e-8ddc-fe17a70b2d2c-image-import-ca\") pod \"apiserver-76f77b778f-tff5x\" (UID: \"9da7e577-175f-451e-8ddc-fe17a70b2d2c\") " pod="openshift-apiserver/apiserver-76f77b778f-tff5x" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.732086 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7bdfa8a-4ff0-40c2-87d4-78670816efaa-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kvlk2\" (UID: \"d7bdfa8a-4ff0-40c2-87d4-78670816efaa\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kvlk2" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.732157 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngkjj\" (UniqueName: \"kubernetes.io/projected/6f60519c-a85e-483e-ac46-8cde2dbbd166-kube-api-access-ngkjj\") pod \"marketplace-operator-79b997595-fjjf5\" (UID: \"6f60519c-a85e-483e-ac46-8cde2dbbd166\") " pod="openshift-marketplace/marketplace-operator-79b997595-fjjf5" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.732224 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgqlg\" (UniqueName: \"kubernetes.io/projected/10cbe2ea-3b69-40a7-88a4-86e1f57dbab0-kube-api-access-mgqlg\") pod \"route-controller-manager-6576b87f9c-6j5s8\" (UID: \"10cbe2ea-3b69-40a7-88a4-86e1f57dbab0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6j5s8" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.732278 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2887aee0-19a7-439f-8f40-ef40970ab796-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-xrxrn\" (UID: \"2887aee0-19a7-439f-8f40-ef40970ab796\") " pod="openshift-multus/cni-sysctl-allowlist-ds-xrxrn" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.732329 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxb9p\" (UniqueName: \"kubernetes.io/projected/2887aee0-19a7-439f-8f40-ef40970ab796-kube-api-access-lxb9p\") pod \"cni-sysctl-allowlist-ds-xrxrn\" (UID: \"2887aee0-19a7-439f-8f40-ef40970ab796\") " pod="openshift-multus/cni-sysctl-allowlist-ds-xrxrn" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.732383 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wgc5\" (UniqueName: \"kubernetes.io/projected/5e3f78d9-5773-468b-8db7-592277480519-kube-api-access-9wgc5\") pod \"dns-operator-744455d44c-qgc45\" (UID: \"5e3f78d9-5773-468b-8db7-592277480519\") " pod="openshift-dns-operator/dns-operator-744455d44c-qgc45" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.732438 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skrtz\" (UniqueName: \"kubernetes.io/projected/3a5126ab-15b9-4b80-ab92-de1b1af3d4a7-kube-api-access-skrtz\") pod \"control-plane-machine-set-operator-78cbb6b69f-25vhb\" (UID: \"3a5126ab-15b9-4b80-ab92-de1b1af3d4a7\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-25vhb" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.732526 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-hqfjq\" (UID: \"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52\") " pod="openshift-authentication/oauth-openshift-558db77b4-hqfjq" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.732580 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10cbe2ea-3b69-40a7-88a4-86e1f57dbab0-serving-cert\") pod \"route-controller-manager-6576b87f9c-6j5s8\" (UID: \"10cbe2ea-3b69-40a7-88a4-86e1f57dbab0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6j5s8" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.732639 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/2305f385-0c21-405a-881e-e55438dae23f-signing-key\") pod \"service-ca-9c57cc56f-88brd\" (UID: \"2305f385-0c21-405a-881e-e55438dae23f\") " pod="openshift-service-ca/service-ca-9c57cc56f-88brd" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.732684 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01597591-21de-4b1b-a719-c5417a0b5da0-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-jltxm\" (UID: \"01597591-21de-4b1b-a719-c5417a0b5da0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jltxm" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.732728 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/771e4a20-0bdf-4115-aadf-f46c3adfa53d-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-tdlw6\" (UID: \"771e4a20-0bdf-4115-aadf-f46c3adfa53d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tdlw6" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.732758 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-hqfjq\" (UID: \"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52\") " pod="openshift-authentication/oauth-openshift-558db77b4-hqfjq" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.732856 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-hqfjq\" (UID: \"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52\") " pod="openshift-authentication/oauth-openshift-558db77b4-hqfjq" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.732916 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/819be2f9-96db-4fda-9460-658323fcc772-mountpoint-dir\") pod \"csi-hostpathplugin-nf29x\" (UID: \"819be2f9-96db-4fda-9460-658323fcc772\") " pod="hostpath-provisioner/csi-hostpathplugin-nf29x" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.732915 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-hqfjq\" (UID: \"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52\") " pod="openshift-authentication/oauth-openshift-558db77b4-hqfjq" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.732976 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/819be2f9-96db-4fda-9460-658323fcc772-registration-dir\") pod \"csi-hostpathplugin-nf29x\" (UID: \"819be2f9-96db-4fda-9460-658323fcc772\") " pod="hostpath-provisioner/csi-hostpathplugin-nf29x" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.733066 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/819be2f9-96db-4fda-9460-658323fcc772-csi-data-dir\") pod \"csi-hostpathplugin-nf29x\" (UID: \"819be2f9-96db-4fda-9460-658323fcc772\") " pod="hostpath-provisioner/csi-hostpathplugin-nf29x" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.733130 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkpsr\" (UniqueName: \"kubernetes.io/projected/382bfe09-6fc6-4cf4-96ba-4861716caf3d-kube-api-access-tkpsr\") pod \"catalog-operator-68c6474976-c9kq2\" (UID: \"382bfe09-6fc6-4cf4-96ba-4861716caf3d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c9kq2" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.733197 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9da7e577-175f-451e-8ddc-fe17a70b2d2c-audit-dir\") pod \"apiserver-76f77b778f-tff5x\" (UID: \"9da7e577-175f-451e-8ddc-fe17a70b2d2c\") " pod="openshift-apiserver/apiserver-76f77b778f-tff5x" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.733256 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/3a5126ab-15b9-4b80-ab92-de1b1af3d4a7-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-25vhb\" (UID: \"3a5126ab-15b9-4b80-ab92-de1b1af3d4a7\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-25vhb" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.733317 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8a293f8d-adc5-4e55-a4a3-f729a768ab0d-config-volume\") pod \"dns-default-x7rm8\" (UID: \"8a293f8d-adc5-4e55-a4a3-f729a768ab0d\") " pod="openshift-dns/dns-default-x7rm8" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.733352 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9da7e577-175f-451e-8ddc-fe17a70b2d2c-audit-dir\") pod \"apiserver-76f77b778f-tff5x\" (UID: \"9da7e577-175f-451e-8ddc-fe17a70b2d2c\") " pod="openshift-apiserver/apiserver-76f77b778f-tff5x" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.733384 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-94xwn\" (UID: 
\"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.733443 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2f6504ab-14a9-4c81-b8ae-556b648168db-metrics-certs\") pod \"router-default-5444994796-s9lg4\" (UID: \"2f6504ab-14a9-4c81-b8ae-556b648168db\") " pod="openshift-ingress/router-default-5444994796-s9lg4" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.733459 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-hqfjq\" (UID: \"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52\") " pod="openshift-authentication/oauth-openshift-558db77b4-hqfjq" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.733502 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9da7e577-175f-451e-8ddc-fe17a70b2d2c-config\") pod \"apiserver-76f77b778f-tff5x\" (UID: \"9da7e577-175f-451e-8ddc-fe17a70b2d2c\") " pod="openshift-apiserver/apiserver-76f77b778f-tff5x" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.733553 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-hqfjq\" (UID: \"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52\") " pod="openshift-authentication/oauth-openshift-558db77b4-hqfjq" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.733611 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8n89x\" (UniqueName: \"kubernetes.io/projected/64bceb92-68bf-42c6-98e6-94c1eea9e122-kube-api-access-8n89x\") pod \"multus-admission-controller-857f4d67dd-9dz5n\" (UID: \"64bceb92-68bf-42c6-98e6-94c1eea9e122\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-9dz5n" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.733704 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmnql\" (UniqueName: \"kubernetes.io/projected/7b22abd3-ecba-46ba-a310-99000f911356-kube-api-access-cmnql\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.733755 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/9da7e577-175f-451e-8ddc-fe17a70b2d2c-etcd-serving-ca\") pod \"apiserver-76f77b778f-tff5x\" (UID: \"9da7e577-175f-451e-8ddc-fe17a70b2d2c\") " pod="openshift-apiserver/apiserver-76f77b778f-tff5x" Jan 22 13:46:41 crc kubenswrapper[4743]: E0122 13:46:41.733775 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 13:46:42.233759833 +0000 UTC m=+38.788802996 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-94xwn" (UID: "7b22abd3-ecba-46ba-a310-99000f911356") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.733843 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9da7e577-175f-451e-8ddc-fe17a70b2d2c-trusted-ca-bundle\") pod \"apiserver-76f77b778f-tff5x\" (UID: \"9da7e577-175f-451e-8ddc-fe17a70b2d2c\") " pod="openshift-apiserver/apiserver-76f77b778f-tff5x" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.733899 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-audit-dir\") pod \"oauth-openshift-558db77b4-hqfjq\" (UID: \"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52\") " pod="openshift-authentication/oauth-openshift-558db77b4-hqfjq" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.733955 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-hqfjq\" (UID: \"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52\") " pod="openshift-authentication/oauth-openshift-558db77b4-hqfjq" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.734002 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c8be2cd1-1f91-434d-987e-96c980d05f50-serving-cert\") pod \"authentication-operator-69f744f599-bghq9\" (UID: \"c8be2cd1-1f91-434d-987e-96c980d05f50\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bghq9" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.734054 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6f60519c-a85e-483e-ac46-8cde2dbbd166-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-fjjf5\" (UID: \"6f60519c-a85e-483e-ac46-8cde2dbbd166\") " pod="openshift-marketplace/marketplace-operator-79b997595-fjjf5" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.734098 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9da7e577-175f-451e-8ddc-fe17a70b2d2c-etcd-client\") pod \"apiserver-76f77b778f-tff5x\" (UID: \"9da7e577-175f-451e-8ddc-fe17a70b2d2c\") " pod="openshift-apiserver/apiserver-76f77b778f-tff5x" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.734100 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/590063fd-38d4-4642-82cb-81e1c558ba31-webhook-cert\") pod \"packageserver-d55dfcdfc-knnk2\" (UID: \"590063fd-38d4-4642-82cb-81e1c558ba31\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-knnk2" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.734582 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/9da7e577-175f-451e-8ddc-fe17a70b2d2c-serving-cert\") pod \"apiserver-76f77b778f-tff5x\" (UID: \"9da7e577-175f-451e-8ddc-fe17a70b2d2c\") " pod="openshift-apiserver/apiserver-76f77b778f-tff5x" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.734645 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8a293f8d-adc5-4e55-a4a3-f729a768ab0d-metrics-tls\") pod \"dns-default-x7rm8\" (UID: \"8a293f8d-adc5-4e55-a4a3-f729a768ab0d\") " pod="openshift-dns/dns-default-x7rm8" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.734666 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/382bfe09-6fc6-4cf4-96ba-4861716caf3d-profile-collector-cert\") pod \"catalog-operator-68c6474976-c9kq2\" (UID: \"382bfe09-6fc6-4cf4-96ba-4861716caf3d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c9kq2" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.734725 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4af8e42-cbb1-4fe9-b687-bb6456785f6e-config\") pod \"etcd-operator-b45778765-9lmv4\" (UID: \"a4af8e42-cbb1-4fe9-b687-bb6456785f6e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9lmv4" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.734821 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d67e22a2-dc2b-4582-bfe0-7afff25995fb-secret-volume\") pod \"collect-profiles-29484825-rkvnc\" (UID: \"d67e22a2-dc2b-4582-bfe0-7afff25995fb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484825-rkvnc" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.734845 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6f60519c-a85e-483e-ac46-8cde2dbbd166-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-fjjf5\" (UID: \"6f60519c-a85e-483e-ac46-8cde2dbbd166\") " pod="openshift-marketplace/marketplace-operator-79b997595-fjjf5" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.734880 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-564jp\" (UniqueName: \"kubernetes.io/projected/771e4a20-0bdf-4115-aadf-f46c3adfa53d-kube-api-access-564jp\") pod \"apiserver-7bbb656c7d-tdlw6\" (UID: \"771e4a20-0bdf-4115-aadf-f46c3adfa53d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tdlw6" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.734903 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c973e8f6-4b20-40b2-ad5d-b2d95cc8354f-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-ljzlp\" (UID: \"c973e8f6-4b20-40b2-ad5d-b2d95cc8354f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ljzlp" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.734956 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9da7e577-175f-451e-8ddc-fe17a70b2d2c-config\") pod \"apiserver-76f77b778f-tff5x\" (UID: 
\"9da7e577-175f-451e-8ddc-fe17a70b2d2c\") " pod="openshift-apiserver/apiserver-76f77b778f-tff5x" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.735181 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7b22abd3-ecba-46ba-a310-99000f911356-registry-certificates\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.735254 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/9da7e577-175f-451e-8ddc-fe17a70b2d2c-etcd-serving-ca\") pod \"apiserver-76f77b778f-tff5x\" (UID: \"9da7e577-175f-451e-8ddc-fe17a70b2d2c\") " pod="openshift-apiserver/apiserver-76f77b778f-tff5x" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.735272 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/771e4a20-0bdf-4115-aadf-f46c3adfa53d-audit-policies\") pod \"apiserver-7bbb656c7d-tdlw6\" (UID: \"771e4a20-0bdf-4115-aadf-f46c3adfa53d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tdlw6" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.735336 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a4af8e42-cbb1-4fe9-b687-bb6456785f6e-serving-cert\") pod \"etcd-operator-b45778765-9lmv4\" (UID: \"a4af8e42-cbb1-4fe9-b687-bb6456785f6e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9lmv4" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.735489 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/a4af8e42-cbb1-4fe9-b687-bb6456785f6e-etcd-ca\") pod \"etcd-operator-b45778765-9lmv4\" (UID: \"a4af8e42-cbb1-4fe9-b687-bb6456785f6e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9lmv4" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.735609 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-hqfjq\" (UID: \"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52\") " pod="openshift-authentication/oauth-openshift-558db77b4-hqfjq" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.735667 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ae32dda8-ea09-47e9-ba96-702d8f0747ef-profile-collector-cert\") pod \"olm-operator-6b444d44fb-pgxdf\" (UID: \"ae32dda8-ea09-47e9-ba96-702d8f0747ef\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pgxdf" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.736176 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9da7e577-175f-451e-8ddc-fe17a70b2d2c-trusted-ca-bundle\") pod \"apiserver-76f77b778f-tff5x\" (UID: \"9da7e577-175f-451e-8ddc-fe17a70b2d2c\") " pod="openshift-apiserver/apiserver-76f77b778f-tff5x" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.736277 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-hqfjq\" (UID: \"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52\") " pod="openshift-authentication/oauth-openshift-558db77b4-hqfjq" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.736384 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-hqfjq\" (UID: \"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52\") " pod="openshift-authentication/oauth-openshift-558db77b4-hqfjq" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.736997 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7b22abd3-ecba-46ba-a310-99000f911356-installation-pull-secrets\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.737296 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7b22abd3-ecba-46ba-a310-99000f911356-registry-certificates\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.737384 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8be2cd1-1f91-434d-987e-96c980d05f50-service-ca-bundle\") pod \"authentication-operator-69f744f599-bghq9\" (UID: \"c8be2cd1-1f91-434d-987e-96c980d05f50\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bghq9" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.737422 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a61563b-eef9-4282-b1a1-db7e128ef50b-config\") pod \"kube-apiserver-operator-766d6c64bb-zcnlm\" (UID: \"1a61563b-eef9-4282-b1a1-db7e128ef50b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zcnlm" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.737557 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9c6z\" (UniqueName: \"kubernetes.io/projected/c8be2cd1-1f91-434d-987e-96c980d05f50-kube-api-access-b9c6z\") pod \"authentication-operator-69f744f599-bghq9\" (UID: \"c8be2cd1-1f91-434d-987e-96c980d05f50\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bghq9" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.737587 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-hqfjq\" (UID: \"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52\") " pod="openshift-authentication/oauth-openshift-558db77b4-hqfjq" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.737704 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-ndtsb\" (UniqueName: \"kubernetes.io/projected/b4dc1117-9346-474f-aa1f-43f390e34d2c-kube-api-access-ndtsb\") pod \"ingress-canary-h26n4\" (UID: \"b4dc1117-9346-474f-aa1f-43f390e34d2c\") " pod="openshift-ingress-canary/ingress-canary-h26n4" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.737756 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-audit-dir\") pod \"oauth-openshift-558db77b4-hqfjq\" (UID: \"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52\") " pod="openshift-authentication/oauth-openshift-558db77b4-hqfjq" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.737738 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1a61563b-eef9-4282-b1a1-db7e128ef50b-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-zcnlm\" (UID: \"1a61563b-eef9-4282-b1a1-db7e128ef50b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zcnlm" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.737936 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxk5g\" (UniqueName: \"kubernetes.io/projected/d7b7fa5f-d879-476a-830b-4775e00999a8-kube-api-access-gxk5g\") pod \"machine-config-controller-84d6567774-bxlcj\" (UID: \"d7b7fa5f-d879-476a-830b-4775e00999a8\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bxlcj" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.737960 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-hqfjq\" (UID: \"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52\") " pod="openshift-authentication/oauth-openshift-558db77b4-hqfjq" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.738032 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5e3f78d9-5773-468b-8db7-592277480519-metrics-tls\") pod \"dns-operator-744455d44c-qgc45\" (UID: \"5e3f78d9-5773-468b-8db7-592277480519\") " pod="openshift-dns-operator/dns-operator-744455d44c-qgc45" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.738200 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/771e4a20-0bdf-4115-aadf-f46c3adfa53d-audit-policies\") pod \"apiserver-7bbb656c7d-tdlw6\" (UID: \"771e4a20-0bdf-4115-aadf-f46c3adfa53d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tdlw6" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.738227 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7b22abd3-ecba-46ba-a310-99000f911356-registry-tls\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.738319 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/55ae1d5b-f700-4615-83ff-78263c4539d8-machine-approver-tls\") pod \"machine-approver-56656f9798-c65ml\" 
(UID: \"55ae1d5b-f700-4615-83ff-78263c4539d8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c65ml" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.738570 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/771e4a20-0bdf-4115-aadf-f46c3adfa53d-serving-cert\") pod \"apiserver-7bbb656c7d-tdlw6\" (UID: \"771e4a20-0bdf-4115-aadf-f46c3adfa53d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tdlw6" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.738620 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0addfb32-dd21-4b73-8c78-75d5ac30c014-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-r6d62\" (UID: \"0addfb32-dd21-4b73-8c78-75d5ac30c014\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-r6d62" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.738648 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8be2cd1-1f91-434d-987e-96c980d05f50-service-ca-bundle\") pod \"authentication-operator-69f744f599-bghq9\" (UID: \"c8be2cd1-1f91-434d-987e-96c980d05f50\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bghq9" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.738688 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a4af8e42-cbb1-4fe9-b687-bb6456785f6e-etcd-client\") pod \"etcd-operator-b45778765-9lmv4\" (UID: \"a4af8e42-cbb1-4fe9-b687-bb6456785f6e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9lmv4" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.738736 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67jmp\" (UniqueName: \"kubernetes.io/projected/01597591-21de-4b1b-a719-c5417a0b5da0-kube-api-access-67jmp\") pod \"openshift-controller-manager-operator-756b6f6bc6-jltxm\" (UID: \"01597591-21de-4b1b-a719-c5417a0b5da0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jltxm" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.739190 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10cbe2ea-3b69-40a7-88a4-86e1f57dbab0-serving-cert\") pod \"route-controller-manager-6576b87f9c-6j5s8\" (UID: \"10cbe2ea-3b69-40a7-88a4-86e1f57dbab0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6j5s8" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.739205 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgzt8\" (UniqueName: \"kubernetes.io/projected/d67e22a2-dc2b-4582-bfe0-7afff25995fb-kube-api-access-wgzt8\") pod \"collect-profiles-29484825-rkvnc\" (UID: \"d67e22a2-dc2b-4582-bfe0-7afff25995fb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484825-rkvnc" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.739590 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ae32dda8-ea09-47e9-ba96-702d8f0747ef-srv-cert\") pod \"olm-operator-6b444d44fb-pgxdf\" (UID: 
\"ae32dda8-ea09-47e9-ba96-702d8f0747ef\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pgxdf" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.740002 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/771e4a20-0bdf-4115-aadf-f46c3adfa53d-audit-dir\") pod \"apiserver-7bbb656c7d-tdlw6\" (UID: \"771e4a20-0bdf-4115-aadf-f46c3adfa53d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tdlw6" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.740114 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/771e4a20-0bdf-4115-aadf-f46c3adfa53d-audit-dir\") pod \"apiserver-7bbb656c7d-tdlw6\" (UID: \"771e4a20-0bdf-4115-aadf-f46c3adfa53d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tdlw6" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.740244 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wrs6\" (UniqueName: \"kubernetes.io/projected/819be2f9-96db-4fda-9460-658323fcc772-kube-api-access-6wrs6\") pod \"csi-hostpathplugin-nf29x\" (UID: \"819be2f9-96db-4fda-9460-658323fcc772\") " pod="hostpath-provisioner/csi-hostpathplugin-nf29x" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.740309 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2887aee0-19a7-439f-8f40-ef40970ab796-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-xrxrn\" (UID: \"2887aee0-19a7-439f-8f40-ef40970ab796\") " pod="openshift-multus/cni-sysctl-allowlist-ds-xrxrn" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.740342 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c8be2cd1-1f91-434d-987e-96c980d05f50-serving-cert\") pod \"authentication-operator-69f744f599-bghq9\" (UID: \"c8be2cd1-1f91-434d-987e-96c980d05f50\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bghq9" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.740420 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7b22abd3-ecba-46ba-a310-99000f911356-ca-trust-extracted\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.740482 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d7b7fa5f-d879-476a-830b-4775e00999a8-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-bxlcj\" (UID: \"d7b7fa5f-d879-476a-830b-4775e00999a8\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bxlcj" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.740535 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c973e8f6-4b20-40b2-ad5d-b2d95cc8354f-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-ljzlp\" (UID: \"c973e8f6-4b20-40b2-ad5d-b2d95cc8354f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ljzlp" Jan 22 13:46:41 crc 
kubenswrapper[4743]: I0122 13:46:41.740588 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d67e22a2-dc2b-4582-bfe0-7afff25995fb-config-volume\") pod \"collect-profiles-29484825-rkvnc\" (UID: \"d67e22a2-dc2b-4582-bfe0-7afff25995fb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484825-rkvnc" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.740689 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dbpx\" (UniqueName: \"kubernetes.io/projected/0addfb32-dd21-4b73-8c78-75d5ac30c014-kube-api-access-2dbpx\") pod \"kube-storage-version-migrator-operator-b67b599dd-r6d62\" (UID: \"0addfb32-dd21-4b73-8c78-75d5ac30c014\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-r6d62" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.740736 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/590063fd-38d4-4642-82cb-81e1c558ba31-tmpfs\") pod \"packageserver-d55dfcdfc-knnk2\" (UID: \"590063fd-38d4-4642-82cb-81e1c558ba31\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-knnk2" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.740826 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/9da7e577-175f-451e-8ddc-fe17a70b2d2c-audit\") pod \"apiserver-76f77b778f-tff5x\" (UID: \"9da7e577-175f-451e-8ddc-fe17a70b2d2c\") " pod="openshift-apiserver/apiserver-76f77b778f-tff5x" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.740947 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/2f6504ab-14a9-4c81-b8ae-556b648168db-default-certificate\") pod \"router-default-5444994796-s9lg4\" (UID: \"2f6504ab-14a9-4c81-b8ae-556b648168db\") " pod="openshift-ingress/router-default-5444994796-s9lg4" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.741069 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-hqfjq\" (UID: \"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52\") " pod="openshift-authentication/oauth-openshift-558db77b4-hqfjq" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.741117 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a68f253c-e45c-425a-9cdd-3e216ac4b87b-auth-proxy-config\") pod \"machine-config-operator-74547568cd-4qhbk\" (UID: \"a68f253c-e45c-425a-9cdd-3e216ac4b87b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4qhbk" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.741871 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7b22abd3-ecba-46ba-a310-99000f911356-ca-trust-extracted\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.741966 4743 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-hqfjq\" (UID: \"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52\") " pod="openshift-authentication/oauth-openshift-558db77b4-hqfjq" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.741996 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-hqfjq\" (UID: \"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52\") " pod="openshift-authentication/oauth-openshift-558db77b4-hqfjq" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.742348 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-hqfjq\" (UID: \"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52\") " pod="openshift-authentication/oauth-openshift-558db77b4-hqfjq" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.742861 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/9da7e577-175f-451e-8ddc-fe17a70b2d2c-audit\") pod \"apiserver-76f77b778f-tff5x\" (UID: \"9da7e577-175f-451e-8ddc-fe17a70b2d2c\") " pod="openshift-apiserver/apiserver-76f77b778f-tff5x" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.744923 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-hqfjq\" (UID: \"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52\") " pod="openshift-authentication/oauth-openshift-558db77b4-hqfjq" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.745150 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/55ae1d5b-f700-4615-83ff-78263c4539d8-machine-approver-tls\") pod \"machine-approver-56656f9798-c65ml\" (UID: \"55ae1d5b-f700-4615-83ff-78263c4539d8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c65ml" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.745150 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-hqfjq\" (UID: \"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52\") " pod="openshift-authentication/oauth-openshift-558db77b4-hqfjq" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.746177 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.746773 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7b22abd3-ecba-46ba-a310-99000f911356-registry-tls\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.746974 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.746991 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.748783 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/771e4a20-0bdf-4115-aadf-f46c3adfa53d-serving-cert\") pod \"apiserver-7bbb656c7d-tdlw6\" (UID: \"771e4a20-0bdf-4115-aadf-f46c3adfa53d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tdlw6" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.764209 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.803121 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.803121 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2fmd\" (UniqueName: \"kubernetes.io/projected/a11f3169-f731-464a-a7d4-9dea61d28398-kube-api-access-m2fmd\") pod \"console-f9d7485db-ln28w\" (UID: \"a11f3169-f731-464a-a7d4-9dea61d28398\") " pod="openshift-console/console-f9d7485db-ln28w" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.823709 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.841963 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 13:46:41 crc kubenswrapper[4743]: E0122 13:46:41.842226 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 13:46:42.342174335 +0000 UTC m=+38.897217498 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.842299 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/2305f385-0c21-405a-881e-e55438dae23f-signing-key\") pod \"service-ca-9c57cc56f-88brd\" (UID: \"2305f385-0c21-405a-881e-e55438dae23f\") " pod="openshift-service-ca/service-ca-9c57cc56f-88brd" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.842347 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01597591-21de-4b1b-a719-c5417a0b5da0-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-jltxm\" (UID: \"01597591-21de-4b1b-a719-c5417a0b5da0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jltxm" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.842389 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkpsr\" (UniqueName: \"kubernetes.io/projected/382bfe09-6fc6-4cf4-96ba-4861716caf3d-kube-api-access-tkpsr\") pod \"catalog-operator-68c6474976-c9kq2\" (UID: \"382bfe09-6fc6-4cf4-96ba-4861716caf3d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c9kq2" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.842438 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/819be2f9-96db-4fda-9460-658323fcc772-mountpoint-dir\") pod \"csi-hostpathplugin-nf29x\" (UID: \"819be2f9-96db-4fda-9460-658323fcc772\") " pod="hostpath-provisioner/csi-hostpathplugin-nf29x" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.842475 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/819be2f9-96db-4fda-9460-658323fcc772-registration-dir\") pod \"csi-hostpathplugin-nf29x\" (UID: \"819be2f9-96db-4fda-9460-658323fcc772\") " pod="hostpath-provisioner/csi-hostpathplugin-nf29x" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.842545 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/819be2f9-96db-4fda-9460-658323fcc772-csi-data-dir\") pod \"csi-hostpathplugin-nf29x\" (UID: \"819be2f9-96db-4fda-9460-658323fcc772\") " pod="hostpath-provisioner/csi-hostpathplugin-nf29x" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.842594 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/3a5126ab-15b9-4b80-ab92-de1b1af3d4a7-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-25vhb\" (UID: \"3a5126ab-15b9-4b80-ab92-de1b1af3d4a7\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-25vhb" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.842629 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/configmap/8a293f8d-adc5-4e55-a4a3-f729a768ab0d-config-volume\") pod \"dns-default-x7rm8\" (UID: \"8a293f8d-adc5-4e55-a4a3-f729a768ab0d\") " pod="openshift-dns/dns-default-x7rm8" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.842673 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.842689 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/819be2f9-96db-4fda-9460-658323fcc772-mountpoint-dir\") pod \"csi-hostpathplugin-nf29x\" (UID: \"819be2f9-96db-4fda-9460-658323fcc772\") " pod="hostpath-provisioner/csi-hostpathplugin-nf29x" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.842848 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2f6504ab-14a9-4c81-b8ae-556b648168db-metrics-certs\") pod \"router-default-5444994796-s9lg4\" (UID: \"2f6504ab-14a9-4c81-b8ae-556b648168db\") " pod="openshift-ingress/router-default-5444994796-s9lg4" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.842879 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/819be2f9-96db-4fda-9460-658323fcc772-csi-data-dir\") pod \"csi-hostpathplugin-nf29x\" (UID: \"819be2f9-96db-4fda-9460-658323fcc772\") " pod="hostpath-provisioner/csi-hostpathplugin-nf29x" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.842927 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8n89x\" (UniqueName: \"kubernetes.io/projected/64bceb92-68bf-42c6-98e6-94c1eea9e122-kube-api-access-8n89x\") pod \"multus-admission-controller-857f4d67dd-9dz5n\" (UID: \"64bceb92-68bf-42c6-98e6-94c1eea9e122\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-9dz5n" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.843017 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6f60519c-a85e-483e-ac46-8cde2dbbd166-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-fjjf5\" (UID: \"6f60519c-a85e-483e-ac46-8cde2dbbd166\") " pod="openshift-marketplace/marketplace-operator-79b997595-fjjf5" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.843064 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.843072 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/590063fd-38d4-4642-82cb-81e1c558ba31-webhook-cert\") pod \"packageserver-d55dfcdfc-knnk2\" (UID: \"590063fd-38d4-4642-82cb-81e1c558ba31\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-knnk2" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.843105 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/8a293f8d-adc5-4e55-a4a3-f729a768ab0d-metrics-tls\") pod \"dns-default-x7rm8\" (UID: \"8a293f8d-adc5-4e55-a4a3-f729a768ab0d\") " pod="openshift-dns/dns-default-x7rm8" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.843130 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/382bfe09-6fc6-4cf4-96ba-4861716caf3d-profile-collector-cert\") pod \"catalog-operator-68c6474976-c9kq2\" (UID: \"382bfe09-6fc6-4cf4-96ba-4861716caf3d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c9kq2" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.843156 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d67e22a2-dc2b-4582-bfe0-7afff25995fb-secret-volume\") pod \"collect-profiles-29484825-rkvnc\" (UID: \"d67e22a2-dc2b-4582-bfe0-7afff25995fb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484825-rkvnc" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.843181 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6f60519c-a85e-483e-ac46-8cde2dbbd166-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-fjjf5\" (UID: \"6f60519c-a85e-483e-ac46-8cde2dbbd166\") " pod="openshift-marketplace/marketplace-operator-79b997595-fjjf5" Jan 22 13:46:41 crc kubenswrapper[4743]: E0122 13:46:41.843189 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 13:46:42.343174553 +0000 UTC m=+38.898217716 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-94xwn" (UID: "7b22abd3-ecba-46ba-a310-99000f911356") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.843260 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4af8e42-cbb1-4fe9-b687-bb6456785f6e-config\") pod \"etcd-operator-b45778765-9lmv4\" (UID: \"a4af8e42-cbb1-4fe9-b687-bb6456785f6e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9lmv4" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.843286 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/819be2f9-96db-4fda-9460-658323fcc772-registration-dir\") pod \"csi-hostpathplugin-nf29x\" (UID: \"819be2f9-96db-4fda-9460-658323fcc772\") " pod="hostpath-provisioner/csi-hostpathplugin-nf29x" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.843306 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c973e8f6-4b20-40b2-ad5d-b2d95cc8354f-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-ljzlp\" (UID: \"c973e8f6-4b20-40b2-ad5d-b2d95cc8354f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ljzlp" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.843437 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a4af8e42-cbb1-4fe9-b687-bb6456785f6e-serving-cert\") pod \"etcd-operator-b45778765-9lmv4\" (UID: \"a4af8e42-cbb1-4fe9-b687-bb6456785f6e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9lmv4" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.843494 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/a4af8e42-cbb1-4fe9-b687-bb6456785f6e-etcd-ca\") pod \"etcd-operator-b45778765-9lmv4\" (UID: \"a4af8e42-cbb1-4fe9-b687-bb6456785f6e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9lmv4" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.843536 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ae32dda8-ea09-47e9-ba96-702d8f0747ef-profile-collector-cert\") pod \"olm-operator-6b444d44fb-pgxdf\" (UID: \"ae32dda8-ea09-47e9-ba96-702d8f0747ef\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pgxdf" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.843569 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a61563b-eef9-4282-b1a1-db7e128ef50b-config\") pod \"kube-apiserver-operator-766d6c64bb-zcnlm\" (UID: \"1a61563b-eef9-4282-b1a1-db7e128ef50b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zcnlm" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.843602 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndtsb\" (UniqueName: 
\"kubernetes.io/projected/b4dc1117-9346-474f-aa1f-43f390e34d2c-kube-api-access-ndtsb\") pod \"ingress-canary-h26n4\" (UID: \"b4dc1117-9346-474f-aa1f-43f390e34d2c\") " pod="openshift-ingress-canary/ingress-canary-h26n4" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.843624 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1a61563b-eef9-4282-b1a1-db7e128ef50b-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-zcnlm\" (UID: \"1a61563b-eef9-4282-b1a1-db7e128ef50b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zcnlm" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.843651 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxk5g\" (UniqueName: \"kubernetes.io/projected/d7b7fa5f-d879-476a-830b-4775e00999a8-kube-api-access-gxk5g\") pod \"machine-config-controller-84d6567774-bxlcj\" (UID: \"d7b7fa5f-d879-476a-830b-4775e00999a8\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bxlcj" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.843676 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5e3f78d9-5773-468b-8db7-592277480519-metrics-tls\") pod \"dns-operator-744455d44c-qgc45\" (UID: \"5e3f78d9-5773-468b-8db7-592277480519\") " pod="openshift-dns-operator/dns-operator-744455d44c-qgc45" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.843703 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0addfb32-dd21-4b73-8c78-75d5ac30c014-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-r6d62\" (UID: \"0addfb32-dd21-4b73-8c78-75d5ac30c014\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-r6d62" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.843726 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a4af8e42-cbb1-4fe9-b687-bb6456785f6e-etcd-client\") pod \"etcd-operator-b45778765-9lmv4\" (UID: \"a4af8e42-cbb1-4fe9-b687-bb6456785f6e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9lmv4" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.843759 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67jmp\" (UniqueName: \"kubernetes.io/projected/01597591-21de-4b1b-a719-c5417a0b5da0-kube-api-access-67jmp\") pod \"openshift-controller-manager-operator-756b6f6bc6-jltxm\" (UID: \"01597591-21de-4b1b-a719-c5417a0b5da0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jltxm" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.844047 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgzt8\" (UniqueName: \"kubernetes.io/projected/d67e22a2-dc2b-4582-bfe0-7afff25995fb-kube-api-access-wgzt8\") pod \"collect-profiles-29484825-rkvnc\" (UID: \"d67e22a2-dc2b-4582-bfe0-7afff25995fb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484825-rkvnc" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.844095 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ae32dda8-ea09-47e9-ba96-702d8f0747ef-srv-cert\") pod 
\"olm-operator-6b444d44fb-pgxdf\" (UID: \"ae32dda8-ea09-47e9-ba96-702d8f0747ef\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pgxdf" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.844126 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wrs6\" (UniqueName: \"kubernetes.io/projected/819be2f9-96db-4fda-9460-658323fcc772-kube-api-access-6wrs6\") pod \"csi-hostpathplugin-nf29x\" (UID: \"819be2f9-96db-4fda-9460-658323fcc772\") " pod="hostpath-provisioner/csi-hostpathplugin-nf29x" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.844155 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2887aee0-19a7-439f-8f40-ef40970ab796-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-xrxrn\" (UID: \"2887aee0-19a7-439f-8f40-ef40970ab796\") " pod="openshift-multus/cni-sysctl-allowlist-ds-xrxrn" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.844181 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d7b7fa5f-d879-476a-830b-4775e00999a8-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-bxlcj\" (UID: \"d7b7fa5f-d879-476a-830b-4775e00999a8\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bxlcj" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.844206 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c973e8f6-4b20-40b2-ad5d-b2d95cc8354f-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-ljzlp\" (UID: \"c973e8f6-4b20-40b2-ad5d-b2d95cc8354f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ljzlp" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.844233 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d67e22a2-dc2b-4582-bfe0-7afff25995fb-config-volume\") pod \"collect-profiles-29484825-rkvnc\" (UID: \"d67e22a2-dc2b-4582-bfe0-7afff25995fb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484825-rkvnc" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.844273 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dbpx\" (UniqueName: \"kubernetes.io/projected/0addfb32-dd21-4b73-8c78-75d5ac30c014-kube-api-access-2dbpx\") pod \"kube-storage-version-migrator-operator-b67b599dd-r6d62\" (UID: \"0addfb32-dd21-4b73-8c78-75d5ac30c014\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-r6d62" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.844299 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/590063fd-38d4-4642-82cb-81e1c558ba31-tmpfs\") pod \"packageserver-d55dfcdfc-knnk2\" (UID: \"590063fd-38d4-4642-82cb-81e1c558ba31\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-knnk2" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.844335 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/2f6504ab-14a9-4c81-b8ae-556b648168db-default-certificate\") pod \"router-default-5444994796-s9lg4\" (UID: \"2f6504ab-14a9-4c81-b8ae-556b648168db\") " 
pod="openshift-ingress/router-default-5444994796-s9lg4" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.844376 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a68f253c-e45c-425a-9cdd-3e216ac4b87b-auth-proxy-config\") pod \"machine-config-operator-74547568cd-4qhbk\" (UID: \"a68f253c-e45c-425a-9cdd-3e216ac4b87b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4qhbk" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.844447 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dscx7\" (UniqueName: \"kubernetes.io/projected/5ae37707-9ec5-4019-b199-8d413fefc824-kube-api-access-dscx7\") pod \"package-server-manager-789f6589d5-k57vs\" (UID: \"5ae37707-9ec5-4019-b199-8d413fefc824\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-k57vs" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.844868 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/382bfe09-6fc6-4cf4-96ba-4861716caf3d-srv-cert\") pod \"catalog-operator-68c6474976-c9kq2\" (UID: \"382bfe09-6fc6-4cf4-96ba-4861716caf3d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c9kq2" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.844907 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c973e8f6-4b20-40b2-ad5d-b2d95cc8354f-config\") pod \"kube-controller-manager-operator-78b949d7b-ljzlp\" (UID: \"c973e8f6-4b20-40b2-ad5d-b2d95cc8354f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ljzlp" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.844936 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d7bdfa8a-4ff0-40c2-87d4-78670816efaa-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kvlk2\" (UID: \"d7bdfa8a-4ff0-40c2-87d4-78670816efaa\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kvlk2" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.844958 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dfkc\" (UniqueName: \"kubernetes.io/projected/a68f253c-e45c-425a-9cdd-3e216ac4b87b-kube-api-access-7dfkc\") pod \"machine-config-operator-74547568cd-4qhbk\" (UID: \"a68f253c-e45c-425a-9cdd-3e216ac4b87b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4qhbk" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.844986 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/2305f385-0c21-405a-881e-e55438dae23f-signing-cabundle\") pod \"service-ca-9c57cc56f-88brd\" (UID: \"2305f385-0c21-405a-881e-e55438dae23f\") " pod="openshift-service-ca/service-ca-9c57cc56f-88brd" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.845014 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r48p2\" (UniqueName: \"kubernetes.io/projected/9d181b51-5c53-4bee-8ebd-c2414d1e9394-kube-api-access-r48p2\") pod \"migrator-59844c95c7-lxmrq\" (UID: \"9d181b51-5c53-4bee-8ebd-c2414d1e9394\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lxmrq" Jan 22 
13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.845032 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/0d3db073-fbeb-448b-9df8-20eeff19ef5b-certs\") pod \"machine-config-server-hhf5x\" (UID: \"0d3db073-fbeb-448b-9df8-20eeff19ef5b\") " pod="openshift-machine-config-operator/machine-config-server-hhf5x" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.845055 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/0d3db073-fbeb-448b-9df8-20eeff19ef5b-node-bootstrap-token\") pod \"machine-config-server-hhf5x\" (UID: \"0d3db073-fbeb-448b-9df8-20eeff19ef5b\") " pod="openshift-machine-config-operator/machine-config-server-hhf5x" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.845083 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7bdfa8a-4ff0-40c2-87d4-78670816efaa-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kvlk2\" (UID: \"d7bdfa8a-4ff0-40c2-87d4-78670816efaa\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kvlk2" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.846687 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4dc1117-9346-474f-aa1f-43f390e34d2c-cert\") pod \"ingress-canary-h26n4\" (UID: \"b4dc1117-9346-474f-aa1f-43f390e34d2c\") " pod="openshift-ingress-canary/ingress-canary-h26n4" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.846739 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/a4af8e42-cbb1-4fe9-b687-bb6456785f6e-etcd-service-ca\") pod \"etcd-operator-b45778765-9lmv4\" (UID: \"a4af8e42-cbb1-4fe9-b687-bb6456785f6e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9lmv4" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.846815 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jr4tf\" (UniqueName: \"kubernetes.io/projected/0d3db073-fbeb-448b-9df8-20eeff19ef5b-kube-api-access-jr4tf\") pod \"machine-config-server-hhf5x\" (UID: \"0d3db073-fbeb-448b-9df8-20eeff19ef5b\") " pod="openshift-machine-config-operator/machine-config-server-hhf5x" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.846860 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f6504ab-14a9-4c81-b8ae-556b648168db-service-ca-bundle\") pod \"router-default-5444994796-s9lg4\" (UID: \"2f6504ab-14a9-4c81-b8ae-556b648168db\") " pod="openshift-ingress/router-default-5444994796-s9lg4" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.846886 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgdkk\" (UniqueName: \"kubernetes.io/projected/ae32dda8-ea09-47e9-ba96-702d8f0747ef-kube-api-access-hgdkk\") pod \"olm-operator-6b444d44fb-pgxdf\" (UID: \"ae32dda8-ea09-47e9-ba96-702d8f0747ef\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pgxdf" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.847845 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/a4af8e42-cbb1-4fe9-b687-bb6456785f6e-etcd-ca\") pod 
\"etcd-operator-b45778765-9lmv4\" (UID: \"a4af8e42-cbb1-4fe9-b687-bb6456785f6e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9lmv4" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.848136 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4af8e42-cbb1-4fe9-b687-bb6456785f6e-config\") pod \"etcd-operator-b45778765-9lmv4\" (UID: \"a4af8e42-cbb1-4fe9-b687-bb6456785f6e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9lmv4" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.848194 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a68f253c-e45c-425a-9cdd-3e216ac4b87b-auth-proxy-config\") pod \"machine-config-operator-74547568cd-4qhbk\" (UID: \"a68f253c-e45c-425a-9cdd-3e216ac4b87b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4qhbk" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.849252 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a61563b-eef9-4282-b1a1-db7e128ef50b-config\") pod \"kube-apiserver-operator-766d6c64bb-zcnlm\" (UID: \"1a61563b-eef9-4282-b1a1-db7e128ef50b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zcnlm" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.850132 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2f6504ab-14a9-4c81-b8ae-556b648168db-metrics-certs\") pod \"router-default-5444994796-s9lg4\" (UID: \"2f6504ab-14a9-4c81-b8ae-556b648168db\") " pod="openshift-ingress/router-default-5444994796-s9lg4" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.850461 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/382bfe09-6fc6-4cf4-96ba-4861716caf3d-profile-collector-cert\") pod \"catalog-operator-68c6474976-c9kq2\" (UID: \"382bfe09-6fc6-4cf4-96ba-4861716caf3d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c9kq2" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.850925 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2887aee0-19a7-439f-8f40-ef40970ab796-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-xrxrn\" (UID: \"2887aee0-19a7-439f-8f40-ef40970ab796\") " pod="openshift-multus/cni-sysctl-allowlist-ds-xrxrn" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.851403 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/a4af8e42-cbb1-4fe9-b687-bb6456785f6e-etcd-service-ca\") pod \"etcd-operator-b45778765-9lmv4\" (UID: \"a4af8e42-cbb1-4fe9-b687-bb6456785f6e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9lmv4" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.851818 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0addfb32-dd21-4b73-8c78-75d5ac30c014-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-r6d62\" (UID: \"0addfb32-dd21-4b73-8c78-75d5ac30c014\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-r6d62" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.852299 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5e3f78d9-5773-468b-8db7-592277480519-metrics-tls\") pod \"dns-operator-744455d44c-qgc45\" (UID: \"5e3f78d9-5773-468b-8db7-592277480519\") " pod="openshift-dns-operator/dns-operator-744455d44c-qgc45" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.852821 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/590063fd-38d4-4642-82cb-81e1c558ba31-tmpfs\") pod \"packageserver-d55dfcdfc-knnk2\" (UID: \"590063fd-38d4-4642-82cb-81e1c558ba31\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-knnk2" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.852839 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/2305f385-0c21-405a-881e-e55438dae23f-signing-key\") pod \"service-ca-9c57cc56f-88brd\" (UID: \"2305f385-0c21-405a-881e-e55438dae23f\") " pod="openshift-service-ca/service-ca-9c57cc56f-88brd" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.852898 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d7b7fa5f-d879-476a-830b-4775e00999a8-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-bxlcj\" (UID: \"d7b7fa5f-d879-476a-830b-4775e00999a8\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bxlcj" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.853482 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/382bfe09-6fc6-4cf4-96ba-4861716caf3d-srv-cert\") pod \"catalog-operator-68c6474976-c9kq2\" (UID: \"382bfe09-6fc6-4cf4-96ba-4861716caf3d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c9kq2" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.853572 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7bdfa8a-4ff0-40c2-87d4-78670816efaa-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kvlk2\" (UID: \"d7bdfa8a-4ff0-40c2-87d4-78670816efaa\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kvlk2" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.853735 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dfeb2496-20c0-4ec2-b289-bbc8bd8aa531-serving-cert\") pod \"service-ca-operator-777779d784-nbqnt\" (UID: \"dfeb2496-20c0-4ec2-b289-bbc8bd8aa531\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nbqnt" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.853776 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d67e22a2-dc2b-4582-bfe0-7afff25995fb-secret-volume\") pod \"collect-profiles-29484825-rkvnc\" (UID: \"d67e22a2-dc2b-4582-bfe0-7afff25995fb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484825-rkvnc" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.853905 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a61563b-eef9-4282-b1a1-db7e128ef50b-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-zcnlm\" (UID: \"1a61563b-eef9-4282-b1a1-db7e128ef50b\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zcnlm" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.853972 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v997z\" (UniqueName: \"kubernetes.io/projected/2305f385-0c21-405a-881e-e55438dae23f-kube-api-access-v997z\") pod \"service-ca-9c57cc56f-88brd\" (UID: \"2305f385-0c21-405a-881e-e55438dae23f\") " pod="openshift-service-ca/service-ca-9c57cc56f-88brd" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.853995 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/819be2f9-96db-4fda-9460-658323fcc772-plugins-dir\") pod \"csi-hostpathplugin-nf29x\" (UID: \"819be2f9-96db-4fda-9460-658323fcc772\") " pod="hostpath-provisioner/csi-hostpathplugin-nf29x" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.854040 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/645ac481-c9c8-4d4a-a7e4-2bc78dda109d-metrics-tls\") pod \"ingress-operator-5b745b69d9-5kvkh\" (UID: \"645ac481-c9c8-4d4a-a7e4-2bc78dda109d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5kvkh" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.854062 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/2f6504ab-14a9-4c81-b8ae-556b648168db-stats-auth\") pod \"router-default-5444994796-s9lg4\" (UID: \"2f6504ab-14a9-4c81-b8ae-556b648168db\") " pod="openshift-ingress/router-default-5444994796-s9lg4" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.854089 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/2887aee0-19a7-439f-8f40-ef40970ab796-ready\") pod \"cni-sysctl-allowlist-ds-xrxrn\" (UID: \"2887aee0-19a7-439f-8f40-ef40970ab796\") " pod="openshift-multus/cni-sysctl-allowlist-ds-xrxrn" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.854101 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/3a5126ab-15b9-4b80-ab92-de1b1af3d4a7-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-25vhb\" (UID: \"3a5126ab-15b9-4b80-ab92-de1b1af3d4a7\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-25vhb" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.854114 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/590063fd-38d4-4642-82cb-81e1c558ba31-apiservice-cert\") pod \"packageserver-d55dfcdfc-knnk2\" (UID: \"590063fd-38d4-4642-82cb-81e1c558ba31\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-knnk2" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.854274 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01597591-21de-4b1b-a719-c5417a0b5da0-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-jltxm\" (UID: \"01597591-21de-4b1b-a719-c5417a0b5da0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jltxm" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.854317 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-79thb\" (UniqueName: \"kubernetes.io/projected/dfeb2496-20c0-4ec2-b289-bbc8bd8aa531-kube-api-access-79thb\") pod \"service-ca-operator-777779d784-nbqnt\" (UID: \"dfeb2496-20c0-4ec2-b289-bbc8bd8aa531\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nbqnt" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.854349 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a68f253c-e45c-425a-9cdd-3e216ac4b87b-images\") pod \"machine-config-operator-74547568cd-4qhbk\" (UID: \"a68f253c-e45c-425a-9cdd-3e216ac4b87b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4qhbk" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.854382 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/819be2f9-96db-4fda-9460-658323fcc772-socket-dir\") pod \"csi-hostpathplugin-nf29x\" (UID: \"819be2f9-96db-4fda-9460-658323fcc772\") " pod="hostpath-provisioner/csi-hostpathplugin-nf29x" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.854420 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfeb2496-20c0-4ec2-b289-bbc8bd8aa531-config\") pod \"service-ca-operator-777779d784-nbqnt\" (UID: \"dfeb2496-20c0-4ec2-b289-bbc8bd8aa531\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nbqnt" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.854480 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a68f253c-e45c-425a-9cdd-3e216ac4b87b-proxy-tls\") pod \"machine-config-operator-74547568cd-4qhbk\" (UID: \"a68f253c-e45c-425a-9cdd-3e216ac4b87b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4qhbk" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.854554 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqdgq\" (UniqueName: \"kubernetes.io/projected/645ac481-c9c8-4d4a-a7e4-2bc78dda109d-kube-api-access-hqdgq\") pod \"ingress-operator-5b745b69d9-5kvkh\" (UID: \"645ac481-c9c8-4d4a-a7e4-2bc78dda109d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5kvkh" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.854706 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01597591-21de-4b1b-a719-c5417a0b5da0-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-jltxm\" (UID: \"01597591-21de-4b1b-a719-c5417a0b5da0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jltxm" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.854918 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/5ae37707-9ec5-4019-b199-8d413fefc824-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-k57vs\" (UID: \"5ae37707-9ec5-4019-b199-8d413fefc824\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-k57vs" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.854968 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5frn\" (UniqueName: 
\"kubernetes.io/projected/2f6504ab-14a9-4c81-b8ae-556b648168db-kube-api-access-h5frn\") pod \"router-default-5444994796-s9lg4\" (UID: \"2f6504ab-14a9-4c81-b8ae-556b648168db\") " pod="openshift-ingress/router-default-5444994796-s9lg4" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.855000 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d7b7fa5f-d879-476a-830b-4775e00999a8-proxy-tls\") pod \"machine-config-controller-84d6567774-bxlcj\" (UID: \"d7b7fa5f-d879-476a-830b-4775e00999a8\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bxlcj" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.855038 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/64bceb92-68bf-42c6-98e6-94c1eea9e122-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-9dz5n\" (UID: \"64bceb92-68bf-42c6-98e6-94c1eea9e122\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-9dz5n" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.855073 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kz9mr\" (UniqueName: \"kubernetes.io/projected/a4af8e42-cbb1-4fe9-b687-bb6456785f6e-kube-api-access-kz9mr\") pod \"etcd-operator-b45778765-9lmv4\" (UID: \"a4af8e42-cbb1-4fe9-b687-bb6456785f6e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9lmv4" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.855101 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/645ac481-c9c8-4d4a-a7e4-2bc78dda109d-trusted-ca\") pod \"ingress-operator-5b745b69d9-5kvkh\" (UID: \"645ac481-c9c8-4d4a-a7e4-2bc78dda109d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5kvkh" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.855128 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hccqs\" (UniqueName: \"kubernetes.io/projected/590063fd-38d4-4642-82cb-81e1c558ba31-kube-api-access-hccqs\") pod \"packageserver-d55dfcdfc-knnk2\" (UID: \"590063fd-38d4-4642-82cb-81e1c558ba31\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-knnk2" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.855161 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0addfb32-dd21-4b73-8c78-75d5ac30c014-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-r6d62\" (UID: \"0addfb32-dd21-4b73-8c78-75d5ac30c014\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-r6d62" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.855188 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6d45h\" (UniqueName: \"kubernetes.io/projected/8a293f8d-adc5-4e55-a4a3-f729a768ab0d-kube-api-access-6d45h\") pod \"dns-default-x7rm8\" (UID: \"8a293f8d-adc5-4e55-a4a3-f729a768ab0d\") " pod="openshift-dns/dns-default-x7rm8" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.855220 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/645ac481-c9c8-4d4a-a7e4-2bc78dda109d-bound-sa-token\") pod \"ingress-operator-5b745b69d9-5kvkh\" (UID: \"645ac481-c9c8-4d4a-a7e4-2bc78dda109d\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5kvkh" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.855265 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7bdfa8a-4ff0-40c2-87d4-78670816efaa-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kvlk2\" (UID: \"d7bdfa8a-4ff0-40c2-87d4-78670816efaa\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kvlk2" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.855293 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngkjj\" (UniqueName: \"kubernetes.io/projected/6f60519c-a85e-483e-ac46-8cde2dbbd166-kube-api-access-ngkjj\") pod \"marketplace-operator-79b997595-fjjf5\" (UID: \"6f60519c-a85e-483e-ac46-8cde2dbbd166\") " pod="openshift-marketplace/marketplace-operator-79b997595-fjjf5" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.855325 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skrtz\" (UniqueName: \"kubernetes.io/projected/3a5126ab-15b9-4b80-ab92-de1b1af3d4a7-kube-api-access-skrtz\") pod \"control-plane-machine-set-operator-78cbb6b69f-25vhb\" (UID: \"3a5126ab-15b9-4b80-ab92-de1b1af3d4a7\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-25vhb" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.855365 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2887aee0-19a7-439f-8f40-ef40970ab796-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-xrxrn\" (UID: \"2887aee0-19a7-439f-8f40-ef40970ab796\") " pod="openshift-multus/cni-sysctl-allowlist-ds-xrxrn" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.855394 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxb9p\" (UniqueName: \"kubernetes.io/projected/2887aee0-19a7-439f-8f40-ef40970ab796-kube-api-access-lxb9p\") pod \"cni-sysctl-allowlist-ds-xrxrn\" (UID: \"2887aee0-19a7-439f-8f40-ef40970ab796\") " pod="openshift-multus/cni-sysctl-allowlist-ds-xrxrn" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.855426 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wgc5\" (UniqueName: \"kubernetes.io/projected/5e3f78d9-5773-468b-8db7-592277480519-kube-api-access-9wgc5\") pod \"dns-operator-744455d44c-qgc45\" (UID: \"5e3f78d9-5773-468b-8db7-592277480519\") " pod="openshift-dns-operator/dns-operator-744455d44c-qgc45" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.855936 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/819be2f9-96db-4fda-9460-658323fcc772-plugins-dir\") pod \"csi-hostpathplugin-nf29x\" (UID: \"819be2f9-96db-4fda-9460-658323fcc772\") " pod="hostpath-provisioner/csi-hostpathplugin-nf29x" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.856254 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01597591-21de-4b1b-a719-c5417a0b5da0-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-jltxm\" (UID: \"01597591-21de-4b1b-a719-c5417a0b5da0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jltxm" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.856338 4743 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a4af8e42-cbb1-4fe9-b687-bb6456785f6e-etcd-client\") pod \"etcd-operator-b45778765-9lmv4\" (UID: \"a4af8e42-cbb1-4fe9-b687-bb6456785f6e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9lmv4" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.856503 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ae32dda8-ea09-47e9-ba96-702d8f0747ef-profile-collector-cert\") pod \"olm-operator-6b444d44fb-pgxdf\" (UID: \"ae32dda8-ea09-47e9-ba96-702d8f0747ef\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pgxdf" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.856619 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/819be2f9-96db-4fda-9460-658323fcc772-socket-dir\") pod \"csi-hostpathplugin-nf29x\" (UID: \"819be2f9-96db-4fda-9460-658323fcc772\") " pod="hostpath-provisioner/csi-hostpathplugin-nf29x" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.856664 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/2887aee0-19a7-439f-8f40-ef40970ab796-ready\") pod \"cni-sysctl-allowlist-ds-xrxrn\" (UID: \"2887aee0-19a7-439f-8f40-ef40970ab796\") " pod="openshift-multus/cni-sysctl-allowlist-ds-xrxrn" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.857493 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a68f253c-e45c-425a-9cdd-3e216ac4b87b-images\") pod \"machine-config-operator-74547568cd-4qhbk\" (UID: \"a68f253c-e45c-425a-9cdd-3e216ac4b87b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4qhbk" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.857607 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/2305f385-0c21-405a-881e-e55438dae23f-signing-cabundle\") pod \"service-ca-9c57cc56f-88brd\" (UID: \"2305f385-0c21-405a-881e-e55438dae23f\") " pod="openshift-service-ca/service-ca-9c57cc56f-88brd" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.857627 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/645ac481-c9c8-4d4a-a7e4-2bc78dda109d-trusted-ca\") pod \"ingress-operator-5b745b69d9-5kvkh\" (UID: \"645ac481-c9c8-4d4a-a7e4-2bc78dda109d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5kvkh" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.857895 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7bdfa8a-4ff0-40c2-87d4-78670816efaa-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kvlk2\" (UID: \"d7bdfa8a-4ff0-40c2-87d4-78670816efaa\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kvlk2" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.857900 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a4af8e42-cbb1-4fe9-b687-bb6456785f6e-serving-cert\") pod \"etcd-operator-b45778765-9lmv4\" (UID: \"a4af8e42-cbb1-4fe9-b687-bb6456785f6e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9lmv4" Jan 22 13:46:41 crc 
kubenswrapper[4743]: I0122 13:46:41.858444 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a61563b-eef9-4282-b1a1-db7e128ef50b-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-zcnlm\" (UID: \"1a61563b-eef9-4282-b1a1-db7e128ef50b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zcnlm" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.858877 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f6504ab-14a9-4c81-b8ae-556b648168db-service-ca-bundle\") pod \"router-default-5444994796-s9lg4\" (UID: \"2f6504ab-14a9-4c81-b8ae-556b648168db\") " pod="openshift-ingress/router-default-5444994796-s9lg4" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.859583 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/64bceb92-68bf-42c6-98e6-94c1eea9e122-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-9dz5n\" (UID: \"64bceb92-68bf-42c6-98e6-94c1eea9e122\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-9dz5n" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.859926 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a68f253c-e45c-425a-9cdd-3e216ac4b87b-proxy-tls\") pod \"machine-config-operator-74547568cd-4qhbk\" (UID: \"a68f253c-e45c-425a-9cdd-3e216ac4b87b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4qhbk" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.860352 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/2f6504ab-14a9-4c81-b8ae-556b648168db-default-certificate\") pod \"router-default-5444994796-s9lg4\" (UID: \"2f6504ab-14a9-4c81-b8ae-556b648168db\") " pod="openshift-ingress/router-default-5444994796-s9lg4" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.862139 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d7b7fa5f-d879-476a-830b-4775e00999a8-proxy-tls\") pod \"machine-config-controller-84d6567774-bxlcj\" (UID: \"d7b7fa5f-d879-476a-830b-4775e00999a8\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bxlcj" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.863017 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6f60519c-a85e-483e-ac46-8cde2dbbd166-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-fjjf5\" (UID: \"6f60519c-a85e-483e-ac46-8cde2dbbd166\") " pod="openshift-marketplace/marketplace-operator-79b997595-fjjf5" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.863658 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.863946 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/2f6504ab-14a9-4c81-b8ae-556b648168db-stats-auth\") pod \"router-default-5444994796-s9lg4\" (UID: \"2f6504ab-14a9-4c81-b8ae-556b648168db\") " pod="openshift-ingress/router-default-5444994796-s9lg4" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.864505 4743 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/645ac481-c9c8-4d4a-a7e4-2bc78dda109d-metrics-tls\") pod \"ingress-operator-5b745b69d9-5kvkh\" (UID: \"645ac481-c9c8-4d4a-a7e4-2bc78dda109d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5kvkh" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.865897 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0addfb32-dd21-4b73-8c78-75d5ac30c014-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-r6d62\" (UID: \"0addfb32-dd21-4b73-8c78-75d5ac30c014\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-r6d62" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.890628 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.895433 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6f60519c-a85e-483e-ac46-8cde2dbbd166-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-fjjf5\" (UID: \"6f60519c-a85e-483e-ac46-8cde2dbbd166\") " pod="openshift-marketplace/marketplace-operator-79b997595-fjjf5" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.903778 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.923408 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.943158 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.956300 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 13:46:41 crc kubenswrapper[4743]: E0122 13:46:41.956705 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 13:46:42.456480089 +0000 UTC m=+39.011523422 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.956775 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:41 crc kubenswrapper[4743]: E0122 13:46:41.957117 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 13:46:42.457101776 +0000 UTC m=+39.012144939 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-94xwn" (UID: "7b22abd3-ecba-46ba-a310-99000f911356") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.963643 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Jan 22 13:46:41 crc kubenswrapper[4743]: I0122 13:46:41.983745 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.002815 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.007602 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ae32dda8-ea09-47e9-ba96-702d8f0747ef-srv-cert\") pod \"olm-operator-6b444d44fb-pgxdf\" (UID: \"ae32dda8-ea09-47e9-ba96-702d8f0747ef\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pgxdf" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.024129 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.027658 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-ln28w" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.033080 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d67e22a2-dc2b-4582-bfe0-7afff25995fb-config-volume\") pod \"collect-profiles-29484825-rkvnc\" (UID: \"d67e22a2-dc2b-4582-bfe0-7afff25995fb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484825-rkvnc" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.043051 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.058731 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 13:46:42 crc kubenswrapper[4743]: E0122 13:46:42.059100 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 13:46:42.559049781 +0000 UTC m=+39.114093014 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.059719 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:42 crc kubenswrapper[4743]: E0122 13:46:42.060643 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 13:46:42.560617254 +0000 UTC m=+39.115660457 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-94xwn" (UID: "7b22abd3-ecba-46ba-a310-99000f911356") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.063571 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.067402 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/590063fd-38d4-4642-82cb-81e1c558ba31-webhook-cert\") pod \"packageserver-d55dfcdfc-knnk2\" (UID: \"590063fd-38d4-4642-82cb-81e1c558ba31\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-knnk2" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.069452 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/590063fd-38d4-4642-82cb-81e1c558ba31-apiservice-cert\") pod \"packageserver-d55dfcdfc-knnk2\" (UID: \"590063fd-38d4-4642-82cb-81e1c558ba31\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-knnk2" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.082785 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.092703 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/5ae37707-9ec5-4019-b199-8d413fefc824-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-k57vs\" (UID: \"5ae37707-9ec5-4019-b199-8d413fefc824\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-k57vs" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.103227 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.123128 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.133203 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/0d3db073-fbeb-448b-9df8-20eeff19ef5b-node-bootstrap-token\") pod \"machine-config-server-hhf5x\" (UID: \"0d3db073-fbeb-448b-9df8-20eeff19ef5b\") " pod="openshift-machine-config-operator/machine-config-server-hhf5x" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.142443 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.151127 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/0d3db073-fbeb-448b-9df8-20eeff19ef5b-certs\") pod \"machine-config-server-hhf5x\" (UID: \"0d3db073-fbeb-448b-9df8-20eeff19ef5b\") " pod="openshift-machine-config-operator/machine-config-server-hhf5x" Jan 22 13:46:42 crc 
kubenswrapper[4743]: I0122 13:46:42.161362 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 13:46:42 crc kubenswrapper[4743]: E0122 13:46:42.161538 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 13:46:42.661499609 +0000 UTC m=+39.216542822 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.162263 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:42 crc kubenswrapper[4743]: E0122 13:46:42.163245 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 13:46:42.663213336 +0000 UTC m=+39.218256549 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-94xwn" (UID: "7b22abd3-ecba-46ba-a310-99000f911356") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.163555 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.184005 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.202980 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.223249 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.243398 4743 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.263388 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.263713 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 13:46:42 crc kubenswrapper[4743]: E0122 13:46:42.263917 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 13:46:42.763898797 +0000 UTC m=+39.318941970 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.264072 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:42 crc kubenswrapper[4743]: E0122 13:46:42.264436 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-01-22 13:46:42.764424071 +0000 UTC m=+39.319467234 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-94xwn" (UID: "7b22abd3-ecba-46ba-a310-99000f911356") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.283579 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.287944 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2887aee0-19a7-439f-8f40-ef40970ab796-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-xrxrn\" (UID: \"2887aee0-19a7-439f-8f40-ef40970ab796\") " pod="openshift-multus/cni-sysctl-allowlist-ds-xrxrn" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.303760 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.323680 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.343595 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.363752 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.365328 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 13:46:42 crc kubenswrapper[4743]: E0122 13:46:42.365538 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 13:46:42.865490652 +0000 UTC m=+39.420533845 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.366026 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:42 crc kubenswrapper[4743]: E0122 13:46:42.366535 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 13:46:42.86651832 +0000 UTC m=+39.421561513 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-94xwn" (UID: "7b22abd3-ecba-46ba-a310-99000f911356") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.376346 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4dc1117-9346-474f-aa1f-43f390e34d2c-cert\") pod \"ingress-canary-h26n4\" (UID: \"b4dc1117-9346-474f-aa1f-43f390e34d2c\") " pod="openshift-ingress-canary/ingress-canary-h26n4" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.407683 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dfeb2496-20c0-4ec2-b289-bbc8bd8aa531-serving-cert\") pod \"service-ca-operator-777779d784-nbqnt\" (UID: \"dfeb2496-20c0-4ec2-b289-bbc8bd8aa531\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nbqnt" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.407765 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8a293f8d-adc5-4e55-a4a3-f729a768ab0d-metrics-tls\") pod \"dns-default-x7rm8\" (UID: \"8a293f8d-adc5-4e55-a4a3-f729a768ab0d\") " pod="openshift-dns/dns-default-x7rm8" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.407767 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c973e8f6-4b20-40b2-ad5d-b2d95cc8354f-config\") pod \"kube-controller-manager-operator-78b949d7b-ljzlp\" (UID: \"c973e8f6-4b20-40b2-ad5d-b2d95cc8354f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ljzlp" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.407948 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfeb2496-20c0-4ec2-b289-bbc8bd8aa531-config\") pod \"service-ca-operator-777779d784-nbqnt\" 
(UID: \"dfeb2496-20c0-4ec2-b289-bbc8bd8aa531\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nbqnt" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.408198 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8a293f8d-adc5-4e55-a4a3-f729a768ab0d-config-volume\") pod \"dns-default-x7rm8\" (UID: \"8a293f8d-adc5-4e55-a4a3-f729a768ab0d\") " pod="openshift-dns/dns-default-x7rm8" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.408899 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c973e8f6-4b20-40b2-ad5d-b2d95cc8354f-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-ljzlp\" (UID: \"c973e8f6-4b20-40b2-ad5d-b2d95cc8354f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ljzlp" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.414156 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fcj9\" (UniqueName: \"kubernetes.io/projected/7a78c10f-79c4-417b-ae2b-3618d0f6c6cd-kube-api-access-7fcj9\") pod \"openshift-config-operator-7777fb866f-lpxk6\" (UID: \"7a78c10f-79c4-417b-ae2b-3618d0f6c6cd\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lpxk6" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.421971 4743 request.go:700] Waited for 1.819038907s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-apiserver-operator/serviceaccounts/openshift-apiserver-operator/token Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.427580 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wx6fg\" (UniqueName: \"kubernetes.io/projected/b46225f4-dd80-45ae-9ffa-310527d770fc-kube-api-access-wx6fg\") pod \"machine-api-operator-5694c8668f-g2ptk\" (UID: \"b46225f4-dd80-45ae-9ffa-310527d770fc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-g2ptk" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.433328 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-g2ptk" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.440722 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rshf5\" (UniqueName: \"kubernetes.io/projected/fd9ae554-93cb-478a-8d60-d701586b31b9-kube-api-access-rshf5\") pod \"openshift-apiserver-operator-796bbdcf4f-fww5f\" (UID: \"fd9ae554-93cb-478a-8d60-d701586b31b9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fww5f" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.464255 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppmnl\" (UniqueName: \"kubernetes.io/projected/8134e970-25c9-4d3e-9cff-48a8b520b0da-kube-api-access-ppmnl\") pod \"downloads-7954f5f757-ksn26\" (UID: \"8134e970-25c9-4d3e-9cff-48a8b520b0da\") " pod="openshift-console/downloads-7954f5f757-ksn26" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.469003 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 13:46:42 crc kubenswrapper[4743]: E0122 13:46:42.469173 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 13:46:42.969145493 +0000 UTC m=+39.524188676 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.469692 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:42 crc kubenswrapper[4743]: E0122 13:46:42.470193 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 13:46:42.970182182 +0000 UTC m=+39.525225365 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-94xwn" (UID: "7b22abd3-ecba-46ba-a310-99000f911356") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.488082 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2n9ts\" (UniqueName: \"kubernetes.io/projected/037eda14-3c3c-4b24-bb18-dea65e3e4548-kube-api-access-2n9ts\") pod \"controller-manager-879f6c89f-z5dt9\" (UID: \"037eda14-3c3c-4b24-bb18-dea65e3e4548\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z5dt9" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.505142 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fww5f" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.506230 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wkps\" (UniqueName: \"kubernetes.io/projected/9c60ce0a-ae39-43c2-a305-3c83391df7cc-kube-api-access-6wkps\") pod \"console-operator-58897d9998-nkmcr\" (UID: \"9c60ce0a-ae39-43c2-a305-3c83391df7cc\") " pod="openshift-console-operator/console-operator-58897d9998-nkmcr" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.523726 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4kr6\" (UniqueName: \"kubernetes.io/projected/5fd90cbe-98a9-450d-b9e8-83bca7304155-kube-api-access-v4kr6\") pod \"cluster-image-registry-operator-dc59b4c8b-z8gz4\" (UID: \"5fd90cbe-98a9-450d-b9e8-83bca7304155\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z8gz4" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.541250 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtn7p\" (UniqueName: \"kubernetes.io/projected/7cb3dbbd-43f2-49c7-9729-5acf3f400598-kube-api-access-qtn7p\") pod \"cluster-samples-operator-665b6dd947-x7q7v\" (UID: \"7cb3dbbd-43f2-49c7-9729-5acf3f400598\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-x7q7v" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.559995 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5fd90cbe-98a9-450d-b9e8-83bca7304155-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-z8gz4\" (UID: \"5fd90cbe-98a9-450d-b9e8-83bca7304155\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z8gz4" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.562529 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.570745 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 13:46:42 crc kubenswrapper[4743]: E0122 13:46:42.570973 4743 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 13:46:43.070940304 +0000 UTC m=+39.625983467 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.571522 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:42 crc kubenswrapper[4743]: E0122 13:46:42.571883 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 13:46:43.071871249 +0000 UTC m=+39.626914412 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-94xwn" (UID: "7b22abd3-ecba-46ba-a310-99000f911356") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.584264 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.607576 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-ln28w"] Jan 22 13:46:42 crc kubenswrapper[4743]: W0122 13:46:42.618847 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda11f3169_f731_464a_a7d4_9dea61d28398.slice/crio-8213d799c7281777c4e0021c20e909025170d0aaf1aebfd52c5242a6b1b6c682 WatchSource:0}: Error finding container 8213d799c7281777c4e0021c20e909025170d0aaf1aebfd52c5242a6b1b6c682: Status 404 returned error can't find the container with id 8213d799c7281777c4e0021c20e909025170d0aaf1aebfd52c5242a6b1b6c682 Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.621311 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58p9q\" (UniqueName: \"kubernetes.io/projected/55ae1d5b-f700-4615-83ff-78263c4539d8-kube-api-access-58p9q\") pod \"machine-approver-56656f9798-c65ml\" (UID: \"55ae1d5b-f700-4615-83ff-78263c4539d8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c65ml" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.635031 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lpxk6" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.637380 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s488v\" (UniqueName: \"kubernetes.io/projected/9da7e577-175f-451e-8ddc-fe17a70b2d2c-kube-api-access-s488v\") pod \"apiserver-76f77b778f-tff5x\" (UID: \"9da7e577-175f-451e-8ddc-fe17a70b2d2c\") " pod="openshift-apiserver/apiserver-76f77b778f-tff5x" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.644150 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-g2ptk"] Jan 22 13:46:42 crc kubenswrapper[4743]: W0122 13:46:42.653373 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb46225f4_dd80_45ae_9ffa_310527d770fc.slice/crio-0547992c104fa9ff2eb219cc7e32d8bfa102993709e2413d92595a9c01f7e01a WatchSource:0}: Error finding container 0547992c104fa9ff2eb219cc7e32d8bfa102993709e2413d92595a9c01f7e01a: Status 404 returned error can't find the container with id 0547992c104fa9ff2eb219cc7e32d8bfa102993709e2413d92595a9c01f7e01a Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.657185 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-ksn26" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.659088 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkrnb\" (UniqueName: \"kubernetes.io/projected/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-kube-api-access-mkrnb\") pod \"oauth-openshift-558db77b4-hqfjq\" (UID: \"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52\") " pod="openshift-authentication/oauth-openshift-558db77b4-hqfjq" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.663710 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-nkmcr" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.671439 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z8gz4" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.672904 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 13:46:42 crc kubenswrapper[4743]: E0122 13:46:42.673091 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 13:46:43.173038743 +0000 UTC m=+39.728081906 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.673435 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:42 crc kubenswrapper[4743]: E0122 13:46:42.674149 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 13:46:43.174124523 +0000 UTC m=+39.729167866 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-94xwn" (UID: "7b22abd3-ecba-46ba-a310-99000f911356") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.678702 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7b22abd3-ecba-46ba-a310-99000f911356-bound-sa-token\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.680884 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-tff5x" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.701245 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fww5f"] Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.705167 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgqlg\" (UniqueName: \"kubernetes.io/projected/10cbe2ea-3b69-40a7-88a4-86e1f57dbab0-kube-api-access-mgqlg\") pod \"route-controller-manager-6576b87f9c-6j5s8\" (UID: \"10cbe2ea-3b69-40a7-88a4-86e1f57dbab0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6j5s8" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.726250 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmnql\" (UniqueName: \"kubernetes.io/projected/7b22abd3-ecba-46ba-a310-99000f911356-kube-api-access-cmnql\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.726725 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-z5dt9" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.740453 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-564jp\" (UniqueName: \"kubernetes.io/projected/771e4a20-0bdf-4115-aadf-f46c3adfa53d-kube-api-access-564jp\") pod \"apiserver-7bbb656c7d-tdlw6\" (UID: \"771e4a20-0bdf-4115-aadf-f46c3adfa53d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tdlw6" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.758861 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9c6z\" (UniqueName: \"kubernetes.io/projected/c8be2cd1-1f91-434d-987e-96c980d05f50-kube-api-access-b9c6z\") pod \"authentication-operator-69f744f599-bghq9\" (UID: \"c8be2cd1-1f91-434d-987e-96c980d05f50\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bghq9" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.763767 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.776760 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 13:46:42 crc kubenswrapper[4743]: E0122 13:46:42.777328 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 13:46:43.277228799 +0000 UTC m=+39.832271962 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.777434 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:42 crc kubenswrapper[4743]: E0122 13:46:42.778350 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 13:46:43.27834195 +0000 UTC m=+39.833385113 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-94xwn" (UID: "7b22abd3-ecba-46ba-a310-99000f911356") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.785452 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.787269 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-x7q7v" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.823982 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c65ml" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.825575 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.837565 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-lpxk6"] Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.847874 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.869054 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tdlw6" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.880524 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkpsr\" (UniqueName: \"kubernetes.io/projected/382bfe09-6fc6-4cf4-96ba-4861716caf3d-kube-api-access-tkpsr\") pod \"catalog-operator-68c6474976-c9kq2\" (UID: \"382bfe09-6fc6-4cf4-96ba-4861716caf3d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c9kq2" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.881702 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 13:46:42 crc kubenswrapper[4743]: E0122 13:46:42.882165 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 13:46:43.382140865 +0000 UTC m=+39.937184018 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.882873 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:42 crc kubenswrapper[4743]: E0122 13:46:42.883600 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 13:46:43.383578965 +0000 UTC m=+39.938622128 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-94xwn" (UID: "7b22abd3-ecba-46ba-a310-99000f911356") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.901592 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8n89x\" (UniqueName: \"kubernetes.io/projected/64bceb92-68bf-42c6-98e6-94c1eea9e122-kube-api-access-8n89x\") pod \"multus-admission-controller-857f4d67dd-9dz5n\" (UID: \"64bceb92-68bf-42c6-98e6-94c1eea9e122\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-9dz5n" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.917519 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndtsb\" (UniqueName: \"kubernetes.io/projected/b4dc1117-9346-474f-aa1f-43f390e34d2c-kube-api-access-ndtsb\") pod \"ingress-canary-h26n4\" (UID: \"b4dc1117-9346-474f-aa1f-43f390e34d2c\") " pod="openshift-ingress-canary/ingress-canary-h26n4" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.939720 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-hqfjq" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.954153 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-bghq9" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.957117 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dscx7\" (UniqueName: \"kubernetes.io/projected/5ae37707-9ec5-4019-b199-8d413fefc824-kube-api-access-dscx7\") pod \"package-server-manager-789f6589d5-k57vs\" (UID: \"5ae37707-9ec5-4019-b199-8d413fefc824\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-k57vs" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.961012 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1a61563b-eef9-4282-b1a1-db7e128ef50b-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-zcnlm\" (UID: \"1a61563b-eef9-4282-b1a1-db7e128ef50b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zcnlm" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.980642 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxk5g\" (UniqueName: \"kubernetes.io/projected/d7b7fa5f-d879-476a-830b-4775e00999a8-kube-api-access-gxk5g\") pod \"machine-config-controller-84d6567774-bxlcj\" (UID: \"d7b7fa5f-d879-476a-830b-4775e00999a8\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bxlcj" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.985551 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 13:46:42 crc kubenswrapper[4743]: E0122 13:46:42.986194 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 13:46:43.486167047 +0000 UTC m=+40.041210210 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.993649 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6j5s8" Jan 22 13:46:42 crc kubenswrapper[4743]: I0122 13:46:42.994286 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zcnlm" Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.002029 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgdkk\" (UniqueName: \"kubernetes.io/projected/ae32dda8-ea09-47e9-ba96-702d8f0747ef-kube-api-access-hgdkk\") pod \"olm-operator-6b444d44fb-pgxdf\" (UID: \"ae32dda8-ea09-47e9-ba96-702d8f0747ef\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pgxdf" Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.017665 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.018023 4743 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.020181 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jr4tf\" (UniqueName: \"kubernetes.io/projected/0d3db073-fbeb-448b-9df8-20eeff19ef5b-kube-api-access-jr4tf\") pod \"machine-config-server-hhf5x\" (UID: \"0d3db073-fbeb-448b-9df8-20eeff19ef5b\") " pod="openshift-machine-config-operator/machine-config-server-hhf5x" Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.034525 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-tff5x"] Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.035615 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z8gz4"] Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.043330 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-9dz5n" Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.054481 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dbpx\" (UniqueName: \"kubernetes.io/projected/0addfb32-dd21-4b73-8c78-75d5ac30c014-kube-api-access-2dbpx\") pod \"kube-storage-version-migrator-operator-b67b599dd-r6d62\" (UID: \"0addfb32-dd21-4b73-8c78-75d5ac30c014\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-r6d62" Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.057918 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-r6d62" Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.064447 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgzt8\" (UniqueName: \"kubernetes.io/projected/d67e22a2-dc2b-4582-bfe0-7afff25995fb-kube-api-access-wgzt8\") pod \"collect-profiles-29484825-rkvnc\" (UID: \"d67e22a2-dc2b-4582-bfe0-7afff25995fb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484825-rkvnc" Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.068328 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c9kq2" Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.068842 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.081403 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d7bdfa8a-4ff0-40c2-87d4-78670816efaa-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kvlk2\" (UID: \"d7bdfa8a-4ff0-40c2-87d4-78670816efaa\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kvlk2" Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.083178 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bxlcj" Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.086338 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-nkmcr"] Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.088346 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-z5dt9"] Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.088923 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:43 crc kubenswrapper[4743]: E0122 13:46:43.090206 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 13:46:43.590183008 +0000 UTC m=+40.145226351 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-94xwn" (UID: "7b22abd3-ecba-46ba-a310-99000f911356") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.099516 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67jmp\" (UniqueName: \"kubernetes.io/projected/01597591-21de-4b1b-a719-c5417a0b5da0-kube-api-access-67jmp\") pod \"openshift-controller-manager-operator-756b6f6bc6-jltxm\" (UID: \"01597591-21de-4b1b-a719-c5417a0b5da0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jltxm" Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.117210 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pgxdf" Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.118452 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-ksn26"] Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.119772 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c973e8f6-4b20-40b2-ad5d-b2d95cc8354f-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-ljzlp\" (UID: \"c973e8f6-4b20-40b2-ad5d-b2d95cc8354f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ljzlp" Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.123312 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29484825-rkvnc" Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.138335 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-k57vs" Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.138707 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wrs6\" (UniqueName: \"kubernetes.io/projected/819be2f9-96db-4fda-9460-658323fcc772-kube-api-access-6wrs6\") pod \"csi-hostpathplugin-nf29x\" (UID: \"819be2f9-96db-4fda-9460-658323fcc772\") " pod="hostpath-provisioner/csi-hostpathplugin-nf29x" Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.147004 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-x7q7v"] Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.148172 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-hhf5x" Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.163280 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dfkc\" (UniqueName: \"kubernetes.io/projected/a68f253c-e45c-425a-9cdd-3e216ac4b87b-kube-api-access-7dfkc\") pod \"machine-config-operator-74547568cd-4qhbk\" (UID: \"a68f253c-e45c-425a-9cdd-3e216ac4b87b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4qhbk" Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.174009 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-nf29x" Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.180338 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r48p2\" (UniqueName: \"kubernetes.io/projected/9d181b51-5c53-4bee-8ebd-c2414d1e9394-kube-api-access-r48p2\") pod \"migrator-59844c95c7-lxmrq\" (UID: \"9d181b51-5c53-4bee-8ebd-c2414d1e9394\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lxmrq" Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.190057 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 13:46:43 crc kubenswrapper[4743]: E0122 13:46:43.190226 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 13:46:43.69019239 +0000 UTC m=+40.245235613 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.190468 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.191937 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-h26n4" Jan 22 13:46:43 crc kubenswrapper[4743]: E0122 13:46:43.193094 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 13:46:43.693078169 +0000 UTC m=+40.248121332 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-94xwn" (UID: "7b22abd3-ecba-46ba-a310-99000f911356") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.199031 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v997z\" (UniqueName: \"kubernetes.io/projected/2305f385-0c21-405a-881e-e55438dae23f-kube-api-access-v997z\") pod \"service-ca-9c57cc56f-88brd\" (UID: \"2305f385-0c21-405a-881e-e55438dae23f\") " pod="openshift-service-ca/service-ca-9c57cc56f-88brd" Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.213559 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-tdlw6"] Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.216654 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5frn\" (UniqueName: \"kubernetes.io/projected/2f6504ab-14a9-4c81-b8ae-556b648168db-kube-api-access-h5frn\") pod \"router-default-5444994796-s9lg4\" (UID: \"2f6504ab-14a9-4c81-b8ae-556b648168db\") " pod="openshift-ingress/router-default-5444994796-s9lg4" Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.221822 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ljzlp" Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.234174 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-hqfjq"] Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.235398 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wgc5\" (UniqueName: \"kubernetes.io/projected/5e3f78d9-5773-468b-8db7-592277480519-kube-api-access-9wgc5\") pod \"dns-operator-744455d44c-qgc45\" (UID: \"5e3f78d9-5773-468b-8db7-592277480519\") " pod="openshift-dns-operator/dns-operator-744455d44c-qgc45" Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.251302 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-bghq9"] Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.256983 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6d45h\" (UniqueName: \"kubernetes.io/projected/8a293f8d-adc5-4e55-a4a3-f729a768ab0d-kube-api-access-6d45h\") pod \"dns-default-x7rm8\" (UID: \"8a293f8d-adc5-4e55-a4a3-f729a768ab0d\") " pod="openshift-dns/dns-default-x7rm8" Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.275554 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79thb\" (UniqueName: \"kubernetes.io/projected/dfeb2496-20c0-4ec2-b289-bbc8bd8aa531-kube-api-access-79thb\") pod \"service-ca-operator-777779d784-nbqnt\" (UID: \"dfeb2496-20c0-4ec2-b289-bbc8bd8aa531\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nbqnt" Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.278865 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-ln28w" 
event={"ID":"a11f3169-f731-464a-a7d4-9dea61d28398","Type":"ContainerStarted","Data":"b1cdde30398ef03672761f0be92fdf53bf7593a92edd373a4cbea7acd4df7d1a"} Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.278903 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-ln28w" event={"ID":"a11f3169-f731-464a-a7d4-9dea61d28398","Type":"ContainerStarted","Data":"8213d799c7281777c4e0021c20e909025170d0aaf1aebfd52c5242a6b1b6c682"} Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.279809 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-tff5x" event={"ID":"9da7e577-175f-451e-8ddc-fe17a70b2d2c","Type":"ContainerStarted","Data":"9f97ecc498832d2298ef86609eea0cd18ea0418131bb468f891ca2dce8ecd575"} Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.281168 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c65ml" event={"ID":"55ae1d5b-f700-4615-83ff-78263c4539d8","Type":"ContainerStarted","Data":"0eb615c2c4849ba8661f56eb2ce56477081f74eb9bd2b2d9db034a4b30d45bae"} Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.282250 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fww5f" event={"ID":"fd9ae554-93cb-478a-8d60-d701586b31b9","Type":"ContainerStarted","Data":"700e660d9b0f3f8269147784d20c23484b498ae2d52a1552374f43eb9dd656c9"} Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.283161 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lpxk6" event={"ID":"7a78c10f-79c4-417b-ae2b-3618d0f6c6cd","Type":"ContainerStarted","Data":"0fd22dec608b8f7322961370a20be5eaf85d06126c6a94cc315dd7a87884429f"} Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.284911 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-qgc45" Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.285252 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-g2ptk" event={"ID":"b46225f4-dd80-45ae-9ffa-310527d770fc","Type":"ContainerStarted","Data":"b3618fd18a0073a0350b12a8385196013b6d0dac05d5fe17babb01449817f1b2"} Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.285285 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-g2ptk" event={"ID":"b46225f4-dd80-45ae-9ffa-310527d770fc","Type":"ContainerStarted","Data":"0547992c104fa9ff2eb219cc7e32d8bfa102993709e2413d92595a9c01f7e01a"} Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.291324 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 13:46:43 crc kubenswrapper[4743]: E0122 13:46:43.291506 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 13:46:43.791486737 +0000 UTC m=+40.346529900 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.291882 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:43 crc kubenswrapper[4743]: E0122 13:46:43.292194 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 13:46:43.792184346 +0000 UTC m=+40.347227509 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-94xwn" (UID: "7b22abd3-ecba-46ba-a310-99000f911356") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:43 crc kubenswrapper[4743]: W0122 13:46:43.301591 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fd90cbe_98a9_450d_b9e8_83bca7304155.slice/crio-a8875e98b35a5ed755cc283b3683d9c1dfaaf5f9545a32cc76fe28a911598394 WatchSource:0}: Error finding container a8875e98b35a5ed755cc283b3683d9c1dfaaf5f9545a32cc76fe28a911598394: Status 404 returned error can't find the container with id a8875e98b35a5ed755cc283b3683d9c1dfaaf5f9545a32cc76fe28a911598394 Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.301774 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngkjj\" (UniqueName: \"kubernetes.io/projected/6f60519c-a85e-483e-ac46-8cde2dbbd166-kube-api-access-ngkjj\") pod \"marketplace-operator-79b997595-fjjf5\" (UID: \"6f60519c-a85e-483e-ac46-8cde2dbbd166\") " pod="openshift-marketplace/marketplace-operator-79b997595-fjjf5" Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.305473 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-s9lg4" Jan 22 13:46:43 crc kubenswrapper[4743]: W0122 13:46:43.308445 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod037eda14_3c3c_4b24_bb18_dea65e3e4548.slice/crio-60308cb3fa1a115371e63b1a33bd016d486bfabbd2247edbdbd5a88523bbbbe9 WatchSource:0}: Error finding container 60308cb3fa1a115371e63b1a33bd016d486bfabbd2247edbdbd5a88523bbbbe9: Status 404 returned error can't find the container with id 60308cb3fa1a115371e63b1a33bd016d486bfabbd2247edbdbd5a88523bbbbe9 Jan 22 13:46:43 crc kubenswrapper[4743]: W0122 13:46:43.310052 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c60ce0a_ae39_43c2_a305_3c83391df7cc.slice/crio-fdd7be27c16137363b9fb676e377d3f52e822edcc3db4b7492b243960144d44c WatchSource:0}: Error finding container fdd7be27c16137363b9fb676e377d3f52e822edcc3db4b7492b243960144d44c: Status 404 returned error can't find the container with id fdd7be27c16137363b9fb676e377d3f52e822edcc3db4b7492b243960144d44c Jan 22 13:46:43 crc kubenswrapper[4743]: W0122 13:46:43.312255 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8134e970_25c9_4d3e_9cff_48a8b520b0da.slice/crio-f2f26d991e993c175c1b3442bf64369edc7e58d5f0ae162e5233a7754b4db76d WatchSource:0}: Error finding container f2f26d991e993c175c1b3442bf64369edc7e58d5f0ae162e5233a7754b4db76d: Status 404 returned error can't find the container with id f2f26d991e993c175c1b3442bf64369edc7e58d5f0ae162e5233a7754b4db76d Jan 22 13:46:43 crc kubenswrapper[4743]: W0122 13:46:43.316227 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod771e4a20_0bdf_4115_aadf_f46c3adfa53d.slice/crio-5a02c0458447a965095c47234d2e7163a4f0d7dd2d6b65ebe20c243b72ef50b5 WatchSource:0}: Error finding container 5a02c0458447a965095c47234d2e7163a4f0d7dd2d6b65ebe20c243b72ef50b5: Status 404 returned error can't find the container with id 5a02c0458447a965095c47234d2e7163a4f0d7dd2d6b65ebe20c243b72ef50b5 Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.318581 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skrtz\" (UniqueName: \"kubernetes.io/projected/3a5126ab-15b9-4b80-ab92-de1b1af3d4a7-kube-api-access-skrtz\") pod \"control-plane-machine-set-operator-78cbb6b69f-25vhb\" (UID: \"3a5126ab-15b9-4b80-ab92-de1b1af3d4a7\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-25vhb" Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.337254 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jltxm" Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.343016 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kz9mr\" (UniqueName: \"kubernetes.io/projected/a4af8e42-cbb1-4fe9-b687-bb6456785f6e-kube-api-access-kz9mr\") pod \"etcd-operator-b45778765-9lmv4\" (UID: \"a4af8e42-cbb1-4fe9-b687-bb6456785f6e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9lmv4" Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.348387 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lxmrq" Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.359254 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/645ac481-c9c8-4d4a-a7e4-2bc78dda109d-bound-sa-token\") pod \"ingress-operator-5b745b69d9-5kvkh\" (UID: \"645ac481-c9c8-4d4a-a7e4-2bc78dda109d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5kvkh" Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.365650 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kvlk2" Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.375021 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-25vhb" Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.378074 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxb9p\" (UniqueName: \"kubernetes.io/projected/2887aee0-19a7-439f-8f40-ef40970ab796-kube-api-access-lxb9p\") pod \"cni-sysctl-allowlist-ds-xrxrn\" (UID: \"2887aee0-19a7-439f-8f40-ef40970ab796\") " pod="openshift-multus/cni-sysctl-allowlist-ds-xrxrn" Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.390673 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4qhbk" Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.393857 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 13:46:43 crc kubenswrapper[4743]: E0122 13:46:43.394266 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 13:46:43.894243174 +0000 UTC m=+40.449286357 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.397971 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-88brd" Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.401012 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hccqs\" (UniqueName: \"kubernetes.io/projected/590063fd-38d4-4642-82cb-81e1c558ba31-kube-api-access-hccqs\") pod \"packageserver-d55dfcdfc-knnk2\" (UID: \"590063fd-38d4-4642-82cb-81e1c558ba31\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-knnk2" Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.405977 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-fjjf5" Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.413784 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-nbqnt" Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.427113 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqdgq\" (UniqueName: \"kubernetes.io/projected/645ac481-c9c8-4d4a-a7e4-2bc78dda109d-kube-api-access-hqdgq\") pod \"ingress-operator-5b745b69d9-5kvkh\" (UID: \"645ac481-c9c8-4d4a-a7e4-2bc78dda109d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5kvkh" Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.430851 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-knnk2" Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.454874 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-x7rm8" Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.484268 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-xrxrn" Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.495501 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:43 crc kubenswrapper[4743]: E0122 13:46:43.495841 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 13:46:43.995828219 +0000 UTC m=+40.550871382 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-94xwn" (UID: "7b22abd3-ecba-46ba-a310-99000f911356") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.528484 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zcnlm"] Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.564254 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6j5s8"] Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.596528 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 13:46:43 crc kubenswrapper[4743]: E0122 13:46:43.596695 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 13:46:44.096666623 +0000 UTC m=+40.651709786 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.597066 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:43 crc kubenswrapper[4743]: E0122 13:46:43.597384 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 13:46:44.097375063 +0000 UTC m=+40.652418226 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-94xwn" (UID: "7b22abd3-ecba-46ba-a310-99000f911356") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.597715 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-9lmv4" Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.599190 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-9dz5n"] Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.622623 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-h26n4"] Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.630484 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5kvkh" Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.698019 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 13:46:43 crc kubenswrapper[4743]: E0122 13:46:43.698889 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 13:46:44.198831794 +0000 UTC m=+40.753874957 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.699030 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:43 crc kubenswrapper[4743]: E0122 13:46:43.699623 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 13:46:44.199614005 +0000 UTC m=+40.754657168 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-94xwn" (UID: "7b22abd3-ecba-46ba-a310-99000f911356") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:43 crc kubenswrapper[4743]: W0122 13:46:43.717246 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a61563b_eef9_4282_b1a1_db7e128ef50b.slice/crio-3e2cbcb26b1e4926b1ca57a0f607c8d11341e428e4facc22c960b96122c5c507 WatchSource:0}: Error finding container 3e2cbcb26b1e4926b1ca57a0f607c8d11341e428e4facc22c960b96122c5c507: Status 404 returned error can't find the container with id 3e2cbcb26b1e4926b1ca57a0f607c8d11341e428e4facc22c960b96122c5c507 Jan 22 13:46:43 crc kubenswrapper[4743]: W0122 13:46:43.770082 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64bceb92_68bf_42c6_98e6_94c1eea9e122.slice/crio-156cb64111a1af7cd69cfd029df5bbf06521656c4abbfbd5f409906a867eca7a WatchSource:0}: Error finding container 156cb64111a1af7cd69cfd029df5bbf06521656c4abbfbd5f409906a867eca7a: Status 404 returned error can't find the container with id 156cb64111a1af7cd69cfd029df5bbf06521656c4abbfbd5f409906a867eca7a Jan 22 13:46:43 crc kubenswrapper[4743]: W0122 13:46:43.770579 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10cbe2ea_3b69_40a7_88a4_86e1f57dbab0.slice/crio-d27f4c357a548776262ff4d3d0cf4c3a00e29cae40acdc5d4ead360ac77d1702 WatchSource:0}: Error finding container d27f4c357a548776262ff4d3d0cf4c3a00e29cae40acdc5d4ead360ac77d1702: Status 404 returned error can't find the container with id d27f4c357a548776262ff4d3d0cf4c3a00e29cae40acdc5d4ead360ac77d1702 Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.804062 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.804286 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ae9ad2e0-8b28-4352-8ced-7133f8b1c88d-metrics-certs\") pod \"network-metrics-daemon-sljgz\" (UID: \"ae9ad2e0-8b28-4352-8ced-7133f8b1c88d\") " pod="openshift-multus/network-metrics-daemon-sljgz" Jan 22 13:46:43 crc kubenswrapper[4743]: E0122 13:46:43.805249 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 13:46:44.30520813 +0000 UTC m=+40.860251293 (durationBeforeRetry 500ms). 
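The manager.go warnings above ("Failed to process watch event ... can't find the container") are typically a benign start-up race: cAdvisor notices a new cgroup before the corresponding container is queryable, so the lookup returns 404. The cgroup path itself already identifies the pod and container; the sketch below parses the assumed layout (systemd cgroup driver, CRI-O "crio-" prefix) using the path from the first warning, which resolves back to the kube-apiserver-operator pod UID seen earlier.

```go
package main

import (
	"fmt"
	"regexp"
	"strings"
)

// Matches .../kubepods-burstable-pod<uid-with-underscores>.slice/crio-<containerID>
// (assumed layout for the systemd cgroup driver with CRI-O).
var cgroupRe = regexp.MustCompile(`kubepods-burstable-pod([0-9a-f_]+)\.slice/crio-([0-9a-f]+)$`)

func parse(path string) (podUID, containerID string, ok bool) {
	m := cgroupRe.FindStringSubmatch(path)
	if m == nil {
		return "", "", false
	}
	// The pod UID is embedded with dashes replaced by underscores.
	return strings.ReplaceAll(m[1], "_", "-"), m[2], true
}

func main() {
	p := "/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a61563b_eef9_4282_b1a1_db7e128ef50b.slice/crio-3e2cbcb26b1e4926b1ca57a0f607c8d11341e428e4facc22c960b96122c5c507"
	fmt.Println(parse(p))
}
```

Meanwhile the unmount for the terminated pod 8f668bae-612b-4b75-9490-919e737c6a3b is deferred yet again with: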
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.816960 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ae9ad2e0-8b28-4352-8ced-7133f8b1c88d-metrics-certs\") pod \"network-metrics-daemon-sljgz\" (UID: \"ae9ad2e0-8b28-4352-8ced-7133f8b1c88d\") " pod="openshift-multus/network-metrics-daemon-sljgz" Jan 22 13:46:43 crc kubenswrapper[4743]: W0122 13:46:43.863337 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d3db073_fbeb_448b_9df8_20eeff19ef5b.slice/crio-5c0c13b8370d947575281e38d452c6f2ddf6bd2d83c8f16111bff0ab5eea3e6a WatchSource:0}: Error finding container 5c0c13b8370d947575281e38d452c6f2ddf6bd2d83c8f16111bff0ab5eea3e6a: Status 404 returned error can't find the container with id 5c0c13b8370d947575281e38d452c6f2ddf6bd2d83c8f16111bff0ab5eea3e6a Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.907710 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:43 crc kubenswrapper[4743]: E0122 13:46:43.908100 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 13:46:44.408088601 +0000 UTC m=+40.963131764 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-94xwn" (UID: "7b22abd3-ecba-46ba-a310-99000f911356") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.944442 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pgxdf"] Jan 22 13:46:43 crc kubenswrapper[4743]: I0122 13:46:43.995912 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-qgc45"] Jan 22 13:46:44 crc kubenswrapper[4743]: I0122 13:46:44.009926 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 13:46:44 crc kubenswrapper[4743]: E0122 13:46:44.011268 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 13:46:44.511237768 +0000 UTC m=+41.066280921 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:44 crc kubenswrapper[4743]: I0122 13:46:44.041822 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29484825-rkvnc"] Jan 22 13:46:44 crc kubenswrapper[4743]: I0122 13:46:44.081099 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-k57vs"] Jan 22 13:46:44 crc kubenswrapper[4743]: I0122 13:46:44.097462 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sljgz" Jan 22 13:46:44 crc kubenswrapper[4743]: I0122 13:46:44.112305 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:44 crc kubenswrapper[4743]: E0122 13:46:44.112977 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 13:46:44.612935976 +0000 UTC m=+41.167979139 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-94xwn" (UID: "7b22abd3-ecba-46ba-a310-99000f911356") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:44 crc kubenswrapper[4743]: I0122 13:46:44.190670 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c9kq2"] Jan 22 13:46:44 crc kubenswrapper[4743]: I0122 13:46:44.213385 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 13:46:44 crc kubenswrapper[4743]: E0122 13:46:44.213594 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 13:46:44.713540544 +0000 UTC m=+41.268583707 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:44 crc kubenswrapper[4743]: I0122 13:46:44.214052 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:44 crc kubenswrapper[4743]: E0122 13:46:44.214413 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 13:46:44.714399188 +0000 UTC m=+41.269442351 (durationBeforeRetry 500ms). 
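On the node side, the kubelet discovers CSI plugins through registration sockets placed under /var/lib/kubelet/plugins_registry (the default path; a plugin's node-driver-registrar typically creates a <driver-name>-reg.sock there). A minimal sketch for checking that directory on the node, assuming default kubelet paths:

```go
package main

import (
	"fmt"
	"log"
	"os"
)

func main() {
	// Default kubelet plugin-registration directory; adjust if the kubelet
	// is configured with a different root directory.
	entries, err := os.ReadDir("/var/lib/kubelet/plugins_registry")
	if err != nil {
		log.Fatal(err)
	}
	for _, e := range entries {
		// Expect something like kubevirt.io.hostpath-provisioner-reg.sock
		// once the hostpath plugin pod has started and registered.
		fmt.Println(e.Name())
	}
}
```

Until that socket appears, the device mount for the registry PVC keeps being pushed back with: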
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-94xwn" (UID: "7b22abd3-ecba-46ba-a310-99000f911356") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:44 crc kubenswrapper[4743]: W0122 13:46:44.226100 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd67e22a2_dc2b_4582_bfe0_7afff25995fb.slice/crio-ec4d3ee5aca7166cea707957566de00402ac840b105c920688d0a317d88ce663 WatchSource:0}: Error finding container ec4d3ee5aca7166cea707957566de00402ac840b105c920688d0a317d88ce663: Status 404 returned error can't find the container with id ec4d3ee5aca7166cea707957566de00402ac840b105c920688d0a317d88ce663 Jan 22 13:46:44 crc kubenswrapper[4743]: W0122 13:46:44.229068 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ae37707_9ec5_4019_b199_8d413fefc824.slice/crio-99d004405f58f697fcb85bce27ca34c8bf2e19373d5ad5474fde686b5f585bec WatchSource:0}: Error finding container 99d004405f58f697fcb85bce27ca34c8bf2e19373d5ad5474fde686b5f585bec: Status 404 returned error can't find the container with id 99d004405f58f697fcb85bce27ca34c8bf2e19373d5ad5474fde686b5f585bec Jan 22 13:46:44 crc kubenswrapper[4743]: W0122 13:46:44.286864 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod382bfe09_6fc6_4cf4_96ba_4861716caf3d.slice/crio-7143ab0ba31b65e18c61319a826b7fd4a010dc2cdf85cf17b1755e3263883247 WatchSource:0}: Error finding container 7143ab0ba31b65e18c61319a826b7fd4a010dc2cdf85cf17b1755e3263883247: Status 404 returned error can't find the container with id 7143ab0ba31b65e18c61319a826b7fd4a010dc2cdf85cf17b1755e3263883247 Jan 22 13:46:44 crc kubenswrapper[4743]: W0122 13:46:44.302854 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2887aee0_19a7_439f_8f40_ef40970ab796.slice/crio-d193afb91657622c0902df22ada1c7e81579dbb5b5712b862da4205d8a20eb66 WatchSource:0}: Error finding container d193afb91657622c0902df22ada1c7e81579dbb5b5712b862da4205d8a20eb66: Status 404 returned error can't find the container with id d193afb91657622c0902df22ada1c7e81579dbb5b5712b862da4205d8a20eb66 Jan 22 13:46:44 crc kubenswrapper[4743]: I0122 13:46:44.309440 4743 generic.go:334] "Generic (PLEG): container finished" podID="7a78c10f-79c4-417b-ae2b-3618d0f6c6cd" containerID="f44e9a9089adf8a37250a02d242dc00ccd09f76366d008af9946d581dacc94ba" exitCode=0 Jan 22 13:46:44 crc kubenswrapper[4743]: I0122 13:46:44.309539 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lpxk6" event={"ID":"7a78c10f-79c4-417b-ae2b-3618d0f6c6cd","Type":"ContainerDied","Data":"f44e9a9089adf8a37250a02d242dc00ccd09f76366d008af9946d581dacc94ba"} Jan 22 13:46:44 crc kubenswrapper[4743]: I0122 13:46:44.316581 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 13:46:44 crc kubenswrapper[4743]: E0122 13:46:44.316691 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 13:46:44.816666691 +0000 UTC m=+41.371709854 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:44 crc kubenswrapper[4743]: I0122 13:46:44.316941 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:44 crc kubenswrapper[4743]: E0122 13:46:44.317295 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 13:46:44.817279198 +0000 UTC m=+41.372322361 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-94xwn" (UID: "7b22abd3-ecba-46ba-a310-99000f911356") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:44 crc kubenswrapper[4743]: I0122 13:46:44.318898 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-ksn26" event={"ID":"8134e970-25c9-4d3e-9cff-48a8b520b0da","Type":"ContainerStarted","Data":"f2f26d991e993c175c1b3442bf64369edc7e58d5f0ae162e5233a7754b4db76d"} Jan 22 13:46:44 crc kubenswrapper[4743]: I0122 13:46:44.323448 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-bghq9" event={"ID":"c8be2cd1-1f91-434d-987e-96c980d05f50","Type":"ContainerStarted","Data":"b54dcb04a4f5fceeab014dba6a3d606c23b2021e048d6a2ee4614185e5ca41fa"} Jan 22 13:46:44 crc kubenswrapper[4743]: I0122 13:46:44.325208 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-qgc45" event={"ID":"5e3f78d9-5773-468b-8db7-592277480519","Type":"ContainerStarted","Data":"8310b6b0eb559c97e4fea2b8d7ffa2c2336e56dec4536799819b9d44fb1d1cb3"} Jan 22 13:46:44 crc kubenswrapper[4743]: I0122 13:46:44.326289 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-hqfjq" event={"ID":"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52","Type":"ContainerStarted","Data":"021bb78246b8b350bd61a0db115e9b49d160d027ac95e66b38cfce36eae942c3"} Jan 22 13:46:44 
crc kubenswrapper[4743]: I0122 13:46:44.327654 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c65ml" event={"ID":"55ae1d5b-f700-4615-83ff-78263c4539d8","Type":"ContainerStarted","Data":"3c38bc1a4198ed211e11d5710544459dec500732dcacbf8379522f5786472854"} Jan 22 13:46:44 crc kubenswrapper[4743]: I0122 13:46:44.328729 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z8gz4" event={"ID":"5fd90cbe-98a9-450d-b9e8-83bca7304155","Type":"ContainerStarted","Data":"a8875e98b35a5ed755cc283b3683d9c1dfaaf5f9545a32cc76fe28a911598394"} Jan 22 13:46:44 crc kubenswrapper[4743]: I0122 13:46:44.330531 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zcnlm" event={"ID":"1a61563b-eef9-4282-b1a1-db7e128ef50b","Type":"ContainerStarted","Data":"3e2cbcb26b1e4926b1ca57a0f607c8d11341e428e4facc22c960b96122c5c507"} Jan 22 13:46:44 crc kubenswrapper[4743]: I0122 13:46:44.332407 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6j5s8" event={"ID":"10cbe2ea-3b69-40a7-88a4-86e1f57dbab0","Type":"ContainerStarted","Data":"d27f4c357a548776262ff4d3d0cf4c3a00e29cae40acdc5d4ead360ac77d1702"} Jan 22 13:46:44 crc kubenswrapper[4743]: I0122 13:46:44.335825 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-x7q7v" event={"ID":"7cb3dbbd-43f2-49c7-9729-5acf3f400598","Type":"ContainerStarted","Data":"61691e4a7cc4511fa81cd7e1ba0c2938d898599cdae13f513b9a14069ce8aa66"} Jan 22 13:46:44 crc kubenswrapper[4743]: I0122 13:46:44.345939 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-s9lg4" event={"ID":"2f6504ab-14a9-4c81-b8ae-556b648168db","Type":"ContainerStarted","Data":"95781d7f207140f1fa9e0c2279be85bcd616df87b18b651f22ba543f845d8ee7"} Jan 22 13:46:44 crc kubenswrapper[4743]: I0122 13:46:44.350176 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fww5f" event={"ID":"fd9ae554-93cb-478a-8d60-d701586b31b9","Type":"ContainerStarted","Data":"40fe4610ed0de774116a2475dcdb57a66655a4518a1e67f623bd91e93e749066"} Jan 22 13:46:44 crc kubenswrapper[4743]: I0122 13:46:44.371775 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-h26n4" event={"ID":"b4dc1117-9346-474f-aa1f-43f390e34d2c","Type":"ContainerStarted","Data":"cf87550e132daaa7fca079b9ef31412943162eb198531887a89301affd83ab39"} Jan 22 13:46:44 crc kubenswrapper[4743]: I0122 13:46:44.382941 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pgxdf" event={"ID":"ae32dda8-ea09-47e9-ba96-702d8f0747ef","Type":"ContainerStarted","Data":"7e7062f910f118f19075cd45e49fee894430ba955449c2b064d6aef7866b5847"} Jan 22 13:46:44 crc kubenswrapper[4743]: I0122 13:46:44.406647 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-9dz5n" event={"ID":"64bceb92-68bf-42c6-98e6-94c1eea9e122","Type":"ContainerStarted","Data":"156cb64111a1af7cd69cfd029df5bbf06521656c4abbfbd5f409906a867eca7a"} Jan 22 13:46:44 crc kubenswrapper[4743]: I0122 13:46:44.418348 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 13:46:44 crc kubenswrapper[4743]: E0122 13:46:44.419615 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 13:46:44.919591103 +0000 UTC m=+41.474634266 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:44 crc kubenswrapper[4743]: I0122 13:46:44.422099 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tdlw6" event={"ID":"771e4a20-0bdf-4115-aadf-f46c3adfa53d","Type":"ContainerStarted","Data":"5a02c0458447a965095c47234d2e7163a4f0d7dd2d6b65ebe20c243b72ef50b5"} Jan 22 13:46:44 crc kubenswrapper[4743]: I0122 13:46:44.428978 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-nf29x"] Jan 22 13:46:44 crc kubenswrapper[4743]: I0122 13:46:44.435690 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-z5dt9" event={"ID":"037eda14-3c3c-4b24-bb18-dea65e3e4548","Type":"ContainerStarted","Data":"60308cb3fa1a115371e63b1a33bd016d486bfabbd2247edbdbd5a88523bbbbe9"} Jan 22 13:46:44 crc kubenswrapper[4743]: I0122 13:46:44.436611 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-z5dt9" Jan 22 13:46:44 crc kubenswrapper[4743]: I0122 13:46:44.442707 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-nkmcr" event={"ID":"9c60ce0a-ae39-43c2-a305-3c83391df7cc","Type":"ContainerStarted","Data":"fdd7be27c16137363b9fb676e377d3f52e822edcc3db4b7492b243960144d44c"} Jan 22 13:46:44 crc kubenswrapper[4743]: I0122 13:46:44.442964 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-nkmcr" Jan 22 13:46:44 crc kubenswrapper[4743]: I0122 13:46:44.449105 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-bxlcj"] Jan 22 13:46:44 crc kubenswrapper[4743]: I0122 13:46:44.449445 4743 patch_prober.go:28] interesting pod/console-operator-58897d9998-nkmcr container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.21:8443/readyz\": dial tcp 10.217.0.21:8443: connect: connection refused" start-of-body= Jan 22 13:46:44 crc kubenswrapper[4743]: I0122 13:46:44.449485 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-nkmcr" podUID="9c60ce0a-ae39-43c2-a305-3c83391df7cc" containerName="console-operator" probeResult="failure" output="Get 
\"https://10.217.0.21:8443/readyz\": dial tcp 10.217.0.21:8443: connect: connection refused" Jan 22 13:46:44 crc kubenswrapper[4743]: I0122 13:46:44.449601 4743 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-z5dt9 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Jan 22 13:46:44 crc kubenswrapper[4743]: I0122 13:46:44.449691 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-z5dt9" podUID="037eda14-3c3c-4b24-bb18-dea65e3e4548" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Jan 22 13:46:44 crc kubenswrapper[4743]: I0122 13:46:44.454294 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-r6d62"] Jan 22 13:46:44 crc kubenswrapper[4743]: I0122 13:46:44.460530 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-k57vs" event={"ID":"5ae37707-9ec5-4019-b199-8d413fefc824","Type":"ContainerStarted","Data":"99d004405f58f697fcb85bce27ca34c8bf2e19373d5ad5474fde686b5f585bec"} Jan 22 13:46:44 crc kubenswrapper[4743]: I0122 13:46:44.466604 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29484825-rkvnc" event={"ID":"d67e22a2-dc2b-4582-bfe0-7afff25995fb","Type":"ContainerStarted","Data":"ec4d3ee5aca7166cea707957566de00402ac840b105c920688d0a317d88ce663"} Jan 22 13:46:44 crc kubenswrapper[4743]: I0122 13:46:44.475693 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-hhf5x" event={"ID":"0d3db073-fbeb-448b-9df8-20eeff19ef5b","Type":"ContainerStarted","Data":"5c0c13b8370d947575281e38d452c6f2ddf6bd2d83c8f16111bff0ab5eea3e6a"} Jan 22 13:46:44 crc kubenswrapper[4743]: I0122 13:46:44.518364 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ljzlp"] Jan 22 13:46:44 crc kubenswrapper[4743]: I0122 13:46:44.526764 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:44 crc kubenswrapper[4743]: E0122 13:46:44.532679 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 13:46:45.032650182 +0000 UTC m=+41.587693345 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-94xwn" (UID: "7b22abd3-ecba-46ba-a310-99000f911356") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:44 crc kubenswrapper[4743]: I0122 13:46:44.567494 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-fjjf5"] Jan 22 13:46:44 crc kubenswrapper[4743]: I0122 13:46:44.573448 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-4qhbk"] Jan 22 13:46:44 crc kubenswrapper[4743]: I0122 13:46:44.576244 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kvlk2"] Jan 22 13:46:44 crc kubenswrapper[4743]: I0122 13:46:44.632121 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 13:46:44 crc kubenswrapper[4743]: E0122 13:46:44.634109 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 13:46:45.134091623 +0000 UTC m=+41.689134786 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:44 crc kubenswrapper[4743]: I0122 13:46:44.735570 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:44 crc kubenswrapper[4743]: E0122 13:46:44.736392 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 13:46:45.236374967 +0000 UTC m=+41.791418130 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-94xwn" (UID: "7b22abd3-ecba-46ba-a310-99000f911356") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:44 crc kubenswrapper[4743]: I0122 13:46:44.776428 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-lxmrq"] Jan 22 13:46:44 crc kubenswrapper[4743]: I0122 13:46:44.840147 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 13:46:44 crc kubenswrapper[4743]: E0122 13:46:44.840409 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 13:46:45.340378048 +0000 UTC m=+41.895421211 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:44 crc kubenswrapper[4743]: I0122 13:46:44.842183 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:44 crc kubenswrapper[4743]: E0122 13:46:44.842582 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 13:46:45.342573899 +0000 UTC m=+41.897617062 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-94xwn" (UID: "7b22abd3-ecba-46ba-a310-99000f911356") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:44 crc kubenswrapper[4743]: I0122 13:46:44.943412 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 13:46:44 crc kubenswrapper[4743]: E0122 13:46:44.944065 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 13:46:45.44404922 +0000 UTC m=+41.999092373 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:44 crc kubenswrapper[4743]: I0122 13:46:44.979918 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-25vhb"] Jan 22 13:46:44 crc kubenswrapper[4743]: I0122 13:46:44.984260 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-88brd"] Jan 22 13:46:44 crc kubenswrapper[4743]: I0122 13:46:44.985233 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-nbqnt"] Jan 22 13:46:44 crc kubenswrapper[4743]: I0122 13:46:44.999734 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-ln28w" podStartSLOduration=16.999713466 podStartE2EDuration="16.999713466s" podCreationTimestamp="2026-01-22 13:46:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:46:44.998203735 +0000 UTC m=+41.553246898" watchObservedRunningTime="2026-01-22 13:46:44.999713466 +0000 UTC m=+41.554756629" Jan 22 13:46:45 crc kubenswrapper[4743]: I0122 13:46:45.045219 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:45 crc kubenswrapper[4743]: E0122 13:46:45.045705 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-01-22 13:46:45.545687996 +0000 UTC m=+42.100731159 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-94xwn" (UID: "7b22abd3-ecba-46ba-a310-99000f911356") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:45 crc kubenswrapper[4743]: I0122 13:46:45.048705 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-9lmv4"] Jan 22 13:46:45 crc kubenswrapper[4743]: W0122 13:46:45.072195 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddfeb2496_20c0_4ec2_b289_bbc8bd8aa531.slice/crio-87bd3b5bc3cb7956e106cef61ea9b0125e67933943a1729795a1b80a0dcf1066 WatchSource:0}: Error finding container 87bd3b5bc3cb7956e106cef61ea9b0125e67933943a1729795a1b80a0dcf1066: Status 404 returned error can't find the container with id 87bd3b5bc3cb7956e106cef61ea9b0125e67933943a1729795a1b80a0dcf1066 Jan 22 13:46:45 crc kubenswrapper[4743]: I0122 13:46:45.115121 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-knnk2"] Jan 22 13:46:45 crc kubenswrapper[4743]: I0122 13:46:45.117897 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jltxm"] Jan 22 13:46:45 crc kubenswrapper[4743]: I0122 13:46:45.132147 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-5kvkh"] Jan 22 13:46:45 crc kubenswrapper[4743]: I0122 13:46:45.137978 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-x7rm8"] Jan 22 13:46:45 crc kubenswrapper[4743]: I0122 13:46:45.150783 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 13:46:45 crc kubenswrapper[4743]: E0122 13:46:45.151260 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 13:46:45.651244439 +0000 UTC m=+42.206287602 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:45 crc kubenswrapper[4743]: I0122 13:46:45.229037 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-sljgz"] Jan 22 13:46:45 crc kubenswrapper[4743]: I0122 13:46:45.245820 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fww5f" podStartSLOduration=18.245800632 podStartE2EDuration="18.245800632s" podCreationTimestamp="2026-01-22 13:46:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:46:45.245168134 +0000 UTC m=+41.800211307" watchObservedRunningTime="2026-01-22 13:46:45.245800632 +0000 UTC m=+41.800843795" Jan 22 13:46:45 crc kubenswrapper[4743]: I0122 13:46:45.254496 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:45 crc kubenswrapper[4743]: E0122 13:46:45.254899 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 13:46:45.754886401 +0000 UTC m=+42.309929564 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-94xwn" (UID: "7b22abd3-ecba-46ba-a310-99000f911356") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:45 crc kubenswrapper[4743]: W0122 13:46:45.351461 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae9ad2e0_8b28_4352_8ced_7133f8b1c88d.slice/crio-a0dbcf3dcfecccd4b5058410e060b0368481ed30484ca89a04cc9974b1a97cbf WatchSource:0}: Error finding container a0dbcf3dcfecccd4b5058410e060b0368481ed30484ca89a04cc9974b1a97cbf: Status 404 returned error can't find the container with id a0dbcf3dcfecccd4b5058410e060b0368481ed30484ca89a04cc9974b1a97cbf Jan 22 13:46:45 crc kubenswrapper[4743]: I0122 13:46:45.356669 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 13:46:45 crc kubenswrapper[4743]: E0122 13:46:45.357080 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 13:46:45.857054961 +0000 UTC m=+42.412098124 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:45 crc kubenswrapper[4743]: I0122 13:46:45.457920 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:45 crc kubenswrapper[4743]: E0122 13:46:45.458550 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 13:46:45.958527933 +0000 UTC m=+42.513571086 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-94xwn" (UID: "7b22abd3-ecba-46ba-a310-99000f911356") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:45 crc kubenswrapper[4743]: I0122 13:46:45.553057 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-g2ptk" event={"ID":"b46225f4-dd80-45ae-9ffa-310527d770fc","Type":"ContainerStarted","Data":"0aca8687a37afa3025d71b9a3d87f405ea418dddce1101229cbaabb0eac57d7a"} Jan 22 13:46:45 crc kubenswrapper[4743]: I0122 13:46:45.559268 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 13:46:45 crc kubenswrapper[4743]: E0122 13:46:45.559351 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 13:46:46.059334807 +0000 UTC m=+42.614377970 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:45 crc kubenswrapper[4743]: I0122 13:46:45.559712 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:45 crc kubenswrapper[4743]: I0122 13:46:45.560173 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-nbqnt" event={"ID":"dfeb2496-20c0-4ec2-b289-bbc8bd8aa531","Type":"ContainerStarted","Data":"87bd3b5bc3cb7956e106cef61ea9b0125e67933943a1729795a1b80a0dcf1066"} Jan 22 13:46:45 crc kubenswrapper[4743]: E0122 13:46:45.560490 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 13:46:46.060475708 +0000 UTC m=+42.615518871 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-94xwn" (UID: "7b22abd3-ecba-46ba-a310-99000f911356") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:45 crc kubenswrapper[4743]: I0122 13:46:45.566169 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kvlk2" event={"ID":"d7bdfa8a-4ff0-40c2-87d4-78670816efaa","Type":"ContainerStarted","Data":"61ac2b331b41bf68fcca8a7ffa8d7c3ce2de3700854c4be16dfceb4b83b9b5d9"} Jan 22 13:46:45 crc kubenswrapper[4743]: I0122 13:46:45.575053 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bxlcj" event={"ID":"d7b7fa5f-d879-476a-830b-4775e00999a8","Type":"ContainerStarted","Data":"049a61b7a016d28a6b19d4abe2c155f0b330124ce720ebef965df525642c6ab1"} Jan 22 13:46:45 crc kubenswrapper[4743]: I0122 13:46:45.575256 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-nkmcr" podStartSLOduration=17.575246063 podStartE2EDuration="17.575246063s" podCreationTimestamp="2026-01-22 13:46:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:46:45.573566937 +0000 UTC m=+42.128610100" watchObservedRunningTime="2026-01-22 13:46:45.575246063 +0000 UTC m=+42.130289226" Jan 22 13:46:45 crc kubenswrapper[4743]: I0122 13:46:45.584883 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4qhbk" event={"ID":"a68f253c-e45c-425a-9cdd-3e216ac4b87b","Type":"ContainerStarted","Data":"3857366f2420ef05defaa117b5ce9a898a3661829aff96e20cda8d77e56ac925"} Jan 22 13:46:45 crc kubenswrapper[4743]: I0122 13:46:45.586439 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-xrxrn" event={"ID":"2887aee0-19a7-439f-8f40-ef40970ab796","Type":"ContainerStarted","Data":"d193afb91657622c0902df22ada1c7e81579dbb5b5712b862da4205d8a20eb66"} Jan 22 13:46:45 crc kubenswrapper[4743]: I0122 13:46:45.612933 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-h26n4" event={"ID":"b4dc1117-9346-474f-aa1f-43f390e34d2c","Type":"ContainerStarted","Data":"6122ea92523c93f0d493e11dafcda879bab276980a72012bf00fffa5eac46dff"} Jan 22 13:46:45 crc kubenswrapper[4743]: I0122 13:46:45.615933 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-z5dt9" event={"ID":"037eda14-3c3c-4b24-bb18-dea65e3e4548","Type":"ContainerStarted","Data":"e50e7c409b08056d208437bdd67bec4628d52cfbcf321da69ff0fb59e3a48249"} Jan 22 13:46:45 crc kubenswrapper[4743]: I0122 13:46:45.621979 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-nf29x" event={"ID":"819be2f9-96db-4fda-9460-658323fcc772","Type":"ContainerStarted","Data":"d438dcbdb282d9d1b946638cec4951e8e2f66d3e819b2d5e268025df29d53704"} Jan 22 13:46:45 crc kubenswrapper[4743]: I0122 13:46:45.630014 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ljzlp" event={"ID":"c973e8f6-4b20-40b2-ad5d-b2d95cc8354f","Type":"ContainerStarted","Data":"6b44809963d56efdc04c37d296344bf9fbf186ce3b851a438151bf2bba9972df"} Jan 22 13:46:45 crc kubenswrapper[4743]: I0122 13:46:45.641843 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-bghq9" event={"ID":"c8be2cd1-1f91-434d-987e-96c980d05f50","Type":"ContainerStarted","Data":"a7df5f37dadfe09b885d76c0f3feb119a20467277c8eac25854e5e1ab53c0bc9"} Jan 22 13:46:45 crc kubenswrapper[4743]: I0122 13:46:45.646591 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-ksn26" event={"ID":"8134e970-25c9-4d3e-9cff-48a8b520b0da","Type":"ContainerStarted","Data":"05d2a3dc5a668c599da2c1765a5e4d0092a39946255a023f9f8289ec43ae2717"} Jan 22 13:46:45 crc kubenswrapper[4743]: I0122 13:46:45.647404 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-ksn26" Jan 22 13:46:45 crc kubenswrapper[4743]: I0122 13:46:45.654496 4743 patch_prober.go:28] interesting pod/downloads-7954f5f757-ksn26 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Jan 22 13:46:45 crc kubenswrapper[4743]: I0122 13:46:45.654589 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-ksn26" podUID="8134e970-25c9-4d3e-9cff-48a8b520b0da" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" Jan 22 13:46:45 crc kubenswrapper[4743]: I0122 13:46:45.660575 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 13:46:45 crc kubenswrapper[4743]: E0122 13:46:45.661030 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 13:46:46.161014004 +0000 UTC m=+42.716057167 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:45 crc kubenswrapper[4743]: I0122 13:46:45.666672 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-z5dt9" podStartSLOduration=17.666657439 podStartE2EDuration="17.666657439s" podCreationTimestamp="2026-01-22 13:46:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:46:45.63533087 +0000 UTC m=+42.190374023" watchObservedRunningTime="2026-01-22 13:46:45.666657439 +0000 UTC m=+42.221700602" Jan 22 13:46:45 crc kubenswrapper[4743]: I0122 13:46:45.668244 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-z5dt9" Jan 22 13:46:45 crc kubenswrapper[4743]: I0122 13:46:45.698787 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c65ml" event={"ID":"55ae1d5b-f700-4615-83ff-78263c4539d8","Type":"ContainerStarted","Data":"c2db566e7ce2de693a6a1242ba7e1e8beb6b8dc9a3a89688db906a79bd62a2be"} Jan 22 13:46:45 crc kubenswrapper[4743]: I0122 13:46:45.719186 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-x7q7v" event={"ID":"7cb3dbbd-43f2-49c7-9729-5acf3f400598","Type":"ContainerStarted","Data":"13f1da1755762efed17cff44a4f1025a5811beba63bde520e0c9c3df5ba3074b"} Jan 22 13:46:45 crc kubenswrapper[4743]: I0122 13:46:45.720946 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-9lmv4" event={"ID":"a4af8e42-cbb1-4fe9-b687-bb6456785f6e","Type":"ContainerStarted","Data":"d905c693ec4ad2708aca41cee9d76dd93e0d169e927b7b516528de0837511867"} Jan 22 13:46:45 crc kubenswrapper[4743]: I0122 13:46:45.730125 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jltxm" event={"ID":"01597591-21de-4b1b-a719-c5417a0b5da0","Type":"ContainerStarted","Data":"a04018b29930f222a63adefb5ea1313ed938e4637f158b266ff9e72ccab66698"} Jan 22 13:46:45 crc kubenswrapper[4743]: I0122 13:46:45.742655 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-g2ptk" podStartSLOduration=17.742640992 podStartE2EDuration="17.742640992s" podCreationTimestamp="2026-01-22 13:46:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:46:45.741240194 +0000 UTC m=+42.296283357" watchObservedRunningTime="2026-01-22 13:46:45.742640992 +0000 UTC m=+42.297684155" Jan 22 13:46:45 crc kubenswrapper[4743]: I0122 13:46:45.744585 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-fjjf5" 
event={"ID":"6f60519c-a85e-483e-ac46-8cde2dbbd166","Type":"ContainerStarted","Data":"f4959a17324562f8b08ccb0d6a28c333701263083efea53c73ae3c1fb1034b13"} Jan 22 13:46:45 crc kubenswrapper[4743]: I0122 13:46:45.762103 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:45 crc kubenswrapper[4743]: E0122 13:46:45.762661 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 13:46:46.26264534 +0000 UTC m=+42.817688503 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-94xwn" (UID: "7b22abd3-ecba-46ba-a310-99000f911356") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:45 crc kubenswrapper[4743]: I0122 13:46:45.775902 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-88brd" event={"ID":"2305f385-0c21-405a-881e-e55438dae23f","Type":"ContainerStarted","Data":"e83e8eedd7078664ee9d06ad8ded27f60e1308bb388defa48139065069969f62"} Jan 22 13:46:45 crc kubenswrapper[4743]: I0122 13:46:45.775944 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c9kq2" event={"ID":"382bfe09-6fc6-4cf4-96ba-4861716caf3d","Type":"ContainerStarted","Data":"7143ab0ba31b65e18c61319a826b7fd4a010dc2cdf85cf17b1755e3263883247"} Jan 22 13:46:45 crc kubenswrapper[4743]: I0122 13:46:45.798934 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z8gz4" event={"ID":"5fd90cbe-98a9-450d-b9e8-83bca7304155","Type":"ContainerStarted","Data":"c1c66d98e6af7b91d9763f9eb0f14cd307bba5c233cdf01391e6891ced10e89f"} Jan 22 13:46:45 crc kubenswrapper[4743]: I0122 13:46:45.807221 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-hhf5x" event={"ID":"0d3db073-fbeb-448b-9df8-20eeff19ef5b","Type":"ContainerStarted","Data":"f070821854b87b09c8cd15df757fe307dce20ed3a499df53e3e3234cf1cc826f"} Jan 22 13:46:45 crc kubenswrapper[4743]: I0122 13:46:45.810703 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lxmrq" event={"ID":"9d181b51-5c53-4bee-8ebd-c2414d1e9394","Type":"ContainerStarted","Data":"b86867a73b84e9f9860cb670d4041fcf9f298acabf6a02d5c8f1be6853873894"} Jan 22 13:46:45 crc kubenswrapper[4743]: I0122 13:46:45.814988 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pgxdf" Jan 22 13:46:45 crc kubenswrapper[4743]: I0122 13:46:45.816754 4743 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-pgxdf container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe 
status=failure output="Get \"https://10.217.0.43:8443/healthz\": dial tcp 10.217.0.43:8443: connect: connection refused" start-of-body= Jan 22 13:46:45 crc kubenswrapper[4743]: I0122 13:46:45.816829 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pgxdf" podUID="ae32dda8-ea09-47e9-ba96-702d8f0747ef" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.43:8443/healthz\": dial tcp 10.217.0.43:8443: connect: connection refused" Jan 22 13:46:45 crc kubenswrapper[4743]: I0122 13:46:45.816843 4743 generic.go:334] "Generic (PLEG): container finished" podID="771e4a20-0bdf-4115-aadf-f46c3adfa53d" containerID="303cba45cd402b7cf45a6b7e66db210770bf9cdef73fbb6dc06cd4dc7a33f8ba" exitCode=0 Jan 22 13:46:45 crc kubenswrapper[4743]: I0122 13:46:45.816916 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tdlw6" event={"ID":"771e4a20-0bdf-4115-aadf-f46c3adfa53d","Type":"ContainerDied","Data":"303cba45cd402b7cf45a6b7e66db210770bf9cdef73fbb6dc06cd4dc7a33f8ba"} Jan 22 13:46:45 crc kubenswrapper[4743]: I0122 13:46:45.822240 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-x7rm8" event={"ID":"8a293f8d-adc5-4e55-a4a3-f729a768ab0d","Type":"ContainerStarted","Data":"e495a631fbde68b7dc82d23d4f4040984114e4e97764435b9ad67cea48e9c354"} Jan 22 13:46:45 crc kubenswrapper[4743]: I0122 13:46:45.823495 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-h26n4" podStartSLOduration=5.8234750779999995 podStartE2EDuration="5.823475078s" podCreationTimestamp="2026-01-22 13:46:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:46:45.819556941 +0000 UTC m=+42.374600104" watchObservedRunningTime="2026-01-22 13:46:45.823475078 +0000 UTC m=+42.378518241" Jan 22 13:46:45 crc kubenswrapper[4743]: I0122 13:46:45.823929 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-ksn26" podStartSLOduration=17.82392411 podStartE2EDuration="17.82392411s" podCreationTimestamp="2026-01-22 13:46:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:46:45.797000602 +0000 UTC m=+42.352043785" watchObservedRunningTime="2026-01-22 13:46:45.82392411 +0000 UTC m=+42.378967273" Jan 22 13:46:45 crc kubenswrapper[4743]: I0122 13:46:45.852336 4743 generic.go:334] "Generic (PLEG): container finished" podID="9da7e577-175f-451e-8ddc-fe17a70b2d2c" containerID="14176ea9c6d5775973d70b5f2534f0814bfe24caf12294223b6808d2b6d06025" exitCode=0 Jan 22 13:46:45 crc kubenswrapper[4743]: I0122 13:46:45.852418 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-tff5x" event={"ID":"9da7e577-175f-451e-8ddc-fe17a70b2d2c","Type":"ContainerDied","Data":"14176ea9c6d5775973d70b5f2534f0814bfe24caf12294223b6808d2b6d06025"} Jan 22 13:46:45 crc kubenswrapper[4743]: I0122 13:46:45.866625 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 
13:46:45 crc kubenswrapper[4743]: E0122 13:46:45.867543 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 13:46:46.367528016 +0000 UTC m=+42.922571179 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:45 crc kubenswrapper[4743]: I0122 13:46:45.873184 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-sljgz" event={"ID":"ae9ad2e0-8b28-4352-8ced-7133f8b1c88d","Type":"ContainerStarted","Data":"a0dbcf3dcfecccd4b5058410e060b0368481ed30484ca89a04cc9974b1a97cbf"} Jan 22 13:46:45 crc kubenswrapper[4743]: I0122 13:46:45.874693 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c65ml" podStartSLOduration=18.874673442 podStartE2EDuration="18.874673442s" podCreationTimestamp="2026-01-22 13:46:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:46:45.873438048 +0000 UTC m=+42.428481201" watchObservedRunningTime="2026-01-22 13:46:45.874673442 +0000 UTC m=+42.429716605" Jan 22 13:46:45 crc kubenswrapper[4743]: I0122 13:46:45.883876 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-knnk2" event={"ID":"590063fd-38d4-4642-82cb-81e1c558ba31","Type":"ContainerStarted","Data":"eec07975fa86d17ed34ac3d1a3293b20b5d44939112f2d546ccb4beb3b4d88b1"} Jan 22 13:46:45 crc kubenswrapper[4743]: I0122 13:46:45.886845 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-r6d62" event={"ID":"0addfb32-dd21-4b73-8c78-75d5ac30c014","Type":"ContainerStarted","Data":"70b9b7593f3a5cd6d52dbb69041c7b3844cab6e6cc766ad4c0263e9ee1f3a129"} Jan 22 13:46:45 crc kubenswrapper[4743]: I0122 13:46:45.891692 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5kvkh" event={"ID":"645ac481-c9c8-4d4a-a7e4-2bc78dda109d","Type":"ContainerStarted","Data":"d0e4ddd6288ee80e163d3ccd3db3f1e4f1b11006e767f92dfd74b215fce0b435"} Jan 22 13:46:45 crc kubenswrapper[4743]: I0122 13:46:45.897425 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-s9lg4" event={"ID":"2f6504ab-14a9-4c81-b8ae-556b648168db","Type":"ContainerStarted","Data":"d7cd0293046fe5d4172763f40c5f07f8831b53ada7fc51ec538774a1b953d481"} Jan 22 13:46:45 crc kubenswrapper[4743]: I0122 13:46:45.902414 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-25vhb" event={"ID":"3a5126ab-15b9-4b80-ab92-de1b1af3d4a7","Type":"ContainerStarted","Data":"7edbfb1ede346141f030d1bf21ab1e75a6705e70a541104b45af608849c8d14d"} Jan 22 13:46:45 crc kubenswrapper[4743]: I0122 
13:46:45.916665 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-nkmcr" event={"ID":"9c60ce0a-ae39-43c2-a305-3c83391df7cc","Type":"ContainerStarted","Data":"6557908f4cfa4fb3bcf09382893d51d5515cf2bf4e4cdbdbb5079b34569e07b6"} Jan 22 13:46:45 crc kubenswrapper[4743]: I0122 13:46:45.947608 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-nkmcr" Jan 22 13:46:45 crc kubenswrapper[4743]: I0122 13:46:45.965607 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-hqfjq" event={"ID":"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52","Type":"ContainerStarted","Data":"9bf822386f18fd85c93d838c48df3696887a1c0f2e91269637929f5d2fec8bfa"} Jan 22 13:46:45 crc kubenswrapper[4743]: I0122 13:46:45.966162 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-hqfjq" Jan 22 13:46:45 crc kubenswrapper[4743]: I0122 13:46:45.968255 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-bghq9" podStartSLOduration=18.968232456 podStartE2EDuration="18.968232456s" podCreationTimestamp="2026-01-22 13:46:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:46:45.915010137 +0000 UTC m=+42.470053310" watchObservedRunningTime="2026-01-22 13:46:45.968232456 +0000 UTC m=+42.523275619" Jan 22 13:46:45 crc kubenswrapper[4743]: I0122 13:46:45.970005 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:45 crc kubenswrapper[4743]: E0122 13:46:45.972139 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 13:46:46.472127843 +0000 UTC m=+43.027171006 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-94xwn" (UID: "7b22abd3-ecba-46ba-a310-99000f911356") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:46 crc kubenswrapper[4743]: I0122 13:46:46.039474 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-r6d62" podStartSLOduration=18.039457229 podStartE2EDuration="18.039457229s" podCreationTimestamp="2026-01-22 13:46:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:46:46.035360387 +0000 UTC m=+42.590403550" watchObservedRunningTime="2026-01-22 13:46:46.039457229 +0000 UTC m=+42.594500392" Jan 22 13:46:46 crc kubenswrapper[4743]: I0122 13:46:46.071545 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 13:46:46 crc kubenswrapper[4743]: E0122 13:46:46.073157 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 13:46:46.573137112 +0000 UTC m=+43.128180275 (durationBeforeRetry 500ms). 
Jan 22 13:46:46 crc kubenswrapper[4743]: I0122 13:46:46.152345 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-hqfjq" podStartSLOduration=19.152322193 podStartE2EDuration="19.152322193s" podCreationTimestamp="2026-01-22 13:46:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:46:46.15184606 +0000 UTC m=+42.706889243" watchObservedRunningTime="2026-01-22 13:46:46.152322193 +0000 UTC m=+42.707365356"
Jan 22 13:46:46 crc kubenswrapper[4743]: I0122 13:46:46.175410 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-hqfjq"
Jan 22 13:46:46 crc kubenswrapper[4743]: I0122 13:46:46.189230 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn"
Jan 22 13:46:46 crc kubenswrapper[4743]: I0122 13:46:46.189214 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z8gz4" podStartSLOduration=18.189197694 podStartE2EDuration="18.189197694s" podCreationTimestamp="2026-01-22 13:46:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:46:46.188186626 +0000 UTC m=+42.743229799" watchObservedRunningTime="2026-01-22 13:46:46.189197694 +0000 UTC m=+42.744240857"
Jan 22 13:46:46 crc kubenswrapper[4743]: E0122 13:46:46.189542 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 13:46:46.689529223 +0000 UTC m=+43.244572386 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-94xwn" (UID: "7b22abd3-ecba-46ba-a310-99000f911356") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 22 13:46:46 crc kubenswrapper[4743]: I0122 13:46:46.234771 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-s9lg4" podStartSLOduration=18.234758023 podStartE2EDuration="18.234758023s" podCreationTimestamp="2026-01-22 13:46:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:46:46.230775134 +0000 UTC m=+42.785818297" watchObservedRunningTime="2026-01-22 13:46:46.234758023 +0000 UTC m=+42.789801186"
Jan 22 13:46:46 crc kubenswrapper[4743]: I0122 13:46:46.290442 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 22 13:46:46 crc kubenswrapper[4743]: E0122 13:46:46.290937 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 13:46:46.790919423 +0000 UTC m=+43.345962586 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 22 13:46:46 crc kubenswrapper[4743]: I0122 13:46:46.308974 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-s9lg4"
Jan 22 13:46:46 crc kubenswrapper[4743]: I0122 13:46:46.312840 4743 patch_prober.go:28] interesting pod/router-default-5444994796-s9lg4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Jan 22 13:46:46 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld
Jan 22 13:46:46 crc kubenswrapper[4743]: [+]process-running ok
Jan 22 13:46:46 crc kubenswrapper[4743]: healthz check failed
Jan 22 13:46:46 crc kubenswrapper[4743]: I0122 13:46:46.312901 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-s9lg4" podUID="2f6504ab-14a9-4c81-b8ae-556b648168db" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Jan 22 13:46:46 crc kubenswrapper[4743]: I0122 13:46:46.318973 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-hhf5x" podStartSLOduration=6.318957771 podStartE2EDuration="6.318957771s" podCreationTimestamp="2026-01-22 13:46:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:46:46.277117644 +0000 UTC m=+42.832160807" watchObservedRunningTime="2026-01-22 13:46:46.318957771 +0000 UTC m=+42.874000934"
Jan 22 13:46:46 crc kubenswrapper[4743]: I0122 13:46:46.356376 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pgxdf" podStartSLOduration=18.356359767 podStartE2EDuration="18.356359767s" podCreationTimestamp="2026-01-22 13:46:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:46:46.324836012 +0000 UTC m=+42.879879185" watchObservedRunningTime="2026-01-22 13:46:46.356359767 +0000 UTC m=+42.911402930"
Jan 22 13:46:46 crc kubenswrapper[4743]: I0122 13:46:46.392392 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn"
Jan 22 13:46:46 crc kubenswrapper[4743]: E0122 13:46:46.392730 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 13:46:46.892718253 +0000 UTC m=+43.447761416 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-94xwn" (UID: "7b22abd3-ecba-46ba-a310-99000f911356") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 22 13:46:46 crc kubenswrapper[4743]: I0122 13:46:46.493772 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 22 13:46:46 crc kubenswrapper[4743]: E0122 13:46:46.493892 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 13:46:46.993869596 +0000 UTC m=+43.548912759 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 22 13:46:46 crc kubenswrapper[4743]: I0122 13:46:46.495098 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn"
Jan 22 13:46:46 crc kubenswrapper[4743]: E0122 13:46:46.495432 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 13:46:46.995419759 +0000 UTC m=+43.550462922 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-94xwn" (UID: "7b22abd3-ecba-46ba-a310-99000f911356") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 22 13:46:46 crc kubenswrapper[4743]: I0122 13:46:46.600172 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 22 13:46:46 crc kubenswrapper[4743]: E0122 13:46:46.600549 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 13:46:47.10053391 +0000 UTC m=+43.655577073 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 22 13:46:46 crc kubenswrapper[4743]: I0122 13:46:46.626105 4743 csr.go:261] certificate signing request csr-ttqdd is approved, waiting to be issued
Jan 22 13:46:46 crc kubenswrapper[4743]: I0122 13:46:46.644245 4743 csr.go:257] certificate signing request csr-ttqdd is issued
Jan 22 13:46:46 crc kubenswrapper[4743]: I0122 13:46:46.713501 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn"
Jan 22 13:46:46 crc kubenswrapper[4743]: E0122 13:46:46.713858 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 13:46:47.213846327 +0000 UTC m=+43.768889490 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-94xwn" (UID: "7b22abd3-ecba-46ba-a310-99000f911356") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 22 13:46:46 crc kubenswrapper[4743]: I0122 13:46:46.817338 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 22 13:46:46 crc kubenswrapper[4743]: E0122 13:46:46.817515 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 13:46:47.317487828 +0000 UTC m=+43.872530991 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 22 13:46:46 crc kubenswrapper[4743]: I0122 13:46:46.817581 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn"
Jan 22 13:46:46 crc kubenswrapper[4743]: E0122 13:46:46.818014 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 13:46:47.318006232 +0000 UTC m=+43.873049395 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-94xwn" (UID: "7b22abd3-ecba-46ba-a310-99000f911356") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 22 13:46:46 crc kubenswrapper[4743]: I0122 13:46:46.918582 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 22 13:46:46 crc kubenswrapper[4743]: E0122 13:46:46.918691 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 13:46:47.418670112 +0000 UTC m=+43.973713275 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 22 13:46:46 crc kubenswrapper[4743]: I0122 13:46:46.919095 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn"
Jan 22 13:46:46 crc kubenswrapper[4743]: E0122 13:46:46.919428 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 13:46:47.419417262 +0000 UTC m=+43.974460415 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-94xwn" (UID: "7b22abd3-ecba-46ba-a310-99000f911356") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.020524 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 22 13:46:47 crc kubenswrapper[4743]: E0122 13:46:47.020977 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 13:46:47.520960776 +0000 UTC m=+44.076003939 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.031370 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-fjjf5" event={"ID":"6f60519c-a85e-483e-ac46-8cde2dbbd166","Type":"ContainerStarted","Data":"448aaf6d03c3a796a48f3e69b960acd11e9a7c7a814dcefbc4d34fe480f3a41c"}
Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.033368 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-fjjf5"
Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.039617 4743 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-fjjf5 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body=
Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.039678 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-fjjf5" podUID="6f60519c-a85e-483e-ac46-8cde2dbbd166" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused"
Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.054411 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4qhbk" event={"ID":"a68f253c-e45c-425a-9cdd-3e216ac4b87b","Type":"ContainerStarted","Data":"07de11dc11ed125da2c32abf6aaba989488f0dd3169b64ea5e0e4ce7dac9c00e"}
Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.059061 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-9lmv4" event={"ID":"a4af8e42-cbb1-4fe9-b687-bb6456785f6e","Type":"ContainerStarted","Data":"bc96ce96120c87f605842c1d0a8c707efd30d846713c2d5cf28975b60a03f8cc"}
event={"ID":"a4af8e42-cbb1-4fe9-b687-bb6456785f6e","Type":"ContainerStarted","Data":"bc96ce96120c87f605842c1d0a8c707efd30d846713c2d5cf28975b60a03f8cc"} Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.070414 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-25vhb" event={"ID":"3a5126ab-15b9-4b80-ab92-de1b1af3d4a7","Type":"ContainerStarted","Data":"1a70f818ed046d055e4ed216bbba999f83af384602a4d3759ecd20b4bb300fcc"} Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.089838 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-fjjf5" podStartSLOduration=19.089812744 podStartE2EDuration="19.089812744s" podCreationTimestamp="2026-01-22 13:46:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:46:47.088618131 +0000 UTC m=+43.643661294" watchObservedRunningTime="2026-01-22 13:46:47.089812744 +0000 UTC m=+43.644855907" Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.096365 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-r6d62" event={"ID":"0addfb32-dd21-4b73-8c78-75d5ac30c014","Type":"ContainerStarted","Data":"429690f3671056900636a258374feca979eede1fa00af612342fd9e67a86a6a1"} Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.106080 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jltxm" event={"ID":"01597591-21de-4b1b-a719-c5417a0b5da0","Type":"ContainerStarted","Data":"c5b22951c788c46170b70774fdd735e4286a0505ac08823f623a8f39e3a59071"} Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.117625 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ljzlp" event={"ID":"c973e8f6-4b20-40b2-ad5d-b2d95cc8354f","Type":"ContainerStarted","Data":"f0bab58b633317d99ef733882702a5d9a8238fc06360cd894c6b1010427a94c7"} Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.126560 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:47 crc kubenswrapper[4743]: E0122 13:46:47.130183 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 13:46:47.63016464 +0000 UTC m=+44.185207853 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-94xwn" (UID: "7b22abd3-ecba-46ba-a310-99000f911356") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.140234 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6j5s8" event={"ID":"10cbe2ea-3b69-40a7-88a4-86e1f57dbab0","Type":"ContainerStarted","Data":"a8dbbfa149d5f07c97e0ba126f85bc6196706f95d71fce59c363348c8b96fd21"} Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.142737 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6j5s8" Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.198038 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bxlcj" event={"ID":"d7b7fa5f-d879-476a-830b-4775e00999a8","Type":"ContainerStarted","Data":"9591ced208184c5973128103abe6d4cab14b3765c12e25a27371994e730426dd"} Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.233961 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-25vhb" podStartSLOduration=19.233943405 podStartE2EDuration="19.233943405s" podCreationTimestamp="2026-01-22 13:46:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:46:47.174418203 +0000 UTC m=+43.729461366" watchObservedRunningTime="2026-01-22 13:46:47.233943405 +0000 UTC m=+43.788986568" Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.239539 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 13:46:47 crc kubenswrapper[4743]: E0122 13:46:47.240041 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 13:46:47.740024482 +0000 UTC m=+44.295067645 (durationBeforeRetry 500ms). 
Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.276472 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-nbqnt" event={"ID":"dfeb2496-20c0-4ec2-b289-bbc8bd8aa531","Type":"ContainerStarted","Data":"953e6f1283f295a53cd91fdefdc0f597b6941bc22295bfba0f818a1c4869adf3"}
Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.292384 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4qhbk" podStartSLOduration=19.292366247 podStartE2EDuration="19.292366247s" podCreationTimestamp="2026-01-22 13:46:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:46:47.237609285 +0000 UTC m=+43.792652448" watchObservedRunningTime="2026-01-22 13:46:47.292366247 +0000 UTC m=+43.847409410"
Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.322011 4743 patch_prober.go:28] interesting pod/router-default-5444994796-s9lg4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Jan 22 13:46:47 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld
Jan 22 13:46:47 crc kubenswrapper[4743]: [+]process-running ok
Jan 22 13:46:47 crc kubenswrapper[4743]: healthz check failed
Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.322067 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-s9lg4" podUID="2f6504ab-14a9-4c81-b8ae-556b648168db" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.325679 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-knnk2" event={"ID":"590063fd-38d4-4642-82cb-81e1c558ba31","Type":"ContainerStarted","Data":"29ba00733f2fca4a2b33b4fb901d5e7cb9d93f6a010ae97823a54b46b1c4aee0"}
Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.327855 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-knnk2"
Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.339961 4743 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-knnk2 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.37:5443/healthz\": dial tcp 10.217.0.37:5443: connect: connection refused" start-of-body=
Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.348670 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-knnk2" podUID="590063fd-38d4-4642-82cb-81e1c558ba31" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.37:5443/healthz\": dial tcp 10.217.0.37:5443: connect: connection refused"
Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.348151 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-9lmv4" podStartSLOduration=19.348134145 podStartE2EDuration="19.348134145s" podCreationTimestamp="2026-01-22 13:46:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:46:47.300545981 +0000 UTC m=+43.855589144" watchObservedRunningTime="2026-01-22 13:46:47.348134145 +0000 UTC m=+43.903177308"
Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.349227 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn"
Jan 22 13:46:47 crc kubenswrapper[4743]: E0122 13:46:47.349477 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 13:46:47.849466242 +0000 UTC m=+44.404509405 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-94xwn" (UID: "7b22abd3-ecba-46ba-a310-99000f911356") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.402514 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5kvkh" event={"ID":"645ac481-c9c8-4d4a-a7e4-2bc78dda109d","Type":"ContainerStarted","Data":"5e3a90be396cee0f3e9cfe651f69adb78daa8b85c6f260d6f5c7a5b7bbba5155"}
Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.404667 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bxlcj" podStartSLOduration=19.404654975 podStartE2EDuration="19.404654975s" podCreationTimestamp="2026-01-22 13:46:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:46:47.350667515 +0000 UTC m=+43.905710678" watchObservedRunningTime="2026-01-22 13:46:47.404654975 +0000 UTC m=+43.959698128"
Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.434899 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kvlk2" event={"ID":"d7bdfa8a-4ff0-40c2-87d4-78670816efaa","Type":"ContainerStarted","Data":"9c3bd8f93a50791cd7823605755fc68141c3a17ffbc61189492cc0e95fdc170f"}
Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.445130 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lpxk6" event={"ID":"7a78c10f-79c4-417b-ae2b-3618d0f6c6cd","Type":"ContainerStarted","Data":"72ac156cbdcfffa78116d7ac234af9736d6bb33970c1a16f5e61daf460b389d3"}
Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.445830 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lpxk6"
Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.450483 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 22 13:46:47 crc kubenswrapper[4743]: E0122 13:46:47.453547 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 13:46:47.953512694 +0000 UTC m=+44.508555857 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.470210 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jltxm" podStartSLOduration=19.470168431 podStartE2EDuration="19.470168431s" podCreationTimestamp="2026-01-22 13:46:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:46:47.46941826 +0000 UTC m=+44.024461423" watchObservedRunningTime="2026-01-22 13:46:47.470168431 +0000 UTC m=+44.025211604"
Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.470611 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6j5s8" podStartSLOduration=19.470604583 podStartE2EDuration="19.470604583s" podCreationTimestamp="2026-01-22 13:46:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:46:47.405640192 +0000 UTC m=+43.960683355" watchObservedRunningTime="2026-01-22 13:46:47.470604583 +0000 UTC m=+44.025647746"
Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.502623 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29484825-rkvnc" event={"ID":"d67e22a2-dc2b-4582-bfe0-7afff25995fb","Type":"ContainerStarted","Data":"a90b7e20a7ddf78fed89d0684b6fbdb934149fca9e3192dcb7886a107c6eeed1"}
Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.527881 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-nbqnt" podStartSLOduration=19.527845452 podStartE2EDuration="19.527845452s" podCreationTimestamp="2026-01-22 13:46:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:46:47.519928945 +0000 UTC m=+44.074972108" watchObservedRunningTime="2026-01-22 13:46:47.527845452 +0000 UTC m=+44.082888615"
Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.552718 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn"
Jan 22 13:46:47 crc kubenswrapper[4743]: E0122 13:46:47.553954 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 13:46:48.053939467 +0000 UTC m=+44.608982630 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-94xwn" (UID: "7b22abd3-ecba-46ba-a310-99000f911356") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.556048 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ljzlp" podStartSLOduration=19.556022134 podStartE2EDuration="19.556022134s" podCreationTimestamp="2026-01-22 13:46:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:46:47.552448786 +0000 UTC m=+44.107491949" watchObservedRunningTime="2026-01-22 13:46:47.556022134 +0000 UTC m=+44.111065297"
Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.558051 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-x7q7v" event={"ID":"7cb3dbbd-43f2-49c7-9729-5acf3f400598","Type":"ContainerStarted","Data":"51635b8a8de846d8d2978e57b6d979e9833a14cc2be455a969214997cd838eec"}
Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.564300 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-xrxrn"
Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.564401 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-xrxrn" event={"ID":"2887aee0-19a7-439f-8f40-ef40970ab796","Type":"ContainerStarted","Data":"17902818d3cf20cb5b497c38f31f012e99c1e26d96835ba9f573fcbde313d363"}
Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.604045 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lpxk6" podStartSLOduration=19.604028 podStartE2EDuration="19.604028s" podCreationTimestamp="2026-01-22 13:46:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:46:47.602716884 +0000 UTC m=+44.157760047" watchObservedRunningTime="2026-01-22 13:46:47.604028 +0000 UTC m=+44.159071163"
Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.644895 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pgxdf" event={"ID":"ae32dda8-ea09-47e9-ba96-702d8f0747ef","Type":"ContainerStarted","Data":"4c5fb542bb6e1444b785ec56b79936301cec564981f648d04216147d08795a2e"}
event={"ID":"ae32dda8-ea09-47e9-ba96-702d8f0747ef","Type":"ContainerStarted","Data":"4c5fb542bb6e1444b785ec56b79936301cec564981f648d04216147d08795a2e"} Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.649320 4743 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-01-22 13:41:46 +0000 UTC, rotation deadline is 2026-11-03 13:19:22.29166566 +0000 UTC Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.649378 4743 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6839h32m34.642295077s for next certificate rotation Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.649387 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5kvkh" podStartSLOduration=19.649366933 podStartE2EDuration="19.649366933s" podCreationTimestamp="2026-01-22 13:46:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:46:47.649280001 +0000 UTC m=+44.204323164" watchObservedRunningTime="2026-01-22 13:46:47.649366933 +0000 UTC m=+44.204410096" Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.653321 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zcnlm" event={"ID":"1a61563b-eef9-4282-b1a1-db7e128ef50b","Type":"ContainerStarted","Data":"24481aff43caeb28a156c99bd2c3b928d220a7da38257c2a672bbdc880c9a389"} Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.654706 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 13:46:47 crc kubenswrapper[4743]: E0122 13:46:47.655865 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 13:46:48.155847001 +0000 UTC m=+44.710890154 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.662956 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6j5s8" Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.666037 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-k57vs" event={"ID":"5ae37707-9ec5-4019-b199-8d413fefc824","Type":"ContainerStarted","Data":"9feaad54e5a2e5840bbf333396e4594714272f78d1853840f54df1cf646b7393"} Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.666277 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-k57vs" Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.666368 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-xrxrn" Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.671198 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-88brd" event={"ID":"2305f385-0c21-405a-881e-e55438dae23f","Type":"ContainerStarted","Data":"9bcdeba03ff73483ccf42f3a73d090630d85d7af4b9d75e83fa1ad20c9e1ea8c"} Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.673032 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lxmrq" event={"ID":"9d181b51-5c53-4bee-8ebd-c2414d1e9394","Type":"ContainerStarted","Data":"46d24eac8785e0d7791d270b7589901d312e0b8d306b004890d81ac072f39bab"} Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.686653 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pgxdf" Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.692042 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kvlk2" podStartSLOduration=19.692023243 podStartE2EDuration="19.692023243s" podCreationTimestamp="2026-01-22 13:46:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:46:47.689832743 +0000 UTC m=+44.244875906" watchObservedRunningTime="2026-01-22 13:46:47.692023243 +0000 UTC m=+44.247066406" Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.700952 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c9kq2" event={"ID":"382bfe09-6fc6-4cf4-96ba-4861716caf3d","Type":"ContainerStarted","Data":"1f2c5f99e3f1c08f4bee783f34889343b29366f15c7474c5e4042c72d0c5fa53"} Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.702325 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c9kq2" Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.723699 4743 
Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.737139 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-knnk2" podStartSLOduration=19.737122319 podStartE2EDuration="19.737122319s" podCreationTimestamp="2026-01-22 13:46:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:46:47.734750334 +0000 UTC m=+44.289793497" watchObservedRunningTime="2026-01-22 13:46:47.737122319 +0000 UTC m=+44.292165482"
Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.756000 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn"
Jan 22 13:46:47 crc kubenswrapper[4743]: E0122 13:46:47.758579 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 13:46:48.258567437 +0000 UTC m=+44.813610600 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-94xwn" (UID: "7b22abd3-ecba-46ba-a310-99000f911356") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.784970 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c9kq2"
Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.785024 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-9dz5n" event={"ID":"64bceb92-68bf-42c6-98e6-94c1eea9e122","Type":"ContainerStarted","Data":"15f8b9fbf11323e9c1cc154449c7a5e493d82ca1c17ef624c868b9e4065a0d59"}
Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.786506 4743 patch_prober.go:28] interesting pod/downloads-7954f5f757-ksn26 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body=
Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.786548 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-ksn26" podUID="8134e970-25c9-4d3e-9cff-48a8b520b0da" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused"
Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.820640 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lxmrq" podStartSLOduration=19.820622748 podStartE2EDuration="19.820622748s" podCreationTimestamp="2026-01-22 13:46:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:46:47.807624002 +0000 UTC m=+44.362667165" watchObservedRunningTime="2026-01-22 13:46:47.820622748 +0000 UTC m=+44.375665911"
Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.820851 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29484825-rkvnc" podStartSLOduration=20.820845174 podStartE2EDuration="20.820845174s" podCreationTimestamp="2026-01-22 13:46:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:46:47.773358412 +0000 UTC m=+44.328401575" watchObservedRunningTime="2026-01-22 13:46:47.820845174 +0000 UTC m=+44.375888337"
Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.865332 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 22 13:46:47 crc kubenswrapper[4743]: E0122 13:46:47.865514 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 13:46:48.365487058 +0000 UTC m=+44.920530221 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.866003 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn"
Jan 22 13:46:47 crc kubenswrapper[4743]: E0122 13:46:47.870119 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 13:46:48.370099175 +0000 UTC m=+44.925142498 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-94xwn" (UID: "7b22abd3-ecba-46ba-a310-99000f911356") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.924519 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-x7q7v" podStartSLOduration=20.924481855 podStartE2EDuration="20.924481855s" podCreationTimestamp="2026-01-22 13:46:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:46:47.920965789 +0000 UTC m=+44.476008972" watchObservedRunningTime="2026-01-22 13:46:47.924481855 +0000 UTC m=+44.479525018"
Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.968709 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 22 13:46:47 crc kubenswrapper[4743]: E0122 13:46:47.977261 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 13:46:48.477231931 +0000 UTC m=+45.032275094 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 22 13:46:47 crc kubenswrapper[4743]: I0122 13:46:47.980582 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn"
Jan 22 13:46:47 crc kubenswrapper[4743]: E0122 13:46:47.981271 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 13:46:48.481245611 +0000 UTC m=+45.036288774 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-94xwn" (UID: "7b22abd3-ecba-46ba-a310-99000f911356") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 22 13:46:48 crc kubenswrapper[4743]: I0122 13:46:48.046876 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-88brd" podStartSLOduration=20.04686267 podStartE2EDuration="20.04686267s" podCreationTimestamp="2026-01-22 13:46:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:46:47.992708326 +0000 UTC m=+44.547751489" watchObservedRunningTime="2026-01-22 13:46:48.04686267 +0000 UTC m=+44.601905833"
Jan 22 13:46:48 crc kubenswrapper[4743]: I0122 13:46:48.081520 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 22 13:46:48 crc kubenswrapper[4743]: E0122 13:46:48.081998 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 13:46:48.581982303 +0000 UTC m=+45.137025466 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 22 13:46:48 crc kubenswrapper[4743]: I0122 13:46:48.183020 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn"
Jan 22 13:46:48 crc kubenswrapper[4743]: E0122 13:46:48.183356 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 13:46:48.683343642 +0000 UTC m=+45.238386805 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-94xwn" (UID: "7b22abd3-ecba-46ba-a310-99000f911356") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 22 13:46:48 crc kubenswrapper[4743]: I0122 13:46:48.232067 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zcnlm" podStartSLOduration=20.232033117 podStartE2EDuration="20.232033117s" podCreationTimestamp="2026-01-22 13:46:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:46:48.164464404 +0000 UTC m=+44.719507567" watchObservedRunningTime="2026-01-22 13:46:48.232033117 +0000 UTC m=+44.787076290"
Jan 22 13:46:48 crc kubenswrapper[4743]: I0122 13:46:48.284683 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 22 13:46:48 crc kubenswrapper[4743]: E0122 13:46:48.284898 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 13:46:48.784870915 +0000 UTC m=+45.339914078 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 22 13:46:48 crc kubenswrapper[4743]: I0122 13:46:48.285532 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn"
Jan 22 13:46:48 crc kubenswrapper[4743]: E0122 13:46:48.285949 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 13:46:48.785937834 +0000 UTC m=+45.340980997 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-94xwn" (UID: "7b22abd3-ecba-46ba-a310-99000f911356") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:48 crc kubenswrapper[4743]: I0122 13:46:48.310004 4743 patch_prober.go:28] interesting pod/router-default-5444994796-s9lg4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 22 13:46:48 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld Jan 22 13:46:48 crc kubenswrapper[4743]: [+]process-running ok Jan 22 13:46:48 crc kubenswrapper[4743]: healthz check failed Jan 22 13:46:48 crc kubenswrapper[4743]: I0122 13:46:48.310614 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-s9lg4" podUID="2f6504ab-14a9-4c81-b8ae-556b648168db" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 22 13:46:48 crc kubenswrapper[4743]: I0122 13:46:48.322733 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-qgc45" podStartSLOduration=20.322717863 podStartE2EDuration="20.322717863s" podCreationTimestamp="2026-01-22 13:46:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:46:48.321732776 +0000 UTC m=+44.876775949" watchObservedRunningTime="2026-01-22 13:46:48.322717863 +0000 UTC m=+44.877761026" Jan 22 13:46:48 crc kubenswrapper[4743]: I0122 13:46:48.323391 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-k57vs" podStartSLOduration=20.323385551 podStartE2EDuration="20.323385551s" podCreationTimestamp="2026-01-22 13:46:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:46:48.236782227 +0000 UTC m=+44.791825390" watchObservedRunningTime="2026-01-22 13:46:48.323385551 +0000 UTC m=+44.878428714" Jan 22 13:46:48 crc kubenswrapper[4743]: I0122 13:46:48.386583 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 13:46:48 crc kubenswrapper[4743]: E0122 13:46:48.386720 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 13:46:48.886701057 +0000 UTC m=+45.441744220 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:48 crc kubenswrapper[4743]: I0122 13:46:48.386894 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:48 crc kubenswrapper[4743]: E0122 13:46:48.387190 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 13:46:48.88718307 +0000 UTC m=+45.442226233 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-94xwn" (UID: "7b22abd3-ecba-46ba-a310-99000f911356") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:48 crc kubenswrapper[4743]: I0122 13:46:48.464380 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c9kq2" podStartSLOduration=20.464358746 podStartE2EDuration="20.464358746s" podCreationTimestamp="2026-01-22 13:46:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:46:48.376005083 +0000 UTC m=+44.931048246" watchObservedRunningTime="2026-01-22 13:46:48.464358746 +0000 UTC m=+45.019401909" Jan 22 13:46:48 crc kubenswrapper[4743]: I0122 13:46:48.487919 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 13:46:48 crc kubenswrapper[4743]: E0122 13:46:48.488116 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 13:46:48.988092086 +0000 UTC m=+45.543135249 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:48 crc kubenswrapper[4743]: I0122 13:46:48.488161 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:48 crc kubenswrapper[4743]: E0122 13:46:48.488450 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 13:46:48.988438036 +0000 UTC m=+45.543481199 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-94xwn" (UID: "7b22abd3-ecba-46ba-a310-99000f911356") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:48 crc kubenswrapper[4743]: I0122 13:46:48.561603 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-xrxrn" podStartSLOduration=8.561584121 podStartE2EDuration="8.561584121s" podCreationTimestamp="2026-01-22 13:46:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:46:48.485457704 +0000 UTC m=+45.040500867" watchObservedRunningTime="2026-01-22 13:46:48.561584121 +0000 UTC m=+45.116627284" Jan 22 13:46:48 crc kubenswrapper[4743]: I0122 13:46:48.588712 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 13:46:48 crc kubenswrapper[4743]: E0122 13:46:48.588912 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 13:46:49.088882209 +0000 UTC m=+45.643925372 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:48 crc kubenswrapper[4743]: I0122 13:46:48.589049 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:48 crc kubenswrapper[4743]: E0122 13:46:48.589350 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 13:46:49.089342892 +0000 UTC m=+45.644386055 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-94xwn" (UID: "7b22abd3-ecba-46ba-a310-99000f911356") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:48 crc kubenswrapper[4743]: I0122 13:46:48.651099 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-9dz5n" podStartSLOduration=20.651075383 podStartE2EDuration="20.651075383s" podCreationTimestamp="2026-01-22 13:46:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:46:48.575942055 +0000 UTC m=+45.130985218" watchObservedRunningTime="2026-01-22 13:46:48.651075383 +0000 UTC m=+45.206118546" Jan 22 13:46:48 crc kubenswrapper[4743]: I0122 13:46:48.652317 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lpxk6" Jan 22 13:46:48 crc kubenswrapper[4743]: I0122 13:46:48.691330 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 13:46:48 crc kubenswrapper[4743]: E0122 13:46:48.691708 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 13:46:49.191693497 +0000 UTC m=+45.746736660 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:48 crc kubenswrapper[4743]: I0122 13:46:48.750197 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-xrxrn"] Jan 22 13:46:48 crc kubenswrapper[4743]: I0122 13:46:48.794131 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:48 crc kubenswrapper[4743]: E0122 13:46:48.794583 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 13:46:49.294559637 +0000 UTC m=+45.849602800 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-94xwn" (UID: "7b22abd3-ecba-46ba-a310-99000f911356") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:48 crc kubenswrapper[4743]: I0122 13:46:48.800252 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-x7rm8" event={"ID":"8a293f8d-adc5-4e55-a4a3-f729a768ab0d","Type":"ContainerStarted","Data":"882c56eec4f27fb4f2a8a3125cfcc62089ed384e6db6d3ab03961b9b7be9b445"} Jan 22 13:46:48 crc kubenswrapper[4743]: I0122 13:46:48.800294 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-x7rm8" event={"ID":"8a293f8d-adc5-4e55-a4a3-f729a768ab0d","Type":"ContainerStarted","Data":"e3e6863a804a0f4985cc0e531303982dc04dc6efe51a4f694367178778126bf0"} Jan 22 13:46:48 crc kubenswrapper[4743]: I0122 13:46:48.800920 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-x7rm8" Jan 22 13:46:48 crc kubenswrapper[4743]: I0122 13:46:48.811146 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bxlcj" event={"ID":"d7b7fa5f-d879-476a-830b-4775e00999a8","Type":"ContainerStarted","Data":"b4476d4afb567f927c6d7ffe3b1845f767cbaf71923db1282eb750ebf9d37b61"} Jan 22 13:46:48 crc kubenswrapper[4743]: I0122 13:46:48.823074 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lxmrq" event={"ID":"9d181b51-5c53-4bee-8ebd-c2414d1e9394","Type":"ContainerStarted","Data":"d93bd8521de91d0069e815ad0fa2d2015fb315f9917cf64e3f732d9112ea509d"} Jan 22 13:46:48 crc kubenswrapper[4743]: I0122 13:46:48.852411 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-k57vs" event={"ID":"5ae37707-9ec5-4019-b199-8d413fefc824","Type":"ContainerStarted","Data":"30148ed4d983b9698a20f82da9c5bf4414da358d179d05ee356f1249997bccbc"} Jan 22 13:46:48 crc kubenswrapper[4743]: I0122 13:46:48.866546 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-x7rm8" podStartSLOduration=8.86652785 podStartE2EDuration="8.86652785s" podCreationTimestamp="2026-01-22 13:46:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:46:48.842485941 +0000 UTC m=+45.397529104" watchObservedRunningTime="2026-01-22 13:46:48.86652785 +0000 UTC m=+45.421571013" Jan 22 13:46:48 crc kubenswrapper[4743]: I0122 13:46:48.868579 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wv724"] Jan 22 13:46:48 crc kubenswrapper[4743]: I0122 13:46:48.869663 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wv724" Jan 22 13:46:48 crc kubenswrapper[4743]: I0122 13:46:48.874711 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-9dz5n" event={"ID":"64bceb92-68bf-42c6-98e6-94c1eea9e122","Type":"ContainerStarted","Data":"13381582e4f397c47b1218843e951f6716f85a2a3bbda9eab7734ec371a2625b"} Jan 22 13:46:48 crc kubenswrapper[4743]: I0122 13:46:48.877213 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 22 13:46:48 crc kubenswrapper[4743]: I0122 13:46:48.892395 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wv724"] Jan 22 13:46:48 crc kubenswrapper[4743]: I0122 13:46:48.895907 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 13:46:48 crc kubenswrapper[4743]: E0122 13:46:48.897397 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 13:46:49.397376655 +0000 UTC m=+45.952419818 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:48 crc kubenswrapper[4743]: I0122 13:46:48.902703 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tdlw6" event={"ID":"771e4a20-0bdf-4115-aadf-f46c3adfa53d","Type":"ContainerStarted","Data":"f44ce8b15da8a75d4781797082ee910e358879a435b9121d9459dab9f1c7cd24"} Jan 22 13:46:48 crc kubenswrapper[4743]: I0122 13:46:48.942951 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-tff5x" event={"ID":"9da7e577-175f-451e-8ddc-fe17a70b2d2c","Type":"ContainerStarted","Data":"b169a332cfc22794a766422f2d205dccf03e60c9352f43f8b41f3529b47c7349"} Jan 22 13:46:48 crc kubenswrapper[4743]: I0122 13:46:48.942996 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-tff5x" event={"ID":"9da7e577-175f-451e-8ddc-fe17a70b2d2c","Type":"ContainerStarted","Data":"6efb094159d47d0ad5642cc748d867d35c08d6ce2ff178fea897ae0e8f340975"} Jan 22 13:46:48 crc kubenswrapper[4743]: I0122 13:46:48.969523 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-sljgz" event={"ID":"ae9ad2e0-8b28-4352-8ced-7133f8b1c88d","Type":"ContainerStarted","Data":"49ce423e69410fb208e37208b27293145e9a3f2b5065e65b0ac3565a754901bc"} Jan 22 13:46:48 crc kubenswrapper[4743]: I0122 13:46:48.969570 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-sljgz" event={"ID":"ae9ad2e0-8b28-4352-8ced-7133f8b1c88d","Type":"ContainerStarted","Data":"fd74c2795277ecb3838736b81960b37166b6e11cebce3dea3135888774522095"} Jan 22 13:46:48 crc kubenswrapper[4743]: I0122 13:46:48.969580 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tdlw6" podStartSLOduration=20.969569744 podStartE2EDuration="20.969569744s" podCreationTimestamp="2026-01-22 13:46:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:46:48.969194444 +0000 UTC m=+45.524237627" watchObservedRunningTime="2026-01-22 13:46:48.969569744 +0000 UTC m=+45.524612907" Jan 22 13:46:48 crc kubenswrapper[4743]: I0122 13:46:48.992185 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-nf29x" event={"ID":"819be2f9-96db-4fda-9460-658323fcc772","Type":"ContainerStarted","Data":"798c6d889319786087eaf8fc87db11c3204ec8d4cc09d8b20f259d02af45c474"} Jan 22 13:46:48 crc kubenswrapper[4743]: I0122 13:46:48.992226 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-nf29x" event={"ID":"819be2f9-96db-4fda-9460-658323fcc772","Type":"ContainerStarted","Data":"59e534237cee9ec9dcac2bc09be25c344faa67f2f019ce790b53ebd8185f416d"} Jan 22 13:46:48 crc kubenswrapper[4743]: I0122 13:46:48.999312 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/a76ec049-d99a-40be-9fec-f76370769aea-utilities\") pod \"certified-operators-wv724\" (UID: \"a76ec049-d99a-40be-9fec-f76370769aea\") " pod="openshift-marketplace/certified-operators-wv724" Jan 22 13:46:48 crc kubenswrapper[4743]: I0122 13:46:48.999505 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:48 crc kubenswrapper[4743]: I0122 13:46:48.999547 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kknd6\" (UniqueName: \"kubernetes.io/projected/a76ec049-d99a-40be-9fec-f76370769aea-kube-api-access-kknd6\") pod \"certified-operators-wv724\" (UID: \"a76ec049-d99a-40be-9fec-f76370769aea\") " pod="openshift-marketplace/certified-operators-wv724" Jan 22 13:46:48 crc kubenswrapper[4743]: I0122 13:46:48.999607 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a76ec049-d99a-40be-9fec-f76370769aea-catalog-content\") pod \"certified-operators-wv724\" (UID: \"a76ec049-d99a-40be-9fec-f76370769aea\") " pod="openshift-marketplace/certified-operators-wv724" Jan 22 13:46:49 crc kubenswrapper[4743]: E0122 13:46:49.000592 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 13:46:49.500579544 +0000 UTC m=+46.055622717 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-94xwn" (UID: "7b22abd3-ecba-46ba-a310-99000f911356") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.002275 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4qhbk" event={"ID":"a68f253c-e45c-425a-9cdd-3e216ac4b87b","Type":"ContainerStarted","Data":"bf0db5cac8acb06ebbb9f06779b0e4923b0a313d1dcdb05ef626dde8f0cb8977"} Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.018677 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-qgc45" event={"ID":"5e3f78d9-5773-468b-8db7-592277480519","Type":"ContainerStarted","Data":"77758a4372a73115538e6b117d6236f5b194cd668e5e75283be73c03512894ce"} Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.034723 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5kvkh" event={"ID":"645ac481-c9c8-4d4a-a7e4-2bc78dda109d","Type":"ContainerStarted","Data":"67679c253bc03ce3a414b7b3a59f57599e5218efa979152bddc5a2074a2ce698"} Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.035120 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-tff5x" podStartSLOduration=22.035106301 podStartE2EDuration="22.035106301s" podCreationTimestamp="2026-01-22 13:46:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:46:49.027102381 +0000 UTC m=+45.582145554" watchObservedRunningTime="2026-01-22 13:46:49.035106301 +0000 UTC m=+45.590149464" Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.040053 4743 patch_prober.go:28] interesting pod/downloads-7954f5f757-ksn26 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.049136 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-ksn26" podUID="8134e970-25c9-4d3e-9cff-48a8b520b0da" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.056880 4743 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-fjjf5 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.056964 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-fjjf5" podUID="6f60519c-a85e-483e-ac46-8cde2dbbd166" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 
13:46:49.068130 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-sljgz" podStartSLOduration=22.068107886 podStartE2EDuration="22.068107886s" podCreationTimestamp="2026-01-22 13:46:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:46:49.050829142 +0000 UTC m=+45.605872305" watchObservedRunningTime="2026-01-22 13:46:49.068107886 +0000 UTC m=+45.623151049" Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.079731 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bpk8r"] Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.095665 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bpk8r" Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.096310 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bpk8r"] Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.098114 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.101040 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 13:46:49 crc kubenswrapper[4743]: E0122 13:46:49.101250 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 13:46:49.601229494 +0000 UTC m=+46.156272657 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.101450 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a76ec049-d99a-40be-9fec-f76370769aea-catalog-content\") pod \"certified-operators-wv724\" (UID: \"a76ec049-d99a-40be-9fec-f76370769aea\") " pod="openshift-marketplace/certified-operators-wv724" Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.101668 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a76ec049-d99a-40be-9fec-f76370769aea-utilities\") pod \"certified-operators-wv724\" (UID: \"a76ec049-d99a-40be-9fec-f76370769aea\") " pod="openshift-marketplace/certified-operators-wv724" Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.102293 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.102342 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kknd6\" (UniqueName: \"kubernetes.io/projected/a76ec049-d99a-40be-9fec-f76370769aea-kube-api-access-kknd6\") pod \"certified-operators-wv724\" (UID: \"a76ec049-d99a-40be-9fec-f76370769aea\") " pod="openshift-marketplace/certified-operators-wv724" Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.105920 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a76ec049-d99a-40be-9fec-f76370769aea-catalog-content\") pod \"certified-operators-wv724\" (UID: \"a76ec049-d99a-40be-9fec-f76370769aea\") " pod="openshift-marketplace/certified-operators-wv724" Jan 22 13:46:49 crc kubenswrapper[4743]: E0122 13:46:49.106717 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 13:46:49.606703124 +0000 UTC m=+46.161746287 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-94xwn" (UID: "7b22abd3-ecba-46ba-a310-99000f911356") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.110179 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a76ec049-d99a-40be-9fec-f76370769aea-utilities\") pod \"certified-operators-wv724\" (UID: \"a76ec049-d99a-40be-9fec-f76370769aea\") " pod="openshift-marketplace/certified-operators-wv724" Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.184279 4743 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.195189 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kknd6\" (UniqueName: \"kubernetes.io/projected/a76ec049-d99a-40be-9fec-f76370769aea-kube-api-access-kknd6\") pod \"certified-operators-wv724\" (UID: \"a76ec049-d99a-40be-9fec-f76370769aea\") " pod="openshift-marketplace/certified-operators-wv724" Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.204355 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.204387 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wv724" Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.204613 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a660baca-6ead-4a0f-959b-24b3badc4a7c-catalog-content\") pod \"community-operators-bpk8r\" (UID: \"a660baca-6ead-4a0f-959b-24b3badc4a7c\") " pod="openshift-marketplace/community-operators-bpk8r" Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.204665 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a660baca-6ead-4a0f-959b-24b3badc4a7c-utilities\") pod \"community-operators-bpk8r\" (UID: \"a660baca-6ead-4a0f-959b-24b3badc4a7c\") " pod="openshift-marketplace/community-operators-bpk8r" Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.204707 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x92k2\" (UniqueName: \"kubernetes.io/projected/a660baca-6ead-4a0f-959b-24b3badc4a7c-kube-api-access-x92k2\") pod \"community-operators-bpk8r\" (UID: \"a660baca-6ead-4a0f-959b-24b3badc4a7c\") " pod="openshift-marketplace/community-operators-bpk8r" Jan 22 13:46:49 crc kubenswrapper[4743]: E0122 13:46:49.204904 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 13:46:49.704876515 +0000 UTC m=+46.259919678 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.246136 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7mpfs"] Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.247134 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7mpfs" Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.267503 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7mpfs"] Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.308456 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.308568 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lt8d\" (UniqueName: \"kubernetes.io/projected/785dfb3f-6700-4e67-9ab4-df1d1e86efef-kube-api-access-6lt8d\") pod \"certified-operators-7mpfs\" (UID: \"785dfb3f-6700-4e67-9ab4-df1d1e86efef\") " pod="openshift-marketplace/certified-operators-7mpfs" Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.308592 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a660baca-6ead-4a0f-959b-24b3badc4a7c-catalog-content\") pod \"community-operators-bpk8r\" (UID: \"a660baca-6ead-4a0f-959b-24b3badc4a7c\") " pod="openshift-marketplace/community-operators-bpk8r" Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.308634 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/785dfb3f-6700-4e67-9ab4-df1d1e86efef-utilities\") pod \"certified-operators-7mpfs\" (UID: \"785dfb3f-6700-4e67-9ab4-df1d1e86efef\") " pod="openshift-marketplace/certified-operators-7mpfs" Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.308653 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a660baca-6ead-4a0f-959b-24b3badc4a7c-utilities\") pod \"community-operators-bpk8r\" (UID: \"a660baca-6ead-4a0f-959b-24b3badc4a7c\") " pod="openshift-marketplace/community-operators-bpk8r" Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.308676 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/785dfb3f-6700-4e67-9ab4-df1d1e86efef-catalog-content\") pod \"certified-operators-7mpfs\" (UID: \"785dfb3f-6700-4e67-9ab4-df1d1e86efef\") " pod="openshift-marketplace/certified-operators-7mpfs" Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.308701 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x92k2\" (UniqueName: \"kubernetes.io/projected/a660baca-6ead-4a0f-959b-24b3badc4a7c-kube-api-access-x92k2\") pod \"community-operators-bpk8r\" (UID: \"a660baca-6ead-4a0f-959b-24b3badc4a7c\") " pod="openshift-marketplace/community-operators-bpk8r" Jan 22 13:46:49 crc kubenswrapper[4743]: E0122 13:46:49.310122 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 13:46:49.81010699 +0000 UTC m=+46.365150153 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-94xwn" (UID: "7b22abd3-ecba-46ba-a310-99000f911356") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.310661 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a660baca-6ead-4a0f-959b-24b3badc4a7c-catalog-content\") pod \"community-operators-bpk8r\" (UID: \"a660baca-6ead-4a0f-959b-24b3badc4a7c\") " pod="openshift-marketplace/community-operators-bpk8r" Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.310910 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a660baca-6ead-4a0f-959b-24b3badc4a7c-utilities\") pod \"community-operators-bpk8r\" (UID: \"a660baca-6ead-4a0f-959b-24b3badc4a7c\") " pod="openshift-marketplace/community-operators-bpk8r" Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.322382 4743 patch_prober.go:28] interesting pod/router-default-5444994796-s9lg4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 22 13:46:49 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld Jan 22 13:46:49 crc kubenswrapper[4743]: [+]process-running ok Jan 22 13:46:49 crc kubenswrapper[4743]: healthz check failed Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.323027 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-s9lg4" podUID="2f6504ab-14a9-4c81-b8ae-556b648168db" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.371738 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x92k2\" (UniqueName: \"kubernetes.io/projected/a660baca-6ead-4a0f-959b-24b3badc4a7c-kube-api-access-x92k2\") pod \"community-operators-bpk8r\" (UID: \"a660baca-6ead-4a0f-959b-24b3badc4a7c\") " pod="openshift-marketplace/community-operators-bpk8r" Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.409941 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.410232 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lt8d\" (UniqueName: \"kubernetes.io/projected/785dfb3f-6700-4e67-9ab4-df1d1e86efef-kube-api-access-6lt8d\") pod \"certified-operators-7mpfs\" (UID: \"785dfb3f-6700-4e67-9ab4-df1d1e86efef\") " pod="openshift-marketplace/certified-operators-7mpfs" Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.410289 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/785dfb3f-6700-4e67-9ab4-df1d1e86efef-utilities\") pod \"certified-operators-7mpfs\" (UID: 
\"785dfb3f-6700-4e67-9ab4-df1d1e86efef\") " pod="openshift-marketplace/certified-operators-7mpfs" Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.410341 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/785dfb3f-6700-4e67-9ab4-df1d1e86efef-catalog-content\") pod \"certified-operators-7mpfs\" (UID: \"785dfb3f-6700-4e67-9ab4-df1d1e86efef\") " pod="openshift-marketplace/certified-operators-7mpfs" Jan 22 13:46:49 crc kubenswrapper[4743]: E0122 13:46:49.410492 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 13:46:49.910473281 +0000 UTC m=+46.465516444 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.411197 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/785dfb3f-6700-4e67-9ab4-df1d1e86efef-utilities\") pod \"certified-operators-7mpfs\" (UID: \"785dfb3f-6700-4e67-9ab4-df1d1e86efef\") " pod="openshift-marketplace/certified-operators-7mpfs" Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.411220 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/785dfb3f-6700-4e67-9ab4-df1d1e86efef-catalog-content\") pod \"certified-operators-7mpfs\" (UID: \"785dfb3f-6700-4e67-9ab4-df1d1e86efef\") " pod="openshift-marketplace/certified-operators-7mpfs" Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.435937 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bpk8r" Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.440980 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lt8d\" (UniqueName: \"kubernetes.io/projected/785dfb3f-6700-4e67-9ab4-df1d1e86efef-kube-api-access-6lt8d\") pod \"certified-operators-7mpfs\" (UID: \"785dfb3f-6700-4e67-9ab4-df1d1e86efef\") " pod="openshift-marketplace/certified-operators-7mpfs" Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.447876 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fgj9j"] Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.449129 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fgj9j" Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.499019 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-knnk2" Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.520458 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0eaca2f-a489-4a21-9bfc-4a8fc02ccc85-utilities\") pod \"community-operators-fgj9j\" (UID: \"d0eaca2f-a489-4a21-9bfc-4a8fc02ccc85\") " pod="openshift-marketplace/community-operators-fgj9j" Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.520507 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqvnc\" (UniqueName: \"kubernetes.io/projected/d0eaca2f-a489-4a21-9bfc-4a8fc02ccc85-kube-api-access-hqvnc\") pod \"community-operators-fgj9j\" (UID: \"d0eaca2f-a489-4a21-9bfc-4a8fc02ccc85\") " pod="openshift-marketplace/community-operators-fgj9j" Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.520537 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.520557 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0eaca2f-a489-4a21-9bfc-4a8fc02ccc85-catalog-content\") pod \"community-operators-fgj9j\" (UID: \"d0eaca2f-a489-4a21-9bfc-4a8fc02ccc85\") " pod="openshift-marketplace/community-operators-fgj9j" Jan 22 13:46:49 crc kubenswrapper[4743]: E0122 13:46:49.520904 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 13:46:50.020891408 +0000 UTC m=+46.575934571 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-94xwn" (UID: "7b22abd3-ecba-46ba-a310-99000f911356") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.564755 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fgj9j"] Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.611093 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7mpfs" Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.621254 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 13:46:49 crc kubenswrapper[4743]: E0122 13:46:49.621422 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 13:46:50.121394484 +0000 UTC m=+46.676437647 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.621865 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqvnc\" (UniqueName: \"kubernetes.io/projected/d0eaca2f-a489-4a21-9bfc-4a8fc02ccc85-kube-api-access-hqvnc\") pod \"community-operators-fgj9j\" (UID: \"d0eaca2f-a489-4a21-9bfc-4a8fc02ccc85\") " pod="openshift-marketplace/community-operators-fgj9j" Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.621892 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.621914 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0eaca2f-a489-4a21-9bfc-4a8fc02ccc85-catalog-content\") pod \"community-operators-fgj9j\" (UID: \"d0eaca2f-a489-4a21-9bfc-4a8fc02ccc85\") " pod="openshift-marketplace/community-operators-fgj9j" Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.621999 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0eaca2f-a489-4a21-9bfc-4a8fc02ccc85-utilities\") pod \"community-operators-fgj9j\" (UID: \"d0eaca2f-a489-4a21-9bfc-4a8fc02ccc85\") " pod="openshift-marketplace/community-operators-fgj9j" Jan 22 13:46:49 crc kubenswrapper[4743]: E0122 13:46:49.622489 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 13:46:50.122477343 +0000 UTC m=+46.677520566 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-94xwn" (UID: "7b22abd3-ecba-46ba-a310-99000f911356") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.622714 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0eaca2f-a489-4a21-9bfc-4a8fc02ccc85-utilities\") pod \"community-operators-fgj9j\" (UID: \"d0eaca2f-a489-4a21-9bfc-4a8fc02ccc85\") " pod="openshift-marketplace/community-operators-fgj9j" Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.622982 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0eaca2f-a489-4a21-9bfc-4a8fc02ccc85-catalog-content\") pod \"community-operators-fgj9j\" (UID: \"d0eaca2f-a489-4a21-9bfc-4a8fc02ccc85\") " pod="openshift-marketplace/community-operators-fgj9j" Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.650396 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqvnc\" (UniqueName: \"kubernetes.io/projected/d0eaca2f-a489-4a21-9bfc-4a8fc02ccc85-kube-api-access-hqvnc\") pod \"community-operators-fgj9j\" (UID: \"d0eaca2f-a489-4a21-9bfc-4a8fc02ccc85\") " pod="openshift-marketplace/community-operators-fgj9j" Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.726266 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 13:46:49 crc kubenswrapper[4743]: E0122 13:46:49.726490 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 13:46:50.226465694 +0000 UTC m=+46.781508857 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.726582 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:49 crc kubenswrapper[4743]: E0122 13:46:49.726883 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 13:46:50.226872035 +0000 UTC m=+46.781915198 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-94xwn" (UID: "7b22abd3-ecba-46ba-a310-99000f911356") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.778066 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fgj9j" Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.829624 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 13:46:49 crc kubenswrapper[4743]: E0122 13:46:49.829941 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 13:46:50.329926 +0000 UTC m=+46.884969163 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.903911 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wv724"] Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.918367 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bpk8r"] Jan 22 13:46:49 crc kubenswrapper[4743]: I0122 13:46:49.931095 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:49 crc kubenswrapper[4743]: E0122 13:46:49.931469 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 13:46:50.431457523 +0000 UTC m=+46.986500686 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-94xwn" (UID: "7b22abd3-ecba-46ba-a310-99000f911356") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:50 crc kubenswrapper[4743]: I0122 13:46:50.035807 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 13:46:50 crc kubenswrapper[4743]: E0122 13:46:50.036287 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 13:46:50.536246366 +0000 UTC m=+47.091289529 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:50 crc kubenswrapper[4743]: I0122 13:46:50.036554 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:50 crc kubenswrapper[4743]: E0122 13:46:50.036990 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 13:46:50.536974166 +0000 UTC m=+47.092017329 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-94xwn" (UID: "7b22abd3-ecba-46ba-a310-99000f911356") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 13:46:50 crc kubenswrapper[4743]: I0122 13:46:50.061322 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7mpfs"] Jan 22 13:46:50 crc kubenswrapper[4743]: I0122 13:46:50.086447 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wv724" event={"ID":"a76ec049-d99a-40be-9fec-f76370769aea","Type":"ContainerStarted","Data":"22199ecb6471a38e9629e3062982ae9e97c1e964f6990a137bfde127b6a9fd25"} Jan 22 13:46:50 crc kubenswrapper[4743]: I0122 13:46:50.094577 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bpk8r" event={"ID":"a660baca-6ead-4a0f-959b-24b3badc4a7c","Type":"ContainerStarted","Data":"a56a2cd6a0de824ee5728ab9e57416e1650607ebc5ac41bd345c491957ad1273"} Jan 22 13:46:50 crc kubenswrapper[4743]: I0122 13:46:50.097107 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-xrxrn" podUID="2887aee0-19a7-439f-8f40-ef40970ab796" containerName="kube-multus-additional-cni-plugins" containerID="cri-o://17902818d3cf20cb5b497c38f31f012e99c1e26d96835ba9f573fcbde313d363" gracePeriod=30 Jan 22 13:46:50 crc kubenswrapper[4743]: I0122 13:46:50.107404 4743 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-01-22T13:46:49.184313731Z","Handler":null,"Name":""} Jan 22 13:46:50 crc kubenswrapper[4743]: I0122 13:46:50.110327 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fgj9j"] Jan 22 13:46:50 crc kubenswrapper[4743]: W0122 13:46:50.114217 4743 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0eaca2f_a489_4a21_9bfc_4a8fc02ccc85.slice/crio-3e53380e4d4ab595b37cf336cd056fcc1e02c6c83a56dbd855893f57923a3c35 WatchSource:0}: Error finding container 3e53380e4d4ab595b37cf336cd056fcc1e02c6c83a56dbd855893f57923a3c35: Status 404 returned error can't find the container with id 3e53380e4d4ab595b37cf336cd056fcc1e02c6c83a56dbd855893f57923a3c35 Jan 22 13:46:50 crc kubenswrapper[4743]: I0122 13:46:50.127542 4743 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Jan 22 13:46:50 crc kubenswrapper[4743]: I0122 13:46:50.127598 4743 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Jan 22 13:46:50 crc kubenswrapper[4743]: I0122 13:46:50.130853 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-fjjf5" Jan 22 13:46:50 crc kubenswrapper[4743]: I0122 13:46:50.145250 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 13:46:50 crc kubenswrapper[4743]: I0122 13:46:50.185020 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 22 13:46:50 crc kubenswrapper[4743]: I0122 13:46:50.247892 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:50 crc kubenswrapper[4743]: I0122 13:46:50.305546 4743 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
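The entries above show the kubelet failing MountVolume.MountDevice for pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 roughly every 500ms because kubevirt.io.hostpath-provisioner is not yet in its list of registered CSI drivers, then accepting the driver's registration socket (csi_plugin.go:100/113) and skipping the device-stage step because the plugin does not advertise STAGE_UNSTAGE_VOLUME. The short Go sketch below is one way to observe that same registration from the API side by polling the node's CSINode object; it is illustrative only, not part of the kubelet, and it assumes client-go plus a reachable kubeconfig. The node name "crc" and the driver name are taken from the log.

// csinode_wait.go - diagnostic sketch (assumption: client-go available, kubeconfig in the default location).
// Polls the CSINode object for node "crc" until the hostpath-provisioner driver is listed,
// which corresponds to the "Register new plugin with name: kubevirt.io.hostpath-provisioner" entry above.
package main

import (
	"context"
	"fmt"
	"time"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}

	const node, driver = "crc", "kubevirt.io.hostpath-provisioner"
	for attempt := 0; attempt < 60; attempt++ {
		csiNode, err := cs.StorageV1().CSINodes().Get(context.TODO(), node, metav1.GetOptions{})
		if err == nil {
			for _, d := range csiNode.Spec.Drivers {
				if d.Name == driver {
					fmt.Printf("driver %s is registered on node %s\n", driver, node)
					return
				}
			}
		}
		// The kubelet retries the mount on its own schedule (the logged durationBeforeRetry);
		// this sketch simply polls until the driver shows up in the CSINode spec.
		time.Sleep(2 * time.Second)
	}
	fmt.Printf("driver %s did not appear on node %s\n", driver, node)
}

Once the driver appears in the node's registry, the kubelet's next retry proceeds, as seen in the MountVolume.MountDevice and MountVolume.SetUp "succeeded" entries that follow.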
Jan 22 13:46:50 crc kubenswrapper[4743]: I0122 13:46:50.305591 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:50 crc kubenswrapper[4743]: I0122 13:46:50.315994 4743 patch_prober.go:28] interesting pod/router-default-5444994796-s9lg4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 22 13:46:50 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld Jan 22 13:46:50 crc kubenswrapper[4743]: [+]process-running ok Jan 22 13:46:50 crc kubenswrapper[4743]: healthz check failed Jan 22 13:46:50 crc kubenswrapper[4743]: I0122 13:46:50.316054 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-s9lg4" podUID="2f6504ab-14a9-4c81-b8ae-556b648168db" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 22 13:46:50 crc kubenswrapper[4743]: I0122 13:46:50.501704 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-94xwn\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:50 crc kubenswrapper[4743]: I0122 13:46:50.778693 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:50 crc kubenswrapper[4743]: I0122 13:46:50.823021 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-m8dtj"] Jan 22 13:46:50 crc kubenswrapper[4743]: I0122 13:46:50.824166 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m8dtj" Jan 22 13:46:50 crc kubenswrapper[4743]: I0122 13:46:50.833272 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 22 13:46:50 crc kubenswrapper[4743]: I0122 13:46:50.844621 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m8dtj"] Jan 22 13:46:50 crc kubenswrapper[4743]: I0122 13:46:50.961007 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xzwx\" (UniqueName: \"kubernetes.io/projected/9b61be23-4db1-4316-a840-1aaff04b664e-kube-api-access-6xzwx\") pod \"redhat-marketplace-m8dtj\" (UID: \"9b61be23-4db1-4316-a840-1aaff04b664e\") " pod="openshift-marketplace/redhat-marketplace-m8dtj" Jan 22 13:46:50 crc kubenswrapper[4743]: I0122 13:46:50.961194 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b61be23-4db1-4316-a840-1aaff04b664e-catalog-content\") pod \"redhat-marketplace-m8dtj\" (UID: \"9b61be23-4db1-4316-a840-1aaff04b664e\") " pod="openshift-marketplace/redhat-marketplace-m8dtj" Jan 22 13:46:50 crc kubenswrapper[4743]: I0122 13:46:50.961245 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b61be23-4db1-4316-a840-1aaff04b664e-utilities\") pod \"redhat-marketplace-m8dtj\" (UID: \"9b61be23-4db1-4316-a840-1aaff04b664e\") " pod="openshift-marketplace/redhat-marketplace-m8dtj" Jan 22 13:46:51 crc kubenswrapper[4743]: I0122 13:46:51.033171 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-94xwn"] Jan 22 13:46:51 crc kubenswrapper[4743]: W0122 13:46:51.042931 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b22abd3_ecba_46ba_a310_99000f911356.slice/crio-77e7aca735bf6c17be932b405758d626af361bb02029d1c82fd258866e138d69 WatchSource:0}: Error finding container 77e7aca735bf6c17be932b405758d626af361bb02029d1c82fd258866e138d69: Status 404 returned error can't find the container with id 77e7aca735bf6c17be932b405758d626af361bb02029d1c82fd258866e138d69 Jan 22 13:46:51 crc kubenswrapper[4743]: I0122 13:46:51.070180 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xzwx\" (UniqueName: \"kubernetes.io/projected/9b61be23-4db1-4316-a840-1aaff04b664e-kube-api-access-6xzwx\") pod \"redhat-marketplace-m8dtj\" (UID: \"9b61be23-4db1-4316-a840-1aaff04b664e\") " pod="openshift-marketplace/redhat-marketplace-m8dtj" Jan 22 13:46:51 crc kubenswrapper[4743]: I0122 13:46:51.070482 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b61be23-4db1-4316-a840-1aaff04b664e-catalog-content\") pod \"redhat-marketplace-m8dtj\" (UID: \"9b61be23-4db1-4316-a840-1aaff04b664e\") " pod="openshift-marketplace/redhat-marketplace-m8dtj" Jan 22 13:46:51 crc kubenswrapper[4743]: I0122 13:46:51.070553 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b61be23-4db1-4316-a840-1aaff04b664e-utilities\") pod \"redhat-marketplace-m8dtj\" (UID: \"9b61be23-4db1-4316-a840-1aaff04b664e\") " 
pod="openshift-marketplace/redhat-marketplace-m8dtj" Jan 22 13:46:51 crc kubenswrapper[4743]: I0122 13:46:51.071550 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b61be23-4db1-4316-a840-1aaff04b664e-utilities\") pod \"redhat-marketplace-m8dtj\" (UID: \"9b61be23-4db1-4316-a840-1aaff04b664e\") " pod="openshift-marketplace/redhat-marketplace-m8dtj" Jan 22 13:46:51 crc kubenswrapper[4743]: I0122 13:46:51.072209 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b61be23-4db1-4316-a840-1aaff04b664e-catalog-content\") pod \"redhat-marketplace-m8dtj\" (UID: \"9b61be23-4db1-4316-a840-1aaff04b664e\") " pod="openshift-marketplace/redhat-marketplace-m8dtj" Jan 22 13:46:51 crc kubenswrapper[4743]: I0122 13:46:51.092581 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xzwx\" (UniqueName: \"kubernetes.io/projected/9b61be23-4db1-4316-a840-1aaff04b664e-kube-api-access-6xzwx\") pod \"redhat-marketplace-m8dtj\" (UID: \"9b61be23-4db1-4316-a840-1aaff04b664e\") " pod="openshift-marketplace/redhat-marketplace-m8dtj" Jan 22 13:46:51 crc kubenswrapper[4743]: I0122 13:46:51.106397 4743 generic.go:334] "Generic (PLEG): container finished" podID="d0eaca2f-a489-4a21-9bfc-4a8fc02ccc85" containerID="cf1b9e01a26bb95cce31ba7b86c4f3407c4d86e19d14ad64a621e67713ec754f" exitCode=0 Jan 22 13:46:51 crc kubenswrapper[4743]: I0122 13:46:51.106502 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fgj9j" event={"ID":"d0eaca2f-a489-4a21-9bfc-4a8fc02ccc85","Type":"ContainerDied","Data":"cf1b9e01a26bb95cce31ba7b86c4f3407c4d86e19d14ad64a621e67713ec754f"} Jan 22 13:46:51 crc kubenswrapper[4743]: I0122 13:46:51.106535 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fgj9j" event={"ID":"d0eaca2f-a489-4a21-9bfc-4a8fc02ccc85","Type":"ContainerStarted","Data":"3e53380e4d4ab595b37cf336cd056fcc1e02c6c83a56dbd855893f57923a3c35"} Jan 22 13:46:51 crc kubenswrapper[4743]: I0122 13:46:51.110537 4743 generic.go:334] "Generic (PLEG): container finished" podID="a660baca-6ead-4a0f-959b-24b3badc4a7c" containerID="b31162247bc35d0410fca82d8e557e75789bc46b0e63796578eae8bebf428ed2" exitCode=0 Jan 22 13:46:51 crc kubenswrapper[4743]: I0122 13:46:51.110588 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 22 13:46:51 crc kubenswrapper[4743]: I0122 13:46:51.110602 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bpk8r" event={"ID":"a660baca-6ead-4a0f-959b-24b3badc4a7c","Type":"ContainerDied","Data":"b31162247bc35d0410fca82d8e557e75789bc46b0e63796578eae8bebf428ed2"} Jan 22 13:46:51 crc kubenswrapper[4743]: I0122 13:46:51.113511 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" event={"ID":"7b22abd3-ecba-46ba-a310-99000f911356","Type":"ContainerStarted","Data":"77e7aca735bf6c17be932b405758d626af361bb02029d1c82fd258866e138d69"} Jan 22 13:46:51 crc kubenswrapper[4743]: I0122 13:46:51.116030 4743 generic.go:334] "Generic (PLEG): container finished" podID="a76ec049-d99a-40be-9fec-f76370769aea" containerID="769a7722b947988dc951ecb832dbc39381fb0227e5b0a1a5cbd7bd0b134dc066" exitCode=0 Jan 22 13:46:51 crc kubenswrapper[4743]: I0122 13:46:51.116089 4743 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-wv724" event={"ID":"a76ec049-d99a-40be-9fec-f76370769aea","Type":"ContainerDied","Data":"769a7722b947988dc951ecb832dbc39381fb0227e5b0a1a5cbd7bd0b134dc066"} Jan 22 13:46:51 crc kubenswrapper[4743]: I0122 13:46:51.120947 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-nf29x" event={"ID":"819be2f9-96db-4fda-9460-658323fcc772","Type":"ContainerStarted","Data":"f1eb787dcf9c2a11467e953d282db40def4a3ab9b9a86f3a31563700e52cc18e"} Jan 22 13:46:51 crc kubenswrapper[4743]: I0122 13:46:51.121006 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-nf29x" event={"ID":"819be2f9-96db-4fda-9460-658323fcc772","Type":"ContainerStarted","Data":"65f669bec838153eb1b52ffd2241451760cc30f0e4fb31df7322fd581e85d931"} Jan 22 13:46:51 crc kubenswrapper[4743]: I0122 13:46:51.123631 4743 generic.go:334] "Generic (PLEG): container finished" podID="785dfb3f-6700-4e67-9ab4-df1d1e86efef" containerID="ab704eeb5abde2c5b147226cafaa7789231aefddc49b52f60cb1703c2863b1e3" exitCode=0 Jan 22 13:46:51 crc kubenswrapper[4743]: I0122 13:46:51.124778 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7mpfs" event={"ID":"785dfb3f-6700-4e67-9ab4-df1d1e86efef","Type":"ContainerDied","Data":"ab704eeb5abde2c5b147226cafaa7789231aefddc49b52f60cb1703c2863b1e3"} Jan 22 13:46:51 crc kubenswrapper[4743]: I0122 13:46:51.124869 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7mpfs" event={"ID":"785dfb3f-6700-4e67-9ab4-df1d1e86efef","Type":"ContainerStarted","Data":"555009e9bbfaa517e62321b02dfeaaab7ffc3233a60e2159e582c465bc86c339"} Jan 22 13:46:51 crc kubenswrapper[4743]: I0122 13:46:51.142779 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m8dtj" Jan 22 13:46:51 crc kubenswrapper[4743]: I0122 13:46:51.162490 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-nf29x" podStartSLOduration=11.162469741 podStartE2EDuration="11.162469741s" podCreationTimestamp="2026-01-22 13:46:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:46:51.14239839 +0000 UTC m=+47.697441553" watchObservedRunningTime="2026-01-22 13:46:51.162469741 +0000 UTC m=+47.717512904" Jan 22 13:46:51 crc kubenswrapper[4743]: I0122 13:46:51.223282 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nmxmh"] Jan 22 13:46:51 crc kubenswrapper[4743]: I0122 13:46:51.224440 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nmxmh" Jan 22 13:46:51 crc kubenswrapper[4743]: I0122 13:46:51.234723 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nmxmh"] Jan 22 13:46:51 crc kubenswrapper[4743]: I0122 13:46:51.309100 4743 patch_prober.go:28] interesting pod/router-default-5444994796-s9lg4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 22 13:46:51 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld Jan 22 13:46:51 crc kubenswrapper[4743]: [+]process-running ok Jan 22 13:46:51 crc kubenswrapper[4743]: healthz check failed Jan 22 13:46:51 crc kubenswrapper[4743]: I0122 13:46:51.309204 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-s9lg4" podUID="2f6504ab-14a9-4c81-b8ae-556b648168db" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 22 13:46:51 crc kubenswrapper[4743]: I0122 13:46:51.385647 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m8dtj"] Jan 22 13:46:51 crc kubenswrapper[4743]: I0122 13:46:51.387993 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8x2kq\" (UniqueName: \"kubernetes.io/projected/128ee6cd-afe2-4c25-967b-9865fbb0ff88-kube-api-access-8x2kq\") pod \"redhat-marketplace-nmxmh\" (UID: \"128ee6cd-afe2-4c25-967b-9865fbb0ff88\") " pod="openshift-marketplace/redhat-marketplace-nmxmh" Jan 22 13:46:51 crc kubenswrapper[4743]: I0122 13:46:51.388042 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/128ee6cd-afe2-4c25-967b-9865fbb0ff88-catalog-content\") pod \"redhat-marketplace-nmxmh\" (UID: \"128ee6cd-afe2-4c25-967b-9865fbb0ff88\") " pod="openshift-marketplace/redhat-marketplace-nmxmh" Jan 22 13:46:51 crc kubenswrapper[4743]: I0122 13:46:51.388062 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/128ee6cd-afe2-4c25-967b-9865fbb0ff88-utilities\") pod \"redhat-marketplace-nmxmh\" (UID: \"128ee6cd-afe2-4c25-967b-9865fbb0ff88\") " pod="openshift-marketplace/redhat-marketplace-nmxmh" Jan 22 13:46:51 crc kubenswrapper[4743]: I0122 13:46:51.489185 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8x2kq\" (UniqueName: \"kubernetes.io/projected/128ee6cd-afe2-4c25-967b-9865fbb0ff88-kube-api-access-8x2kq\") pod \"redhat-marketplace-nmxmh\" (UID: \"128ee6cd-afe2-4c25-967b-9865fbb0ff88\") " pod="openshift-marketplace/redhat-marketplace-nmxmh" Jan 22 13:46:51 crc kubenswrapper[4743]: I0122 13:46:51.489534 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/128ee6cd-afe2-4c25-967b-9865fbb0ff88-catalog-content\") pod \"redhat-marketplace-nmxmh\" (UID: \"128ee6cd-afe2-4c25-967b-9865fbb0ff88\") " pod="openshift-marketplace/redhat-marketplace-nmxmh" Jan 22 13:46:51 crc kubenswrapper[4743]: I0122 13:46:51.489557 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/128ee6cd-afe2-4c25-967b-9865fbb0ff88-utilities\") pod 
\"redhat-marketplace-nmxmh\" (UID: \"128ee6cd-afe2-4c25-967b-9865fbb0ff88\") " pod="openshift-marketplace/redhat-marketplace-nmxmh" Jan 22 13:46:51 crc kubenswrapper[4743]: I0122 13:46:51.490333 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/128ee6cd-afe2-4c25-967b-9865fbb0ff88-utilities\") pod \"redhat-marketplace-nmxmh\" (UID: \"128ee6cd-afe2-4c25-967b-9865fbb0ff88\") " pod="openshift-marketplace/redhat-marketplace-nmxmh" Jan 22 13:46:51 crc kubenswrapper[4743]: I0122 13:46:51.490406 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/128ee6cd-afe2-4c25-967b-9865fbb0ff88-catalog-content\") pod \"redhat-marketplace-nmxmh\" (UID: \"128ee6cd-afe2-4c25-967b-9865fbb0ff88\") " pod="openshift-marketplace/redhat-marketplace-nmxmh" Jan 22 13:46:51 crc kubenswrapper[4743]: I0122 13:46:51.511389 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8x2kq\" (UniqueName: \"kubernetes.io/projected/128ee6cd-afe2-4c25-967b-9865fbb0ff88-kube-api-access-8x2kq\") pod \"redhat-marketplace-nmxmh\" (UID: \"128ee6cd-afe2-4c25-967b-9865fbb0ff88\") " pod="openshift-marketplace/redhat-marketplace-nmxmh" Jan 22 13:46:51 crc kubenswrapper[4743]: I0122 13:46:51.554203 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nmxmh" Jan 22 13:46:51 crc kubenswrapper[4743]: I0122 13:46:51.768159 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Jan 22 13:46:51 crc kubenswrapper[4743]: I0122 13:46:51.768747 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nmxmh"] Jan 22 13:46:52 crc kubenswrapper[4743]: I0122 13:46:52.027862 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-ln28w" Jan 22 13:46:52 crc kubenswrapper[4743]: I0122 13:46:52.028249 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-ln28w" Jan 22 13:46:52 crc kubenswrapper[4743]: I0122 13:46:52.033982 4743 patch_prober.go:28] interesting pod/console-f9d7485db-ln28w container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.20:8443/health\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Jan 22 13:46:52 crc kubenswrapper[4743]: I0122 13:46:52.034032 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-ln28w" podUID="a11f3169-f731-464a-a7d4-9dea61d28398" containerName="console" probeResult="failure" output="Get \"https://10.217.0.20:8443/health\": dial tcp 10.217.0.20:8443: connect: connection refused" Jan 22 13:46:52 crc kubenswrapper[4743]: I0122 13:46:52.131632 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" event={"ID":"7b22abd3-ecba-46ba-a310-99000f911356","Type":"ContainerStarted","Data":"dde2c30ede4b010f13053167044a68e55dea8446738f97179775ef14da947294"} Jan 22 13:46:52 crc kubenswrapper[4743]: I0122 13:46:52.131935 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:46:52 crc kubenswrapper[4743]: I0122 13:46:52.134779 
4743 generic.go:334] "Generic (PLEG): container finished" podID="128ee6cd-afe2-4c25-967b-9865fbb0ff88" containerID="9d89230033d8c7d055538f44ab929830e5c24ef05d9bff63616ad8f084b57170" exitCode=0 Jan 22 13:46:52 crc kubenswrapper[4743]: I0122 13:46:52.134843 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nmxmh" event={"ID":"128ee6cd-afe2-4c25-967b-9865fbb0ff88","Type":"ContainerDied","Data":"9d89230033d8c7d055538f44ab929830e5c24ef05d9bff63616ad8f084b57170"} Jan 22 13:46:52 crc kubenswrapper[4743]: I0122 13:46:52.134894 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nmxmh" event={"ID":"128ee6cd-afe2-4c25-967b-9865fbb0ff88","Type":"ContainerStarted","Data":"f4c6a9cd1fda0fa4b4221488c2c34eb006edacaa2ea5342870784f17111aaa71"} Jan 22 13:46:52 crc kubenswrapper[4743]: I0122 13:46:52.139849 4743 generic.go:334] "Generic (PLEG): container finished" podID="d67e22a2-dc2b-4582-bfe0-7afff25995fb" containerID="a90b7e20a7ddf78fed89d0684b6fbdb934149fca9e3192dcb7886a107c6eeed1" exitCode=0 Jan 22 13:46:52 crc kubenswrapper[4743]: I0122 13:46:52.139950 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29484825-rkvnc" event={"ID":"d67e22a2-dc2b-4582-bfe0-7afff25995fb","Type":"ContainerDied","Data":"a90b7e20a7ddf78fed89d0684b6fbdb934149fca9e3192dcb7886a107c6eeed1"} Jan 22 13:46:52 crc kubenswrapper[4743]: I0122 13:46:52.150879 4743 generic.go:334] "Generic (PLEG): container finished" podID="9b61be23-4db1-4316-a840-1aaff04b664e" containerID="708794e9f4c91154e4f9aab8e25cf5d017532bdcdbe725852bc144106be62711" exitCode=0 Jan 22 13:46:52 crc kubenswrapper[4743]: I0122 13:46:52.151905 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m8dtj" event={"ID":"9b61be23-4db1-4316-a840-1aaff04b664e","Type":"ContainerDied","Data":"708794e9f4c91154e4f9aab8e25cf5d017532bdcdbe725852bc144106be62711"} Jan 22 13:46:52 crc kubenswrapper[4743]: I0122 13:46:52.151945 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m8dtj" event={"ID":"9b61be23-4db1-4316-a840-1aaff04b664e","Type":"ContainerStarted","Data":"cf6613bc3f279ff6dc9247da3dd2e62d9118eecc1be3eca21c67014d997f8bf1"} Jan 22 13:46:52 crc kubenswrapper[4743]: I0122 13:46:52.182440 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" podStartSLOduration=24.182416641 podStartE2EDuration="24.182416641s" podCreationTimestamp="2026-01-22 13:46:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:46:52.165381124 +0000 UTC m=+48.720424287" watchObservedRunningTime="2026-01-22 13:46:52.182416641 +0000 UTC m=+48.737459814" Jan 22 13:46:52 crc kubenswrapper[4743]: I0122 13:46:52.224893 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fqtsl"] Jan 22 13:46:52 crc kubenswrapper[4743]: I0122 13:46:52.240629 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fqtsl"] Jan 22 13:46:52 crc kubenswrapper[4743]: I0122 13:46:52.240777 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fqtsl" Jan 22 13:46:52 crc kubenswrapper[4743]: I0122 13:46:52.244194 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 22 13:46:52 crc kubenswrapper[4743]: I0122 13:46:52.309056 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0e6768a-f15d-4daf-9e12-950e7d3b9552-utilities\") pod \"redhat-operators-fqtsl\" (UID: \"c0e6768a-f15d-4daf-9e12-950e7d3b9552\") " pod="openshift-marketplace/redhat-operators-fqtsl" Jan 22 13:46:52 crc kubenswrapper[4743]: I0122 13:46:52.309146 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zvzf\" (UniqueName: \"kubernetes.io/projected/c0e6768a-f15d-4daf-9e12-950e7d3b9552-kube-api-access-2zvzf\") pod \"redhat-operators-fqtsl\" (UID: \"c0e6768a-f15d-4daf-9e12-950e7d3b9552\") " pod="openshift-marketplace/redhat-operators-fqtsl" Jan 22 13:46:52 crc kubenswrapper[4743]: I0122 13:46:52.309229 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0e6768a-f15d-4daf-9e12-950e7d3b9552-catalog-content\") pod \"redhat-operators-fqtsl\" (UID: \"c0e6768a-f15d-4daf-9e12-950e7d3b9552\") " pod="openshift-marketplace/redhat-operators-fqtsl" Jan 22 13:46:52 crc kubenswrapper[4743]: I0122 13:46:52.310421 4743 patch_prober.go:28] interesting pod/router-default-5444994796-s9lg4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 22 13:46:52 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld Jan 22 13:46:52 crc kubenswrapper[4743]: [+]process-running ok Jan 22 13:46:52 crc kubenswrapper[4743]: healthz check failed Jan 22 13:46:52 crc kubenswrapper[4743]: I0122 13:46:52.310505 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-s9lg4" podUID="2f6504ab-14a9-4c81-b8ae-556b648168db" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 22 13:46:52 crc kubenswrapper[4743]: I0122 13:46:52.352490 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 22 13:46:52 crc kubenswrapper[4743]: I0122 13:46:52.353350 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 22 13:46:52 crc kubenswrapper[4743]: I0122 13:46:52.355565 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Jan 22 13:46:52 crc kubenswrapper[4743]: I0122 13:46:52.361084 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Jan 22 13:46:52 crc kubenswrapper[4743]: I0122 13:46:52.361860 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 22 13:46:52 crc kubenswrapper[4743]: I0122 13:46:52.421172 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0e6768a-f15d-4daf-9e12-950e7d3b9552-utilities\") pod \"redhat-operators-fqtsl\" (UID: \"c0e6768a-f15d-4daf-9e12-950e7d3b9552\") " pod="openshift-marketplace/redhat-operators-fqtsl" Jan 22 13:46:52 crc kubenswrapper[4743]: I0122 13:46:52.421269 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zvzf\" (UniqueName: \"kubernetes.io/projected/c0e6768a-f15d-4daf-9e12-950e7d3b9552-kube-api-access-2zvzf\") pod \"redhat-operators-fqtsl\" (UID: \"c0e6768a-f15d-4daf-9e12-950e7d3b9552\") " pod="openshift-marketplace/redhat-operators-fqtsl" Jan 22 13:46:52 crc kubenswrapper[4743]: I0122 13:46:52.421306 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0e6768a-f15d-4daf-9e12-950e7d3b9552-catalog-content\") pod \"redhat-operators-fqtsl\" (UID: \"c0e6768a-f15d-4daf-9e12-950e7d3b9552\") " pod="openshift-marketplace/redhat-operators-fqtsl" Jan 22 13:46:52 crc kubenswrapper[4743]: I0122 13:46:52.421758 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0e6768a-f15d-4daf-9e12-950e7d3b9552-catalog-content\") pod \"redhat-operators-fqtsl\" (UID: \"c0e6768a-f15d-4daf-9e12-950e7d3b9552\") " pod="openshift-marketplace/redhat-operators-fqtsl" Jan 22 13:46:52 crc kubenswrapper[4743]: I0122 13:46:52.422047 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0e6768a-f15d-4daf-9e12-950e7d3b9552-utilities\") pod \"redhat-operators-fqtsl\" (UID: \"c0e6768a-f15d-4daf-9e12-950e7d3b9552\") " pod="openshift-marketplace/redhat-operators-fqtsl" Jan 22 13:46:52 crc kubenswrapper[4743]: I0122 13:46:52.444994 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zvzf\" (UniqueName: \"kubernetes.io/projected/c0e6768a-f15d-4daf-9e12-950e7d3b9552-kube-api-access-2zvzf\") pod \"redhat-operators-fqtsl\" (UID: \"c0e6768a-f15d-4daf-9e12-950e7d3b9552\") " pod="openshift-marketplace/redhat-operators-fqtsl" Jan 22 13:46:52 crc kubenswrapper[4743]: I0122 13:46:52.522218 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d048d869-32a2-451a-8623-278778ae772d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d048d869-32a2-451a-8623-278778ae772d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 22 13:46:52 crc kubenswrapper[4743]: I0122 13:46:52.522270 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d048d869-32a2-451a-8623-278778ae772d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d048d869-32a2-451a-8623-278778ae772d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 22 13:46:52 crc kubenswrapper[4743]: I0122 13:46:52.564352 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fqtsl" Jan 22 13:46:52 crc kubenswrapper[4743]: I0122 13:46:52.618552 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7bp9h"] Jan 22 13:46:52 crc kubenswrapper[4743]: I0122 13:46:52.620857 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7bp9h" Jan 22 13:46:52 crc kubenswrapper[4743]: I0122 13:46:52.623113 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d048d869-32a2-451a-8623-278778ae772d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d048d869-32a2-451a-8623-278778ae772d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 22 13:46:52 crc kubenswrapper[4743]: I0122 13:46:52.623266 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d048d869-32a2-451a-8623-278778ae772d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d048d869-32a2-451a-8623-278778ae772d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 22 13:46:52 crc kubenswrapper[4743]: I0122 13:46:52.623377 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d048d869-32a2-451a-8623-278778ae772d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d048d869-32a2-451a-8623-278778ae772d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 22 13:46:52 crc kubenswrapper[4743]: I0122 13:46:52.628237 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7bp9h"] Jan 22 13:46:52 crc kubenswrapper[4743]: I0122 13:46:52.650545 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d048d869-32a2-451a-8623-278778ae772d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d048d869-32a2-451a-8623-278778ae772d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 22 13:46:52 crc kubenswrapper[4743]: I0122 13:46:52.658649 4743 patch_prober.go:28] interesting pod/downloads-7954f5f757-ksn26 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Jan 22 13:46:52 crc kubenswrapper[4743]: I0122 13:46:52.658666 4743 patch_prober.go:28] interesting pod/downloads-7954f5f757-ksn26 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Jan 22 13:46:52 crc kubenswrapper[4743]: I0122 13:46:52.658732 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-ksn26" podUID="8134e970-25c9-4d3e-9cff-48a8b520b0da" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: 
connection refused" Jan 22 13:46:52 crc kubenswrapper[4743]: I0122 13:46:52.658779 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-ksn26" podUID="8134e970-25c9-4d3e-9cff-48a8b520b0da" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" Jan 22 13:46:52 crc kubenswrapper[4743]: I0122 13:46:52.681121 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-tff5x" Jan 22 13:46:52 crc kubenswrapper[4743]: I0122 13:46:52.682289 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-tff5x" Jan 22 13:46:52 crc kubenswrapper[4743]: I0122 13:46:52.688361 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-tff5x" Jan 22 13:46:52 crc kubenswrapper[4743]: I0122 13:46:52.725297 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ns8zp\" (UniqueName: \"kubernetes.io/projected/fb2a0da8-a3d8-4f70-a0a8-d2f1150c40ce-kube-api-access-ns8zp\") pod \"redhat-operators-7bp9h\" (UID: \"fb2a0da8-a3d8-4f70-a0a8-d2f1150c40ce\") " pod="openshift-marketplace/redhat-operators-7bp9h" Jan 22 13:46:52 crc kubenswrapper[4743]: I0122 13:46:52.725370 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb2a0da8-a3d8-4f70-a0a8-d2f1150c40ce-utilities\") pod \"redhat-operators-7bp9h\" (UID: \"fb2a0da8-a3d8-4f70-a0a8-d2f1150c40ce\") " pod="openshift-marketplace/redhat-operators-7bp9h" Jan 22 13:46:52 crc kubenswrapper[4743]: I0122 13:46:52.725449 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb2a0da8-a3d8-4f70-a0a8-d2f1150c40ce-catalog-content\") pod \"redhat-operators-7bp9h\" (UID: \"fb2a0da8-a3d8-4f70-a0a8-d2f1150c40ce\") " pod="openshift-marketplace/redhat-operators-7bp9h" Jan 22 13:46:52 crc kubenswrapper[4743]: I0122 13:46:52.729533 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 22 13:46:52 crc kubenswrapper[4743]: I0122 13:46:52.826865 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb2a0da8-a3d8-4f70-a0a8-d2f1150c40ce-catalog-content\") pod \"redhat-operators-7bp9h\" (UID: \"fb2a0da8-a3d8-4f70-a0a8-d2f1150c40ce\") " pod="openshift-marketplace/redhat-operators-7bp9h" Jan 22 13:46:52 crc kubenswrapper[4743]: I0122 13:46:52.827825 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb2a0da8-a3d8-4f70-a0a8-d2f1150c40ce-catalog-content\") pod \"redhat-operators-7bp9h\" (UID: \"fb2a0da8-a3d8-4f70-a0a8-d2f1150c40ce\") " pod="openshift-marketplace/redhat-operators-7bp9h" Jan 22 13:46:52 crc kubenswrapper[4743]: I0122 13:46:52.829383 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ns8zp\" (UniqueName: \"kubernetes.io/projected/fb2a0da8-a3d8-4f70-a0a8-d2f1150c40ce-kube-api-access-ns8zp\") pod \"redhat-operators-7bp9h\" (UID: \"fb2a0da8-a3d8-4f70-a0a8-d2f1150c40ce\") " pod="openshift-marketplace/redhat-operators-7bp9h" Jan 22 13:46:52 crc kubenswrapper[4743]: I0122 13:46:52.829423 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb2a0da8-a3d8-4f70-a0a8-d2f1150c40ce-utilities\") pod \"redhat-operators-7bp9h\" (UID: \"fb2a0da8-a3d8-4f70-a0a8-d2f1150c40ce\") " pod="openshift-marketplace/redhat-operators-7bp9h" Jan 22 13:46:52 crc kubenswrapper[4743]: I0122 13:46:52.829882 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb2a0da8-a3d8-4f70-a0a8-d2f1150c40ce-utilities\") pod \"redhat-operators-7bp9h\" (UID: \"fb2a0da8-a3d8-4f70-a0a8-d2f1150c40ce\") " pod="openshift-marketplace/redhat-operators-7bp9h" Jan 22 13:46:52 crc kubenswrapper[4743]: I0122 13:46:52.847154 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ns8zp\" (UniqueName: \"kubernetes.io/projected/fb2a0da8-a3d8-4f70-a0a8-d2f1150c40ce-kube-api-access-ns8zp\") pod \"redhat-operators-7bp9h\" (UID: \"fb2a0da8-a3d8-4f70-a0a8-d2f1150c40ce\") " pod="openshift-marketplace/redhat-operators-7bp9h" Jan 22 13:46:52 crc kubenswrapper[4743]: I0122 13:46:52.870969 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tdlw6" Jan 22 13:46:52 crc kubenswrapper[4743]: I0122 13:46:52.871030 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tdlw6" Jan 22 13:46:52 crc kubenswrapper[4743]: I0122 13:46:52.879708 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tdlw6" Jan 22 13:46:52 crc kubenswrapper[4743]: I0122 13:46:52.963394 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7bp9h" Jan 22 13:46:53 crc kubenswrapper[4743]: I0122 13:46:53.027054 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fqtsl"] Jan 22 13:46:53 crc kubenswrapper[4743]: W0122 13:46:53.053291 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0e6768a_f15d_4daf_9e12_950e7d3b9552.slice/crio-e47982e7a73e6cd216e87f7bb30bb0d5ebb62086988813f245c3c09be4f625bb WatchSource:0}: Error finding container e47982e7a73e6cd216e87f7bb30bb0d5ebb62086988813f245c3c09be4f625bb: Status 404 returned error can't find the container with id e47982e7a73e6cd216e87f7bb30bb0d5ebb62086988813f245c3c09be4f625bb Jan 22 13:46:53 crc kubenswrapper[4743]: I0122 13:46:53.057023 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 22 13:46:53 crc kubenswrapper[4743]: I0122 13:46:53.175025 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"d048d869-32a2-451a-8623-278778ae772d","Type":"ContainerStarted","Data":"997e5e29106c129ec48f52da5877b77ca78e5032371e6ae5e8852575293fe2dd"} Jan 22 13:46:53 crc kubenswrapper[4743]: I0122 13:46:53.177266 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fqtsl" event={"ID":"c0e6768a-f15d-4daf-9e12-950e7d3b9552","Type":"ContainerStarted","Data":"e47982e7a73e6cd216e87f7bb30bb0d5ebb62086988813f245c3c09be4f625bb"} Jan 22 13:46:53 crc kubenswrapper[4743]: I0122 13:46:53.183472 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tdlw6" Jan 22 13:46:53 crc kubenswrapper[4743]: I0122 13:46:53.184956 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-tff5x" Jan 22 13:46:53 crc kubenswrapper[4743]: I0122 13:46:53.307560 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-s9lg4" Jan 22 13:46:53 crc kubenswrapper[4743]: I0122 13:46:53.311275 4743 patch_prober.go:28] interesting pod/router-default-5444994796-s9lg4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 22 13:46:53 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld Jan 22 13:46:53 crc kubenswrapper[4743]: [+]process-running ok Jan 22 13:46:53 crc kubenswrapper[4743]: healthz check failed Jan 22 13:46:53 crc kubenswrapper[4743]: I0122 13:46:53.311317 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-s9lg4" podUID="2f6504ab-14a9-4c81-b8ae-556b648168db" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 22 13:46:53 crc kubenswrapper[4743]: I0122 13:46:53.482162 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7bp9h"] Jan 22 13:46:53 crc kubenswrapper[4743]: E0122 13:46:53.504031 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="17902818d3cf20cb5b497c38f31f012e99c1e26d96835ba9f573fcbde313d363" 
cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 22 13:46:53 crc kubenswrapper[4743]: E0122 13:46:53.509859 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="17902818d3cf20cb5b497c38f31f012e99c1e26d96835ba9f573fcbde313d363" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 22 13:46:53 crc kubenswrapper[4743]: E0122 13:46:53.538965 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="17902818d3cf20cb5b497c38f31f012e99c1e26d96835ba9f573fcbde313d363" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 22 13:46:53 crc kubenswrapper[4743]: E0122 13:46:53.539037 4743 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-xrxrn" podUID="2887aee0-19a7-439f-8f40-ef40970ab796" containerName="kube-multus-additional-cni-plugins" Jan 22 13:46:53 crc kubenswrapper[4743]: W0122 13:46:53.564958 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb2a0da8_a3d8_4f70_a0a8_d2f1150c40ce.slice/crio-85fd1a6a478c59defe07b39c2f08a24b3643f5f8a46d83cc45995b7edbc3e6c0 WatchSource:0}: Error finding container 85fd1a6a478c59defe07b39c2f08a24b3643f5f8a46d83cc45995b7edbc3e6c0: Status 404 returned error can't find the container with id 85fd1a6a478c59defe07b39c2f08a24b3643f5f8a46d83cc45995b7edbc3e6c0 Jan 22 13:46:53 crc kubenswrapper[4743]: I0122 13:46:53.805375 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29484825-rkvnc" Jan 22 13:46:53 crc kubenswrapper[4743]: I0122 13:46:53.968998 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d67e22a2-dc2b-4582-bfe0-7afff25995fb-config-volume\") pod \"d67e22a2-dc2b-4582-bfe0-7afff25995fb\" (UID: \"d67e22a2-dc2b-4582-bfe0-7afff25995fb\") " Jan 22 13:46:53 crc kubenswrapper[4743]: I0122 13:46:53.969352 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wgzt8\" (UniqueName: \"kubernetes.io/projected/d67e22a2-dc2b-4582-bfe0-7afff25995fb-kube-api-access-wgzt8\") pod \"d67e22a2-dc2b-4582-bfe0-7afff25995fb\" (UID: \"d67e22a2-dc2b-4582-bfe0-7afff25995fb\") " Jan 22 13:46:53 crc kubenswrapper[4743]: I0122 13:46:53.969413 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d67e22a2-dc2b-4582-bfe0-7afff25995fb-secret-volume\") pod \"d67e22a2-dc2b-4582-bfe0-7afff25995fb\" (UID: \"d67e22a2-dc2b-4582-bfe0-7afff25995fb\") " Jan 22 13:46:53 crc kubenswrapper[4743]: I0122 13:46:53.969582 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 13:46:53 crc kubenswrapper[4743]: I0122 13:46:53.969637 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 13:46:53 crc kubenswrapper[4743]: I0122 13:46:53.969859 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d67e22a2-dc2b-4582-bfe0-7afff25995fb-config-volume" (OuterVolumeSpecName: "config-volume") pod "d67e22a2-dc2b-4582-bfe0-7afff25995fb" (UID: "d67e22a2-dc2b-4582-bfe0-7afff25995fb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:46:53 crc kubenswrapper[4743]: I0122 13:46:53.976142 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 13:46:53 crc kubenswrapper[4743]: I0122 13:46:53.976557 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d67e22a2-dc2b-4582-bfe0-7afff25995fb-kube-api-access-wgzt8" (OuterVolumeSpecName: "kube-api-access-wgzt8") pod "d67e22a2-dc2b-4582-bfe0-7afff25995fb" (UID: "d67e22a2-dc2b-4582-bfe0-7afff25995fb"). InnerVolumeSpecName "kube-api-access-wgzt8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:46:53 crc kubenswrapper[4743]: I0122 13:46:53.984121 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 13:46:53 crc kubenswrapper[4743]: I0122 13:46:53.993319 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d67e22a2-dc2b-4582-bfe0-7afff25995fb-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d67e22a2-dc2b-4582-bfe0-7afff25995fb" (UID: "d67e22a2-dc2b-4582-bfe0-7afff25995fb"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:46:54 crc kubenswrapper[4743]: I0122 13:46:54.074331 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 13:46:54 crc kubenswrapper[4743]: I0122 13:46:54.074415 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 13:46:54 crc kubenswrapper[4743]: I0122 13:46:54.074527 4743 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d67e22a2-dc2b-4582-bfe0-7afff25995fb-config-volume\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:54 crc kubenswrapper[4743]: I0122 13:46:54.074539 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wgzt8\" (UniqueName: \"kubernetes.io/projected/d67e22a2-dc2b-4582-bfe0-7afff25995fb-kube-api-access-wgzt8\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:54 crc kubenswrapper[4743]: I0122 13:46:54.074548 4743 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d67e22a2-dc2b-4582-bfe0-7afff25995fb-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:54 crc kubenswrapper[4743]: I0122 13:46:54.075501 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 13:46:54 crc kubenswrapper[4743]: I0122 13:46:54.087149 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 13:46:54 crc kubenswrapper[4743]: I0122 13:46:54.087591 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 13:46:54 crc kubenswrapper[4743]: I0122 13:46:54.101146 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 13:46:54 crc kubenswrapper[4743]: I0122 13:46:54.116040 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 13:46:54 crc kubenswrapper[4743]: I0122 13:46:54.214118 4743 generic.go:334] "Generic (PLEG): container finished" podID="c0e6768a-f15d-4daf-9e12-950e7d3b9552" containerID="660a2d34c0d590911d2758ee8e0940bba8a334386e3c5bf766958dc9c7f6a6d8" exitCode=0 Jan 22 13:46:54 crc kubenswrapper[4743]: I0122 13:46:54.214271 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fqtsl" event={"ID":"c0e6768a-f15d-4daf-9e12-950e7d3b9552","Type":"ContainerDied","Data":"660a2d34c0d590911d2758ee8e0940bba8a334386e3c5bf766958dc9c7f6a6d8"} Jan 22 13:46:54 crc kubenswrapper[4743]: I0122 13:46:54.219622 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"d048d869-32a2-451a-8623-278778ae772d","Type":"ContainerStarted","Data":"c00d720bfbce6acfde9250b256dd331f1f704470ab0aceff425894143ad603c0"} Jan 22 13:46:54 crc kubenswrapper[4743]: I0122 13:46:54.262705 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.262683799 podStartE2EDuration="2.262683799s" podCreationTimestamp="2026-01-22 13:46:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:46:54.26200042 +0000 UTC m=+50.817043583" watchObservedRunningTime="2026-01-22 13:46:54.262683799 +0000 UTC m=+50.817726962" Jan 22 13:46:54 crc kubenswrapper[4743]: I0122 13:46:54.268859 4743 generic.go:334] "Generic (PLEG): container finished" podID="fb2a0da8-a3d8-4f70-a0a8-d2f1150c40ce" containerID="eb4b0aa688975b1f7c3b6b0fa23e7be4c6498faf0e66dc85c6b393dee7afbaf5" exitCode=0 Jan 22 13:46:54 crc kubenswrapper[4743]: I0122 13:46:54.268943 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7bp9h" event={"ID":"fb2a0da8-a3d8-4f70-a0a8-d2f1150c40ce","Type":"ContainerDied","Data":"eb4b0aa688975b1f7c3b6b0fa23e7be4c6498faf0e66dc85c6b393dee7afbaf5"} Jan 22 13:46:54 crc kubenswrapper[4743]: I0122 13:46:54.268978 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7bp9h" event={"ID":"fb2a0da8-a3d8-4f70-a0a8-d2f1150c40ce","Type":"ContainerStarted","Data":"85fd1a6a478c59defe07b39c2f08a24b3643f5f8a46d83cc45995b7edbc3e6c0"} Jan 22 13:46:54 crc kubenswrapper[4743]: I0122 13:46:54.313976 4743 patch_prober.go:28] interesting pod/router-default-5444994796-s9lg4 container/router namespace/openshift-ingress: Startup probe 
status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 22 13:46:54 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld Jan 22 13:46:54 crc kubenswrapper[4743]: [+]process-running ok Jan 22 13:46:54 crc kubenswrapper[4743]: healthz check failed Jan 22 13:46:54 crc kubenswrapper[4743]: I0122 13:46:54.314026 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-s9lg4" podUID="2f6504ab-14a9-4c81-b8ae-556b648168db" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 22 13:46:54 crc kubenswrapper[4743]: I0122 13:46:54.314372 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29484825-rkvnc" event={"ID":"d67e22a2-dc2b-4582-bfe0-7afff25995fb","Type":"ContainerDied","Data":"ec4d3ee5aca7166cea707957566de00402ac840b105c920688d0a317d88ce663"} Jan 22 13:46:54 crc kubenswrapper[4743]: I0122 13:46:54.314396 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec4d3ee5aca7166cea707957566de00402ac840b105c920688d0a317d88ce663" Jan 22 13:46:54 crc kubenswrapper[4743]: I0122 13:46:54.314449 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29484825-rkvnc" Jan 22 13:46:54 crc kubenswrapper[4743]: W0122 13:46:54.589448 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-9dbc35692840d43d34ac22abae22c3cd0350afc656c8d85f25b694a3fee97f94 WatchSource:0}: Error finding container 9dbc35692840d43d34ac22abae22c3cd0350afc656c8d85f25b694a3fee97f94: Status 404 returned error can't find the container with id 9dbc35692840d43d34ac22abae22c3cd0350afc656c8d85f25b694a3fee97f94 Jan 22 13:46:54 crc kubenswrapper[4743]: W0122 13:46:54.770309 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-8c2cf3ce9eb82d424d97dd64d4b24215e9fcfe314d4ae5a1be51111351d7e8d0 WatchSource:0}: Error finding container 8c2cf3ce9eb82d424d97dd64d4b24215e9fcfe314d4ae5a1be51111351d7e8d0: Status 404 returned error can't find the container with id 8c2cf3ce9eb82d424d97dd64d4b24215e9fcfe314d4ae5a1be51111351d7e8d0 Jan 22 13:46:54 crc kubenswrapper[4743]: W0122 13:46:54.813278 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-ee2076eeb61cef21c8a32b7fd221e5db3248b903f4ef9bd61efbc3647a61f964 WatchSource:0}: Error finding container ee2076eeb61cef21c8a32b7fd221e5db3248b903f4ef9bd61efbc3647a61f964: Status 404 returned error can't find the container with id ee2076eeb61cef21c8a32b7fd221e5db3248b903f4ef9bd61efbc3647a61f964 Jan 22 13:46:54 crc kubenswrapper[4743]: I0122 13:46:54.986220 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 22 13:46:54 crc kubenswrapper[4743]: E0122 13:46:54.986505 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d67e22a2-dc2b-4582-bfe0-7afff25995fb" containerName="collect-profiles" Jan 22 13:46:54 crc kubenswrapper[4743]: I0122 13:46:54.986518 4743 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d67e22a2-dc2b-4582-bfe0-7afff25995fb" containerName="collect-profiles" Jan 22 13:46:54 crc kubenswrapper[4743]: I0122 13:46:54.986646 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d67e22a2-dc2b-4582-bfe0-7afff25995fb" containerName="collect-profiles" Jan 22 13:46:54 crc kubenswrapper[4743]: I0122 13:46:54.991115 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 22 13:46:54 crc kubenswrapper[4743]: I0122 13:46:54.996230 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 22 13:46:55 crc kubenswrapper[4743]: I0122 13:46:55.000177 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 22 13:46:55 crc kubenswrapper[4743]: I0122 13:46:55.013544 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 22 13:46:55 crc kubenswrapper[4743]: I0122 13:46:55.092694 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/74708d66-f579-43dc-a6b8-7fa3ef003880-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"74708d66-f579-43dc-a6b8-7fa3ef003880\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 22 13:46:55 crc kubenswrapper[4743]: I0122 13:46:55.092752 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/74708d66-f579-43dc-a6b8-7fa3ef003880-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"74708d66-f579-43dc-a6b8-7fa3ef003880\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 22 13:46:55 crc kubenswrapper[4743]: I0122 13:46:55.194452 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/74708d66-f579-43dc-a6b8-7fa3ef003880-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"74708d66-f579-43dc-a6b8-7fa3ef003880\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 22 13:46:55 crc kubenswrapper[4743]: I0122 13:46:55.194532 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/74708d66-f579-43dc-a6b8-7fa3ef003880-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"74708d66-f579-43dc-a6b8-7fa3ef003880\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 22 13:46:55 crc kubenswrapper[4743]: I0122 13:46:55.194599 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/74708d66-f579-43dc-a6b8-7fa3ef003880-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"74708d66-f579-43dc-a6b8-7fa3ef003880\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 22 13:46:55 crc kubenswrapper[4743]: I0122 13:46:55.227641 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/74708d66-f579-43dc-a6b8-7fa3ef003880-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"74708d66-f579-43dc-a6b8-7fa3ef003880\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 22 13:46:55 crc kubenswrapper[4743]: I0122 13:46:55.310204 4743 patch_prober.go:28] interesting pod/router-default-5444994796-s9lg4 container/router namespace/openshift-ingress: Startup 
probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 22 13:46:55 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld Jan 22 13:46:55 crc kubenswrapper[4743]: [+]process-running ok Jan 22 13:46:55 crc kubenswrapper[4743]: healthz check failed Jan 22 13:46:55 crc kubenswrapper[4743]: I0122 13:46:55.310302 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-s9lg4" podUID="2f6504ab-14a9-4c81-b8ae-556b648168db" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 22 13:46:55 crc kubenswrapper[4743]: I0122 13:46:55.339505 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 22 13:46:55 crc kubenswrapper[4743]: I0122 13:46:55.341333 4743 generic.go:334] "Generic (PLEG): container finished" podID="d048d869-32a2-451a-8623-278778ae772d" containerID="c00d720bfbce6acfde9250b256dd331f1f704470ab0aceff425894143ad603c0" exitCode=0 Jan 22 13:46:55 crc kubenswrapper[4743]: I0122 13:46:55.341488 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"d048d869-32a2-451a-8623-278778ae772d","Type":"ContainerDied","Data":"c00d720bfbce6acfde9250b256dd331f1f704470ab0aceff425894143ad603c0"} Jan 22 13:46:55 crc kubenswrapper[4743]: I0122 13:46:55.348828 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"01a527bec8e79214574835b900f1bbaa0f6646b84ee9a019fd5a725c08c2cb5e"} Jan 22 13:46:55 crc kubenswrapper[4743]: I0122 13:46:55.348881 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"8c2cf3ce9eb82d424d97dd64d4b24215e9fcfe314d4ae5a1be51111351d7e8d0"} Jan 22 13:46:55 crc kubenswrapper[4743]: I0122 13:46:55.353076 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"9ec799768f84771135a9e65359d00470c53d079ef194c55274091f91d29416dd"} Jan 22 13:46:55 crc kubenswrapper[4743]: I0122 13:46:55.353184 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"ee2076eeb61cef21c8a32b7fd221e5db3248b903f4ef9bd61efbc3647a61f964"} Jan 22 13:46:55 crc kubenswrapper[4743]: I0122 13:46:55.360817 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"4e95061cf9c9ea7c87f8b8e0c2428301e29bdb38150cdefa00aa5b555c663ea3"} Jan 22 13:46:55 crc kubenswrapper[4743]: I0122 13:46:55.360869 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"9dbc35692840d43d34ac22abae22c3cd0350afc656c8d85f25b694a3fee97f94"} Jan 22 13:46:55 crc kubenswrapper[4743]: I0122 13:46:55.361245 4743 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 13:46:56 crc kubenswrapper[4743]: I0122 13:46:56.108412 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 22 13:46:56 crc kubenswrapper[4743]: W0122 13:46:56.142879 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod74708d66_f579_43dc_a6b8_7fa3ef003880.slice/crio-6014ad0890acb38d97b67d47ffcfa97dbe85419b60b52896a5f8d81c97d78e1a WatchSource:0}: Error finding container 6014ad0890acb38d97b67d47ffcfa97dbe85419b60b52896a5f8d81c97d78e1a: Status 404 returned error can't find the container with id 6014ad0890acb38d97b67d47ffcfa97dbe85419b60b52896a5f8d81c97d78e1a Jan 22 13:46:56 crc kubenswrapper[4743]: I0122 13:46:56.311839 4743 patch_prober.go:28] interesting pod/router-default-5444994796-s9lg4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 22 13:46:56 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld Jan 22 13:46:56 crc kubenswrapper[4743]: [+]process-running ok Jan 22 13:46:56 crc kubenswrapper[4743]: healthz check failed Jan 22 13:46:56 crc kubenswrapper[4743]: I0122 13:46:56.311902 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-s9lg4" podUID="2f6504ab-14a9-4c81-b8ae-556b648168db" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 22 13:46:56 crc kubenswrapper[4743]: I0122 13:46:56.388203 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"74708d66-f579-43dc-a6b8-7fa3ef003880","Type":"ContainerStarted","Data":"6014ad0890acb38d97b67d47ffcfa97dbe85419b60b52896a5f8d81c97d78e1a"} Jan 22 13:46:56 crc kubenswrapper[4743]: I0122 13:46:56.848899 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 22 13:46:56 crc kubenswrapper[4743]: I0122 13:46:56.957585 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d048d869-32a2-451a-8623-278778ae772d-kubelet-dir\") pod \"d048d869-32a2-451a-8623-278778ae772d\" (UID: \"d048d869-32a2-451a-8623-278778ae772d\") " Jan 22 13:46:56 crc kubenswrapper[4743]: I0122 13:46:56.957685 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d048d869-32a2-451a-8623-278778ae772d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d048d869-32a2-451a-8623-278778ae772d" (UID: "d048d869-32a2-451a-8623-278778ae772d"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 13:46:56 crc kubenswrapper[4743]: I0122 13:46:56.957747 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d048d869-32a2-451a-8623-278778ae772d-kube-api-access\") pod \"d048d869-32a2-451a-8623-278778ae772d\" (UID: \"d048d869-32a2-451a-8623-278778ae772d\") " Jan 22 13:46:56 crc kubenswrapper[4743]: I0122 13:46:56.958133 4743 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d048d869-32a2-451a-8623-278778ae772d-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:56 crc kubenswrapper[4743]: I0122 13:46:56.969022 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d048d869-32a2-451a-8623-278778ae772d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d048d869-32a2-451a-8623-278778ae772d" (UID: "d048d869-32a2-451a-8623-278778ae772d"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:46:57 crc kubenswrapper[4743]: I0122 13:46:57.060021 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d048d869-32a2-451a-8623-278778ae772d-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 22 13:46:57 crc kubenswrapper[4743]: I0122 13:46:57.310762 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-s9lg4" Jan 22 13:46:57 crc kubenswrapper[4743]: I0122 13:46:57.313844 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-s9lg4" Jan 22 13:46:57 crc kubenswrapper[4743]: I0122 13:46:57.449471 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 22 13:46:57 crc kubenswrapper[4743]: I0122 13:46:57.450256 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"d048d869-32a2-451a-8623-278778ae772d","Type":"ContainerDied","Data":"997e5e29106c129ec48f52da5877b77ca78e5032371e6ae5e8852575293fe2dd"} Jan 22 13:46:57 crc kubenswrapper[4743]: I0122 13:46:57.450293 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="997e5e29106c129ec48f52da5877b77ca78e5032371e6ae5e8852575293fe2dd" Jan 22 13:46:58 crc kubenswrapper[4743]: I0122 13:46:58.442439 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 22 13:46:58 crc kubenswrapper[4743]: I0122 13:46:58.462346 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-x7rm8" Jan 22 13:46:58 crc kubenswrapper[4743]: I0122 13:46:58.470330 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"74708d66-f579-43dc-a6b8-7fa3ef003880","Type":"ContainerStarted","Data":"f58f802ae27aa4b271d7a1119e404592e6cf6589c94b8fef1ffc02ba0552e5ac"} Jan 22 13:46:58 crc kubenswrapper[4743]: I0122 13:46:58.470413 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Jan 22 13:46:59 crc kubenswrapper[4743]: I0122 13:46:59.545811 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=5.545775548 podStartE2EDuration="5.545775548s" podCreationTimestamp="2026-01-22 13:46:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:46:59.544292157 +0000 UTC m=+56.099335320" watchObservedRunningTime="2026-01-22 13:46:59.545775548 +0000 UTC m=+56.100818711" Jan 22 13:46:59 crc kubenswrapper[4743]: I0122 13:46:59.548634 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=1.548626066 podStartE2EDuration="1.548626066s" podCreationTimestamp="2026-01-22 13:46:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:46:59.497302029 +0000 UTC m=+56.052345202" watchObservedRunningTime="2026-01-22 13:46:59.548626066 +0000 UTC m=+56.103669229" Jan 22 13:47:01 crc kubenswrapper[4743]: I0122 13:47:01.493900 4743 generic.go:334] "Generic (PLEG): container finished" podID="74708d66-f579-43dc-a6b8-7fa3ef003880" containerID="f58f802ae27aa4b271d7a1119e404592e6cf6589c94b8fef1ffc02ba0552e5ac" exitCode=0 Jan 22 13:47:01 crc kubenswrapper[4743]: I0122 13:47:01.493953 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"74708d66-f579-43dc-a6b8-7fa3ef003880","Type":"ContainerDied","Data":"f58f802ae27aa4b271d7a1119e404592e6cf6589c94b8fef1ffc02ba0552e5ac"} Jan 22 13:47:02 crc kubenswrapper[4743]: I0122 13:47:02.674469 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-ksn26" Jan 22 13:47:03 crc kubenswrapper[4743]: I0122 13:47:03.117661 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-console/console-f9d7485db-ln28w" Jan 22 13:47:03 crc kubenswrapper[4743]: I0122 13:47:03.121971 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-ln28w" Jan 22 13:47:03 crc kubenswrapper[4743]: E0122 13:47:03.488907 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="17902818d3cf20cb5b497c38f31f012e99c1e26d96835ba9f573fcbde313d363" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 22 13:47:03 crc kubenswrapper[4743]: E0122 13:47:03.492352 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="17902818d3cf20cb5b497c38f31f012e99c1e26d96835ba9f573fcbde313d363" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 22 13:47:03 crc kubenswrapper[4743]: E0122 13:47:03.493899 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="17902818d3cf20cb5b497c38f31f012e99c1e26d96835ba9f573fcbde313d363" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 22 13:47:03 crc kubenswrapper[4743]: E0122 13:47:03.493941 4743 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-xrxrn" podUID="2887aee0-19a7-439f-8f40-ef40970ab796" containerName="kube-multus-additional-cni-plugins" Jan 22 13:47:06 crc kubenswrapper[4743]: I0122 13:47:06.280218 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 22 13:47:06 crc kubenswrapper[4743]: I0122 13:47:06.352620 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/74708d66-f579-43dc-a6b8-7fa3ef003880-kubelet-dir\") pod \"74708d66-f579-43dc-a6b8-7fa3ef003880\" (UID: \"74708d66-f579-43dc-a6b8-7fa3ef003880\") " Jan 22 13:47:06 crc kubenswrapper[4743]: I0122 13:47:06.352771 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/74708d66-f579-43dc-a6b8-7fa3ef003880-kube-api-access\") pod \"74708d66-f579-43dc-a6b8-7fa3ef003880\" (UID: \"74708d66-f579-43dc-a6b8-7fa3ef003880\") " Jan 22 13:47:06 crc kubenswrapper[4743]: I0122 13:47:06.352760 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/74708d66-f579-43dc-a6b8-7fa3ef003880-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "74708d66-f579-43dc-a6b8-7fa3ef003880" (UID: "74708d66-f579-43dc-a6b8-7fa3ef003880"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 13:47:06 crc kubenswrapper[4743]: I0122 13:47:06.353066 4743 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/74708d66-f579-43dc-a6b8-7fa3ef003880-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 22 13:47:06 crc kubenswrapper[4743]: I0122 13:47:06.362375 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74708d66-f579-43dc-a6b8-7fa3ef003880-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "74708d66-f579-43dc-a6b8-7fa3ef003880" (UID: "74708d66-f579-43dc-a6b8-7fa3ef003880"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:47:06 crc kubenswrapper[4743]: I0122 13:47:06.454974 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/74708d66-f579-43dc-a6b8-7fa3ef003880-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 22 13:47:06 crc kubenswrapper[4743]: I0122 13:47:06.524015 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"74708d66-f579-43dc-a6b8-7fa3ef003880","Type":"ContainerDied","Data":"6014ad0890acb38d97b67d47ffcfa97dbe85419b60b52896a5f8d81c97d78e1a"} Jan 22 13:47:06 crc kubenswrapper[4743]: I0122 13:47:06.524064 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6014ad0890acb38d97b67d47ffcfa97dbe85419b60b52896a5f8d81c97d78e1a" Jan 22 13:47:06 crc kubenswrapper[4743]: I0122 13:47:06.524083 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 22 13:47:10 crc kubenswrapper[4743]: I0122 13:47:10.784321 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:47:13 crc kubenswrapper[4743]: E0122 13:47:13.487705 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="17902818d3cf20cb5b497c38f31f012e99c1e26d96835ba9f573fcbde313d363" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 22 13:47:13 crc kubenswrapper[4743]: E0122 13:47:13.489426 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="17902818d3cf20cb5b497c38f31f012e99c1e26d96835ba9f573fcbde313d363" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 22 13:47:13 crc kubenswrapper[4743]: E0122 13:47:13.490853 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="17902818d3cf20cb5b497c38f31f012e99c1e26d96835ba9f573fcbde313d363" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 22 13:47:13 crc kubenswrapper[4743]: E0122 13:47:13.490923 4743 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-xrxrn" podUID="2887aee0-19a7-439f-8f40-ef40970ab796" containerName="kube-multus-additional-cni-plugins" Jan 22 13:47:22 
crc kubenswrapper[4743]: I0122 13:47:22.767409 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Jan 22 13:47:23 crc kubenswrapper[4743]: I0122 13:47:23.147697 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-k57vs" Jan 22 13:47:23 crc kubenswrapper[4743]: I0122 13:47:23.243548 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=1.243532619 podStartE2EDuration="1.243532619s" podCreationTimestamp="2026-01-22 13:47:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:47:23.241942485 +0000 UTC m=+79.796985648" watchObservedRunningTime="2026-01-22 13:47:23.243532619 +0000 UTC m=+79.798575792" Jan 22 13:47:23 crc kubenswrapper[4743]: E0122 13:47:23.486312 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 17902818d3cf20cb5b497c38f31f012e99c1e26d96835ba9f573fcbde313d363 is running failed: container process not found" containerID="17902818d3cf20cb5b497c38f31f012e99c1e26d96835ba9f573fcbde313d363" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 22 13:47:23 crc kubenswrapper[4743]: E0122 13:47:23.486845 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 17902818d3cf20cb5b497c38f31f012e99c1e26d96835ba9f573fcbde313d363 is running failed: container process not found" containerID="17902818d3cf20cb5b497c38f31f012e99c1e26d96835ba9f573fcbde313d363" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 22 13:47:23 crc kubenswrapper[4743]: E0122 13:47:23.487353 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 17902818d3cf20cb5b497c38f31f012e99c1e26d96835ba9f573fcbde313d363 is running failed: container process not found" containerID="17902818d3cf20cb5b497c38f31f012e99c1e26d96835ba9f573fcbde313d363" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 22 13:47:23 crc kubenswrapper[4743]: E0122 13:47:23.487380 4743 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 17902818d3cf20cb5b497c38f31f012e99c1e26d96835ba9f573fcbde313d363 is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-xrxrn" podUID="2887aee0-19a7-439f-8f40-ef40970ab796" containerName="kube-multus-additional-cni-plugins" Jan 22 13:47:23 crc kubenswrapper[4743]: I0122 13:47:23.625778 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-xrxrn_2887aee0-19a7-439f-8f40-ef40970ab796/kube-multus-additional-cni-plugins/0.log" Jan 22 13:47:23 crc kubenswrapper[4743]: I0122 13:47:23.625871 4743 generic.go:334] "Generic (PLEG): container finished" podID="2887aee0-19a7-439f-8f40-ef40970ab796" containerID="17902818d3cf20cb5b497c38f31f012e99c1e26d96835ba9f573fcbde313d363" exitCode=137 Jan 22 13:47:23 crc kubenswrapper[4743]: I0122 13:47:23.626007 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-xrxrn" 
event={"ID":"2887aee0-19a7-439f-8f40-ef40970ab796","Type":"ContainerDied","Data":"17902818d3cf20cb5b497c38f31f012e99c1e26d96835ba9f573fcbde313d363"} Jan 22 13:47:24 crc kubenswrapper[4743]: I0122 13:47:24.398017 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 13:47:29 crc kubenswrapper[4743]: E0122 13:47:29.601965 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Jan 22 13:47:29 crc kubenswrapper[4743]: E0122 13:47:29.602433 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2zvzf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-fqtsl_openshift-marketplace(c0e6768a-f15d-4daf-9e12-950e7d3b9552): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 22 13:47:29 crc kubenswrapper[4743]: E0122 13:47:29.604088 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-fqtsl" podUID="c0e6768a-f15d-4daf-9e12-950e7d3b9552" Jan 22 13:47:30 crc kubenswrapper[4743]: I0122 13:47:30.583160 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 22 13:47:30 crc kubenswrapper[4743]: E0122 13:47:30.583418 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d048d869-32a2-451a-8623-278778ae772d" containerName="pruner" Jan 22 13:47:30 crc kubenswrapper[4743]: I0122 13:47:30.583428 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d048d869-32a2-451a-8623-278778ae772d" containerName="pruner" Jan 22 13:47:30 crc kubenswrapper[4743]: 
E0122 13:47:30.583447 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74708d66-f579-43dc-a6b8-7fa3ef003880" containerName="pruner" Jan 22 13:47:30 crc kubenswrapper[4743]: I0122 13:47:30.583454 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="74708d66-f579-43dc-a6b8-7fa3ef003880" containerName="pruner" Jan 22 13:47:30 crc kubenswrapper[4743]: I0122 13:47:30.583545 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d048d869-32a2-451a-8623-278778ae772d" containerName="pruner" Jan 22 13:47:30 crc kubenswrapper[4743]: I0122 13:47:30.583564 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="74708d66-f579-43dc-a6b8-7fa3ef003880" containerName="pruner" Jan 22 13:47:30 crc kubenswrapper[4743]: I0122 13:47:30.583986 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 22 13:47:30 crc kubenswrapper[4743]: I0122 13:47:30.586408 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 22 13:47:30 crc kubenswrapper[4743]: I0122 13:47:30.590187 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 22 13:47:30 crc kubenswrapper[4743]: I0122 13:47:30.591961 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 22 13:47:30 crc kubenswrapper[4743]: I0122 13:47:30.667301 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f44aeedd-10f9-4c57-88e4-50dbaf2794de-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"f44aeedd-10f9-4c57-88e4-50dbaf2794de\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 22 13:47:30 crc kubenswrapper[4743]: I0122 13:47:30.667397 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f44aeedd-10f9-4c57-88e4-50dbaf2794de-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"f44aeedd-10f9-4c57-88e4-50dbaf2794de\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 22 13:47:30 crc kubenswrapper[4743]: I0122 13:47:30.768483 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f44aeedd-10f9-4c57-88e4-50dbaf2794de-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"f44aeedd-10f9-4c57-88e4-50dbaf2794de\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 22 13:47:30 crc kubenswrapper[4743]: I0122 13:47:30.768590 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f44aeedd-10f9-4c57-88e4-50dbaf2794de-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"f44aeedd-10f9-4c57-88e4-50dbaf2794de\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 22 13:47:30 crc kubenswrapper[4743]: I0122 13:47:30.768690 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f44aeedd-10f9-4c57-88e4-50dbaf2794de-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"f44aeedd-10f9-4c57-88e4-50dbaf2794de\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 22 13:47:30 crc kubenswrapper[4743]: I0122 13:47:30.795381 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f44aeedd-10f9-4c57-88e4-50dbaf2794de-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"f44aeedd-10f9-4c57-88e4-50dbaf2794de\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 22 13:47:30 crc kubenswrapper[4743]: I0122 13:47:30.904644 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 22 13:47:33 crc kubenswrapper[4743]: E0122 13:47:33.511179 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 17902818d3cf20cb5b497c38f31f012e99c1e26d96835ba9f573fcbde313d363 is running failed: container process not found" containerID="17902818d3cf20cb5b497c38f31f012e99c1e26d96835ba9f573fcbde313d363" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 22 13:47:33 crc kubenswrapper[4743]: E0122 13:47:33.511567 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 17902818d3cf20cb5b497c38f31f012e99c1e26d96835ba9f573fcbde313d363 is running failed: container process not found" containerID="17902818d3cf20cb5b497c38f31f012e99c1e26d96835ba9f573fcbde313d363" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 22 13:47:33 crc kubenswrapper[4743]: E0122 13:47:33.511950 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 17902818d3cf20cb5b497c38f31f012e99c1e26d96835ba9f573fcbde313d363 is running failed: container process not found" containerID="17902818d3cf20cb5b497c38f31f012e99c1e26d96835ba9f573fcbde313d363" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 22 13:47:33 crc kubenswrapper[4743]: E0122 13:47:33.511982 4743 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 17902818d3cf20cb5b497c38f31f012e99c1e26d96835ba9f573fcbde313d363 is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-xrxrn" podUID="2887aee0-19a7-439f-8f40-ef40970ab796" containerName="kube-multus-additional-cni-plugins" Jan 22 13:47:36 crc kubenswrapper[4743]: I0122 13:47:36.167117 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 22 13:47:36 crc kubenswrapper[4743]: I0122 13:47:36.168157 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 22 13:47:36 crc kubenswrapper[4743]: I0122 13:47:36.184049 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 22 13:47:36 crc kubenswrapper[4743]: I0122 13:47:36.345663 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/258cd1b9-f4a3-482d-bc94-25cc44b757dc-var-lock\") pod \"installer-9-crc\" (UID: \"258cd1b9-f4a3-482d-bc94-25cc44b757dc\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 22 13:47:36 crc kubenswrapper[4743]: I0122 13:47:36.345722 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/258cd1b9-f4a3-482d-bc94-25cc44b757dc-kubelet-dir\") pod \"installer-9-crc\" (UID: \"258cd1b9-f4a3-482d-bc94-25cc44b757dc\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 22 13:47:36 crc kubenswrapper[4743]: I0122 13:47:36.345751 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/258cd1b9-f4a3-482d-bc94-25cc44b757dc-kube-api-access\") pod \"installer-9-crc\" (UID: \"258cd1b9-f4a3-482d-bc94-25cc44b757dc\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 22 13:47:36 crc kubenswrapper[4743]: I0122 13:47:36.447333 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/258cd1b9-f4a3-482d-bc94-25cc44b757dc-var-lock\") pod \"installer-9-crc\" (UID: \"258cd1b9-f4a3-482d-bc94-25cc44b757dc\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 22 13:47:36 crc kubenswrapper[4743]: I0122 13:47:36.447420 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/258cd1b9-f4a3-482d-bc94-25cc44b757dc-kubelet-dir\") pod \"installer-9-crc\" (UID: \"258cd1b9-f4a3-482d-bc94-25cc44b757dc\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 22 13:47:36 crc kubenswrapper[4743]: I0122 13:47:36.447520 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/258cd1b9-f4a3-482d-bc94-25cc44b757dc-var-lock\") pod \"installer-9-crc\" (UID: \"258cd1b9-f4a3-482d-bc94-25cc44b757dc\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 22 13:47:36 crc kubenswrapper[4743]: I0122 13:47:36.447583 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/258cd1b9-f4a3-482d-bc94-25cc44b757dc-kubelet-dir\") pod \"installer-9-crc\" (UID: \"258cd1b9-f4a3-482d-bc94-25cc44b757dc\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 22 13:47:36 crc kubenswrapper[4743]: I0122 13:47:36.447659 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/258cd1b9-f4a3-482d-bc94-25cc44b757dc-kube-api-access\") pod \"installer-9-crc\" (UID: \"258cd1b9-f4a3-482d-bc94-25cc44b757dc\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 22 13:47:36 crc kubenswrapper[4743]: I0122 13:47:36.469181 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/258cd1b9-f4a3-482d-bc94-25cc44b757dc-kube-api-access\") pod \"installer-9-crc\" (UID: 
\"258cd1b9-f4a3-482d-bc94-25cc44b757dc\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 22 13:47:36 crc kubenswrapper[4743]: I0122 13:47:36.492686 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 22 13:47:38 crc kubenswrapper[4743]: E0122 13:47:38.851193 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-fqtsl" podUID="c0e6768a-f15d-4daf-9e12-950e7d3b9552" Jan 22 13:47:38 crc kubenswrapper[4743]: E0122 13:47:38.926323 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 22 13:47:38 crc kubenswrapper[4743]: E0122 13:47:38.927119 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x92k2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-bpk8r_openshift-marketplace(a660baca-6ead-4a0f-959b-24b3badc4a7c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 22 13:47:38 crc kubenswrapper[4743]: E0122 13:47:38.928320 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-bpk8r" podUID="a660baca-6ead-4a0f-959b-24b3badc4a7c" Jan 22 13:47:41 crc kubenswrapper[4743]: E0122 13:47:41.573623 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-bpk8r" podUID="a660baca-6ead-4a0f-959b-24b3badc4a7c" Jan 22 13:47:42 crc kubenswrapper[4743]: E0122 13:47:42.566268 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 22 13:47:42 crc kubenswrapper[4743]: E0122 13:47:42.566716 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8x2kq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-nmxmh_openshift-marketplace(128ee6cd-afe2-4c25-967b-9865fbb0ff88): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 22 13:47:42 crc kubenswrapper[4743]: E0122 13:47:42.568962 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-nmxmh" podUID="128ee6cd-afe2-4c25-967b-9865fbb0ff88" Jan 22 13:47:42 crc kubenswrapper[4743]: E0122 13:47:42.738050 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Jan 22 13:47:42 crc kubenswrapper[4743]: E0122 13:47:42.739052 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ns8zp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-7bp9h_openshift-marketplace(fb2a0da8-a3d8-4f70-a0a8-d2f1150c40ce): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 22 13:47:42 crc kubenswrapper[4743]: E0122 13:47:42.740732 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-7bp9h" podUID="fb2a0da8-a3d8-4f70-a0a8-d2f1150c40ce" Jan 22 13:47:43 crc kubenswrapper[4743]: E0122 13:47:43.485589 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 17902818d3cf20cb5b497c38f31f012e99c1e26d96835ba9f573fcbde313d363 is running failed: container process not found" containerID="17902818d3cf20cb5b497c38f31f012e99c1e26d96835ba9f573fcbde313d363" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 22 13:47:43 crc kubenswrapper[4743]: E0122 13:47:43.486586 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 17902818d3cf20cb5b497c38f31f012e99c1e26d96835ba9f573fcbde313d363 is running failed: container process not found" containerID="17902818d3cf20cb5b497c38f31f012e99c1e26d96835ba9f573fcbde313d363" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 22 13:47:43 crc kubenswrapper[4743]: E0122 13:47:43.487186 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 17902818d3cf20cb5b497c38f31f012e99c1e26d96835ba9f573fcbde313d363 is running failed: container process not found" containerID="17902818d3cf20cb5b497c38f31f012e99c1e26d96835ba9f573fcbde313d363" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 22 13:47:43 crc kubenswrapper[4743]: E0122 13:47:43.487323 4743 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
17902818d3cf20cb5b497c38f31f012e99c1e26d96835ba9f573fcbde313d363 is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-xrxrn" podUID="2887aee0-19a7-439f-8f40-ef40970ab796" containerName="kube-multus-additional-cni-plugins" Jan 22 13:47:43 crc kubenswrapper[4743]: E0122 13:47:43.897963 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-nmxmh" podUID="128ee6cd-afe2-4c25-967b-9865fbb0ff88" Jan 22 13:47:43 crc kubenswrapper[4743]: E0122 13:47:43.898028 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-7bp9h" podUID="fb2a0da8-a3d8-4f70-a0a8-d2f1150c40ce" Jan 22 13:47:43 crc kubenswrapper[4743]: I0122 13:47:43.968334 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-xrxrn_2887aee0-19a7-439f-8f40-ef40970ab796/kube-multus-additional-cni-plugins/0.log" Jan 22 13:47:43 crc kubenswrapper[4743]: I0122 13:47:43.968614 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-xrxrn" Jan 22 13:47:43 crc kubenswrapper[4743]: E0122 13:47:43.975306 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 22 13:47:43 crc kubenswrapper[4743]: E0122 13:47:43.975500 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hqvnc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-fgj9j_openshift-marketplace(d0eaca2f-a489-4a21-9bfc-4a8fc02ccc85): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 22 13:47:43 crc kubenswrapper[4743]: E0122 13:47:43.976939 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-fgj9j" podUID="d0eaca2f-a489-4a21-9bfc-4a8fc02ccc85" Jan 22 13:47:44 crc kubenswrapper[4743]: E0122 13:47:44.005407 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 22 13:47:44 crc kubenswrapper[4743]: E0122 13:47:44.005595 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6lt8d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-7mpfs_openshift-marketplace(785dfb3f-6700-4e67-9ab4-df1d1e86efef): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 22 13:47:44 crc kubenswrapper[4743]: E0122 13:47:44.006982 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-7mpfs" podUID="785dfb3f-6700-4e67-9ab4-df1d1e86efef" Jan 22 13:47:44 crc kubenswrapper[4743]: E0122 13:47:44.026901 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" 
image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 22 13:47:44 crc kubenswrapper[4743]: E0122 13:47:44.027038 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6xzwx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-m8dtj_openshift-marketplace(9b61be23-4db1-4316-a840-1aaff04b664e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 22 13:47:44 crc kubenswrapper[4743]: E0122 13:47:44.028590 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-m8dtj" podUID="9b61be23-4db1-4316-a840-1aaff04b664e" Jan 22 13:47:44 crc kubenswrapper[4743]: I0122 13:47:44.062549 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2887aee0-19a7-439f-8f40-ef40970ab796-tuning-conf-dir\") pod \"2887aee0-19a7-439f-8f40-ef40970ab796\" (UID: \"2887aee0-19a7-439f-8f40-ef40970ab796\") " Jan 22 13:47:44 crc kubenswrapper[4743]: I0122 13:47:44.062636 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxb9p\" (UniqueName: \"kubernetes.io/projected/2887aee0-19a7-439f-8f40-ef40970ab796-kube-api-access-lxb9p\") pod \"2887aee0-19a7-439f-8f40-ef40970ab796\" (UID: \"2887aee0-19a7-439f-8f40-ef40970ab796\") " Jan 22 13:47:44 crc kubenswrapper[4743]: I0122 13:47:44.062670 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2887aee0-19a7-439f-8f40-ef40970ab796-cni-sysctl-allowlist\") pod \"2887aee0-19a7-439f-8f40-ef40970ab796\" (UID: \"2887aee0-19a7-439f-8f40-ef40970ab796\") " Jan 22 13:47:44 crc kubenswrapper[4743]: I0122 13:47:44.062678 4743 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2887aee0-19a7-439f-8f40-ef40970ab796-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "2887aee0-19a7-439f-8f40-ef40970ab796" (UID: "2887aee0-19a7-439f-8f40-ef40970ab796"). InnerVolumeSpecName "tuning-conf-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 13:47:44 crc kubenswrapper[4743]: I0122 13:47:44.062743 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/2887aee0-19a7-439f-8f40-ef40970ab796-ready\") pod \"2887aee0-19a7-439f-8f40-ef40970ab796\" (UID: \"2887aee0-19a7-439f-8f40-ef40970ab796\") " Jan 22 13:47:44 crc kubenswrapper[4743]: I0122 13:47:44.063098 4743 reconciler_common.go:293] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2887aee0-19a7-439f-8f40-ef40970ab796-tuning-conf-dir\") on node \"crc\" DevicePath \"\"" Jan 22 13:47:44 crc kubenswrapper[4743]: I0122 13:47:44.063427 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2887aee0-19a7-439f-8f40-ef40970ab796-ready" (OuterVolumeSpecName: "ready") pod "2887aee0-19a7-439f-8f40-ef40970ab796" (UID: "2887aee0-19a7-439f-8f40-ef40970ab796"). InnerVolumeSpecName "ready". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 13:47:44 crc kubenswrapper[4743]: I0122 13:47:44.063688 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2887aee0-19a7-439f-8f40-ef40970ab796-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "2887aee0-19a7-439f-8f40-ef40970ab796" (UID: "2887aee0-19a7-439f-8f40-ef40970ab796"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:47:44 crc kubenswrapper[4743]: E0122 13:47:44.084869 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 22 13:47:44 crc kubenswrapper[4743]: E0122 13:47:44.085039 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kknd6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-wv724_openshift-marketplace(a76ec049-d99a-40be-9fec-f76370769aea): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 22 13:47:44 crc kubenswrapper[4743]: I0122 13:47:44.085852 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2887aee0-19a7-439f-8f40-ef40970ab796-kube-api-access-lxb9p" (OuterVolumeSpecName: "kube-api-access-lxb9p") pod "2887aee0-19a7-439f-8f40-ef40970ab796" (UID: "2887aee0-19a7-439f-8f40-ef40970ab796"). InnerVolumeSpecName "kube-api-access-lxb9p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:47:44 crc kubenswrapper[4743]: E0122 13:47:44.086871 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-wv724" podUID="a76ec049-d99a-40be-9fec-f76370769aea" Jan 22 13:47:44 crc kubenswrapper[4743]: I0122 13:47:44.146572 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 22 13:47:44 crc kubenswrapper[4743]: W0122 13:47:44.154815 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podf44aeedd_10f9_4c57_88e4_50dbaf2794de.slice/crio-99bfb2563b44ebd8a387774dc19b5b100b27097be50befc342754f1e509652a2 WatchSource:0}: Error finding container 99bfb2563b44ebd8a387774dc19b5b100b27097be50befc342754f1e509652a2: Status 404 returned error can't find the container with id 99bfb2563b44ebd8a387774dc19b5b100b27097be50befc342754f1e509652a2 Jan 22 13:47:44 crc kubenswrapper[4743]: I0122 13:47:44.164211 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxb9p\" (UniqueName: \"kubernetes.io/projected/2887aee0-19a7-439f-8f40-ef40970ab796-kube-api-access-lxb9p\") on node \"crc\" DevicePath \"\"" Jan 22 13:47:44 crc kubenswrapper[4743]: I0122 13:47:44.164246 4743 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2887aee0-19a7-439f-8f40-ef40970ab796-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Jan 22 13:47:44 crc kubenswrapper[4743]: I0122 13:47:44.164255 4743 reconciler_common.go:293] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/2887aee0-19a7-439f-8f40-ef40970ab796-ready\") on node \"crc\" DevicePath \"\"" Jan 22 13:47:44 crc kubenswrapper[4743]: I0122 13:47:44.306511 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 22 13:47:44 crc kubenswrapper[4743]: W0122 13:47:44.325482 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod258cd1b9_f4a3_482d_bc94_25cc44b757dc.slice/crio-ca98fa33d6357c5c9c73d857e8adcc8a3ecde14e09aafed0fab19dccfe789bdc WatchSource:0}: Error finding container ca98fa33d6357c5c9c73d857e8adcc8a3ecde14e09aafed0fab19dccfe789bdc: Status 404 returned error can't find the container with id ca98fa33d6357c5c9c73d857e8adcc8a3ecde14e09aafed0fab19dccfe789bdc Jan 22 13:47:44 crc kubenswrapper[4743]: I0122 13:47:44.747363 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"258cd1b9-f4a3-482d-bc94-25cc44b757dc","Type":"ContainerStarted","Data":"78492532f1274c2907cd62b7dcdb4bca3641c0c5a347b11ffafcc8cb516e33bf"} Jan 22 13:47:44 crc kubenswrapper[4743]: I0122 13:47:44.747408 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"258cd1b9-f4a3-482d-bc94-25cc44b757dc","Type":"ContainerStarted","Data":"ca98fa33d6357c5c9c73d857e8adcc8a3ecde14e09aafed0fab19dccfe789bdc"} Jan 22 13:47:44 crc kubenswrapper[4743]: I0122 13:47:44.749879 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-xrxrn_2887aee0-19a7-439f-8f40-ef40970ab796/kube-multus-additional-cni-plugins/0.log" Jan 22 13:47:44 crc kubenswrapper[4743]: 
I0122 13:47:44.749964 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-xrxrn" event={"ID":"2887aee0-19a7-439f-8f40-ef40970ab796","Type":"ContainerDied","Data":"d193afb91657622c0902df22ada1c7e81579dbb5b5712b862da4205d8a20eb66"} Jan 22 13:47:44 crc kubenswrapper[4743]: I0122 13:47:44.749996 4743 scope.go:117] "RemoveContainer" containerID="17902818d3cf20cb5b497c38f31f012e99c1e26d96835ba9f573fcbde313d363" Jan 22 13:47:44 crc kubenswrapper[4743]: I0122 13:47:44.750021 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-xrxrn" Jan 22 13:47:44 crc kubenswrapper[4743]: I0122 13:47:44.752084 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"f44aeedd-10f9-4c57-88e4-50dbaf2794de","Type":"ContainerStarted","Data":"0313234bb798259cb431411a455b761d93707dc6e1e6b84df5b9c5bf576acd60"} Jan 22 13:47:44 crc kubenswrapper[4743]: I0122 13:47:44.752119 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"f44aeedd-10f9-4c57-88e4-50dbaf2794de","Type":"ContainerStarted","Data":"99bfb2563b44ebd8a387774dc19b5b100b27097be50befc342754f1e509652a2"} Jan 22 13:47:44 crc kubenswrapper[4743]: E0122 13:47:44.753213 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-7mpfs" podUID="785dfb3f-6700-4e67-9ab4-df1d1e86efef" Jan 22 13:47:44 crc kubenswrapper[4743]: E0122 13:47:44.753845 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-wv724" podUID="a76ec049-d99a-40be-9fec-f76370769aea" Jan 22 13:47:44 crc kubenswrapper[4743]: E0122 13:47:44.758735 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-fgj9j" podUID="d0eaca2f-a489-4a21-9bfc-4a8fc02ccc85" Jan 22 13:47:44 crc kubenswrapper[4743]: E0122 13:47:44.758894 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-m8dtj" podUID="9b61be23-4db1-4316-a840-1aaff04b664e" Jan 22 13:47:44 crc kubenswrapper[4743]: I0122 13:47:44.772202 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=8.772185476 podStartE2EDuration="8.772185476s" podCreationTimestamp="2026-01-22 13:47:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:47:44.762668613 +0000 UTC m=+101.317711776" watchObservedRunningTime="2026-01-22 13:47:44.772185476 +0000 UTC m=+101.327228639" Jan 22 13:47:44 crc kubenswrapper[4743]: I0122 13:47:44.835530 4743 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=14.835514146 podStartE2EDuration="14.835514146s" podCreationTimestamp="2026-01-22 13:47:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:47:44.834335003 +0000 UTC m=+101.389378166" watchObservedRunningTime="2026-01-22 13:47:44.835514146 +0000 UTC m=+101.390557309" Jan 22 13:47:44 crc kubenswrapper[4743]: I0122 13:47:44.863678 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-xrxrn"] Jan 22 13:47:44 crc kubenswrapper[4743]: I0122 13:47:44.866734 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-xrxrn"] Jan 22 13:47:45 crc kubenswrapper[4743]: I0122 13:47:45.757750 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2887aee0-19a7-439f-8f40-ef40970ab796" path="/var/lib/kubelet/pods/2887aee0-19a7-439f-8f40-ef40970ab796/volumes" Jan 22 13:47:45 crc kubenswrapper[4743]: I0122 13:47:45.760326 4743 generic.go:334] "Generic (PLEG): container finished" podID="f44aeedd-10f9-4c57-88e4-50dbaf2794de" containerID="0313234bb798259cb431411a455b761d93707dc6e1e6b84df5b9c5bf576acd60" exitCode=0 Jan 22 13:47:45 crc kubenswrapper[4743]: I0122 13:47:45.761896 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"f44aeedd-10f9-4c57-88e4-50dbaf2794de","Type":"ContainerDied","Data":"0313234bb798259cb431411a455b761d93707dc6e1e6b84df5b9c5bf576acd60"} Jan 22 13:47:47 crc kubenswrapper[4743]: I0122 13:47:47.012411 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 22 13:47:47 crc kubenswrapper[4743]: I0122 13:47:47.200805 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f44aeedd-10f9-4c57-88e4-50dbaf2794de-kubelet-dir\") pod \"f44aeedd-10f9-4c57-88e4-50dbaf2794de\" (UID: \"f44aeedd-10f9-4c57-88e4-50dbaf2794de\") " Jan 22 13:47:47 crc kubenswrapper[4743]: I0122 13:47:47.201172 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f44aeedd-10f9-4c57-88e4-50dbaf2794de-kube-api-access\") pod \"f44aeedd-10f9-4c57-88e4-50dbaf2794de\" (UID: \"f44aeedd-10f9-4c57-88e4-50dbaf2794de\") " Jan 22 13:47:47 crc kubenswrapper[4743]: I0122 13:47:47.200956 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f44aeedd-10f9-4c57-88e4-50dbaf2794de-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "f44aeedd-10f9-4c57-88e4-50dbaf2794de" (UID: "f44aeedd-10f9-4c57-88e4-50dbaf2794de"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 13:47:47 crc kubenswrapper[4743]: I0122 13:47:47.202530 4743 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f44aeedd-10f9-4c57-88e4-50dbaf2794de-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 22 13:47:47 crc kubenswrapper[4743]: I0122 13:47:47.207231 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f44aeedd-10f9-4c57-88e4-50dbaf2794de-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "f44aeedd-10f9-4c57-88e4-50dbaf2794de" (UID: "f44aeedd-10f9-4c57-88e4-50dbaf2794de"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:47:47 crc kubenswrapper[4743]: I0122 13:47:47.303753 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f44aeedd-10f9-4c57-88e4-50dbaf2794de-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 22 13:47:47 crc kubenswrapper[4743]: I0122 13:47:47.770854 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"f44aeedd-10f9-4c57-88e4-50dbaf2794de","Type":"ContainerDied","Data":"99bfb2563b44ebd8a387774dc19b5b100b27097be50befc342754f1e509652a2"} Jan 22 13:47:47 crc kubenswrapper[4743]: I0122 13:47:47.770895 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99bfb2563b44ebd8a387774dc19b5b100b27097be50befc342754f1e509652a2" Jan 22 13:47:47 crc kubenswrapper[4743]: I0122 13:47:47.770946 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 22 13:47:55 crc kubenswrapper[4743]: I0122 13:47:55.815590 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fqtsl" event={"ID":"c0e6768a-f15d-4daf-9e12-950e7d3b9552","Type":"ContainerStarted","Data":"daec478d4db742250ac5a25a830440287c1522fa9984f5ce20aa817186173bc7"} Jan 22 13:47:56 crc kubenswrapper[4743]: I0122 13:47:56.828930 4743 generic.go:334] "Generic (PLEG): container finished" podID="785dfb3f-6700-4e67-9ab4-df1d1e86efef" containerID="c3f7940077c0b7f80d200fcfc9f6266d95df3e8163825e472a463d412bf5ed73" exitCode=0 Jan 22 13:47:56 crc kubenswrapper[4743]: I0122 13:47:56.829017 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7mpfs" event={"ID":"785dfb3f-6700-4e67-9ab4-df1d1e86efef","Type":"ContainerDied","Data":"c3f7940077c0b7f80d200fcfc9f6266d95df3e8163825e472a463d412bf5ed73"} Jan 22 13:47:56 crc kubenswrapper[4743]: I0122 13:47:56.835454 4743 generic.go:334] "Generic (PLEG): container finished" podID="d0eaca2f-a489-4a21-9bfc-4a8fc02ccc85" containerID="87d20127d4c6748e46af071455f71334be1d087760d32bde0d68c1aae52abe9b" exitCode=0 Jan 22 13:47:56 crc kubenswrapper[4743]: I0122 13:47:56.835553 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fgj9j" event={"ID":"d0eaca2f-a489-4a21-9bfc-4a8fc02ccc85","Type":"ContainerDied","Data":"87d20127d4c6748e46af071455f71334be1d087760d32bde0d68c1aae52abe9b"} Jan 22 13:47:56 crc kubenswrapper[4743]: I0122 13:47:56.839355 4743 generic.go:334] "Generic (PLEG): container finished" podID="c0e6768a-f15d-4daf-9e12-950e7d3b9552" containerID="daec478d4db742250ac5a25a830440287c1522fa9984f5ce20aa817186173bc7" exitCode=0 Jan 22 13:47:56 crc kubenswrapper[4743]: I0122 
13:47:56.839423 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fqtsl" event={"ID":"c0e6768a-f15d-4daf-9e12-950e7d3b9552","Type":"ContainerDied","Data":"daec478d4db742250ac5a25a830440287c1522fa9984f5ce20aa817186173bc7"} Jan 22 13:47:57 crc kubenswrapper[4743]: I0122 13:47:57.847942 4743 generic.go:334] "Generic (PLEG): container finished" podID="128ee6cd-afe2-4c25-967b-9865fbb0ff88" containerID="15c49693b99d9c753461239ab1a3dd203cbb8b774397eafbd5725a37f5d2c613" exitCode=0 Jan 22 13:47:57 crc kubenswrapper[4743]: I0122 13:47:57.847979 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nmxmh" event={"ID":"128ee6cd-afe2-4c25-967b-9865fbb0ff88","Type":"ContainerDied","Data":"15c49693b99d9c753461239ab1a3dd203cbb8b774397eafbd5725a37f5d2c613"} Jan 22 13:47:57 crc kubenswrapper[4743]: I0122 13:47:57.852928 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bpk8r" event={"ID":"a660baca-6ead-4a0f-959b-24b3badc4a7c","Type":"ContainerStarted","Data":"44969f48526c216e4e5a776650fd0c12643c853832082371e5a795f9d05bac57"} Jan 22 13:47:57 crc kubenswrapper[4743]: I0122 13:47:57.855217 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fqtsl" event={"ID":"c0e6768a-f15d-4daf-9e12-950e7d3b9552","Type":"ContainerStarted","Data":"79ef234f4083f3ab861db4363f2a1c3087be779c063eae70cc3bfd40331bbdf7"} Jan 22 13:47:57 crc kubenswrapper[4743]: I0122 13:47:57.857680 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7mpfs" event={"ID":"785dfb3f-6700-4e67-9ab4-df1d1e86efef","Type":"ContainerStarted","Data":"1a7f3a5f36934933a9d5b6a167c5583844db3a57e41f44a0fb381f400d9fe650"} Jan 22 13:47:57 crc kubenswrapper[4743]: I0122 13:47:57.860373 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fgj9j" event={"ID":"d0eaca2f-a489-4a21-9bfc-4a8fc02ccc85","Type":"ContainerStarted","Data":"aded755d2d2e600a9155e87104e5a3084cf2f2ff496f45d3a199198b8103b0e4"} Jan 22 13:47:57 crc kubenswrapper[4743]: I0122 13:47:57.895595 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fgj9j" podStartSLOduration=2.748138166 podStartE2EDuration="1m8.895577239s" podCreationTimestamp="2026-01-22 13:46:49 +0000 UTC" firstStartedPulling="2026-01-22 13:46:51.110313341 +0000 UTC m=+47.665356504" lastFinishedPulling="2026-01-22 13:47:57.257752414 +0000 UTC m=+113.812795577" observedRunningTime="2026-01-22 13:47:57.89160965 +0000 UTC m=+114.446652823" watchObservedRunningTime="2026-01-22 13:47:57.895577239 +0000 UTC m=+114.450620402" Jan 22 13:47:57 crc kubenswrapper[4743]: I0122 13:47:57.911118 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fqtsl" podStartSLOduration=2.78675017 podStartE2EDuration="1m5.911095188s" podCreationTimestamp="2026-01-22 13:46:52 +0000 UTC" firstStartedPulling="2026-01-22 13:46:54.220730949 +0000 UTC m=+50.775774112" lastFinishedPulling="2026-01-22 13:47:57.345075967 +0000 UTC m=+113.900119130" observedRunningTime="2026-01-22 13:47:57.90754415 +0000 UTC m=+114.462587313" watchObservedRunningTime="2026-01-22 13:47:57.911095188 +0000 UTC m=+114.466138371" Jan 22 13:47:57 crc kubenswrapper[4743]: I0122 13:47:57.960461 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-7mpfs" podStartSLOduration=2.8621216819999997 podStartE2EDuration="1m8.960446272s" podCreationTimestamp="2026-01-22 13:46:49 +0000 UTC" firstStartedPulling="2026-01-22 13:46:51.126121114 +0000 UTC m=+47.681164277" lastFinishedPulling="2026-01-22 13:47:57.224445704 +0000 UTC m=+113.779488867" observedRunningTime="2026-01-22 13:47:57.959205558 +0000 UTC m=+114.514248721" watchObservedRunningTime="2026-01-22 13:47:57.960446272 +0000 UTC m=+114.515489435" Jan 22 13:47:58 crc kubenswrapper[4743]: I0122 13:47:58.865875 4743 generic.go:334] "Generic (PLEG): container finished" podID="a660baca-6ead-4a0f-959b-24b3badc4a7c" containerID="44969f48526c216e4e5a776650fd0c12643c853832082371e5a795f9d05bac57" exitCode=0 Jan 22 13:47:58 crc kubenswrapper[4743]: I0122 13:47:58.866036 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bpk8r" event={"ID":"a660baca-6ead-4a0f-959b-24b3badc4a7c","Type":"ContainerDied","Data":"44969f48526c216e4e5a776650fd0c12643c853832082371e5a795f9d05bac57"} Jan 22 13:47:58 crc kubenswrapper[4743]: I0122 13:47:58.871691 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nmxmh" event={"ID":"128ee6cd-afe2-4c25-967b-9865fbb0ff88","Type":"ContainerStarted","Data":"962ff945f51c51af6f9187422534a8e59e0c62107d575ab91fd46e8908f5d4d2"} Jan 22 13:47:58 crc kubenswrapper[4743]: I0122 13:47:58.903122 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nmxmh" podStartSLOduration=1.786950956 podStartE2EDuration="1m7.903104341s" podCreationTimestamp="2026-01-22 13:46:51 +0000 UTC" firstStartedPulling="2026-01-22 13:46:52.139737091 +0000 UTC m=+48.694780254" lastFinishedPulling="2026-01-22 13:47:58.255890476 +0000 UTC m=+114.810933639" observedRunningTime="2026-01-22 13:47:58.90090645 +0000 UTC m=+115.455949613" watchObservedRunningTime="2026-01-22 13:47:58.903104341 +0000 UTC m=+115.458147504" Jan 22 13:47:59 crc kubenswrapper[4743]: I0122 13:47:59.612759 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7mpfs" Jan 22 13:47:59 crc kubenswrapper[4743]: I0122 13:47:59.612839 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7mpfs" Jan 22 13:47:59 crc kubenswrapper[4743]: I0122 13:47:59.665452 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7mpfs" Jan 22 13:47:59 crc kubenswrapper[4743]: I0122 13:47:59.778953 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fgj9j" Jan 22 13:47:59 crc kubenswrapper[4743]: I0122 13:47:59.779031 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fgj9j" Jan 22 13:47:59 crc kubenswrapper[4743]: I0122 13:47:59.821417 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fgj9j" Jan 22 13:48:00 crc kubenswrapper[4743]: I0122 13:48:00.885304 4743 generic.go:334] "Generic (PLEG): container finished" podID="9b61be23-4db1-4316-a840-1aaff04b664e" containerID="3bf93aa6cd8df69662b96a010a60709ab834ebbde2081eae13007790a4becf55" exitCode=0 Jan 22 13:48:00 crc kubenswrapper[4743]: I0122 13:48:00.885464 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-m8dtj" event={"ID":"9b61be23-4db1-4316-a840-1aaff04b664e","Type":"ContainerDied","Data":"3bf93aa6cd8df69662b96a010a60709ab834ebbde2081eae13007790a4becf55"} Jan 22 13:48:00 crc kubenswrapper[4743]: I0122 13:48:00.889005 4743 generic.go:334] "Generic (PLEG): container finished" podID="fb2a0da8-a3d8-4f70-a0a8-d2f1150c40ce" containerID="92dfd1b301a92c14fdcb432e9a8f5fc6de4f24a2f580deaded4ae35184e38ac1" exitCode=0 Jan 22 13:48:00 crc kubenswrapper[4743]: I0122 13:48:00.889071 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7bp9h" event={"ID":"fb2a0da8-a3d8-4f70-a0a8-d2f1150c40ce","Type":"ContainerDied","Data":"92dfd1b301a92c14fdcb432e9a8f5fc6de4f24a2f580deaded4ae35184e38ac1"} Jan 22 13:48:00 crc kubenswrapper[4743]: I0122 13:48:00.893059 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bpk8r" event={"ID":"a660baca-6ead-4a0f-959b-24b3badc4a7c","Type":"ContainerStarted","Data":"e23689989ec3a71246d088d20d3b38708cf5958859ee7c5db105b7eb708456ae"} Jan 22 13:48:00 crc kubenswrapper[4743]: I0122 13:48:00.896668 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wv724" event={"ID":"a76ec049-d99a-40be-9fec-f76370769aea","Type":"ContainerStarted","Data":"87f5477118c7f54202f9a5aaffae47d6481ae51bd91d8b7024aab19cfd608346"} Jan 22 13:48:00 crc kubenswrapper[4743]: I0122 13:48:00.965495 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bpk8r" podStartSLOduration=3.131268512 podStartE2EDuration="1m11.965481281s" podCreationTimestamp="2026-01-22 13:46:49 +0000 UTC" firstStartedPulling="2026-01-22 13:46:51.113242341 +0000 UTC m=+47.668285504" lastFinishedPulling="2026-01-22 13:47:59.94745511 +0000 UTC m=+116.502498273" observedRunningTime="2026-01-22 13:48:00.962822718 +0000 UTC m=+117.517865871" watchObservedRunningTime="2026-01-22 13:48:00.965481281 +0000 UTC m=+117.520524444" Jan 22 13:48:01 crc kubenswrapper[4743]: I0122 13:48:01.555267 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nmxmh" Jan 22 13:48:01 crc kubenswrapper[4743]: I0122 13:48:01.555327 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nmxmh" Jan 22 13:48:01 crc kubenswrapper[4743]: I0122 13:48:01.592640 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nmxmh" Jan 22 13:48:01 crc kubenswrapper[4743]: I0122 13:48:01.901861 4743 generic.go:334] "Generic (PLEG): container finished" podID="a76ec049-d99a-40be-9fec-f76370769aea" containerID="87f5477118c7f54202f9a5aaffae47d6481ae51bd91d8b7024aab19cfd608346" exitCode=0 Jan 22 13:48:01 crc kubenswrapper[4743]: I0122 13:48:01.901951 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wv724" event={"ID":"a76ec049-d99a-40be-9fec-f76370769aea","Type":"ContainerDied","Data":"87f5477118c7f54202f9a5aaffae47d6481ae51bd91d8b7024aab19cfd608346"} Jan 22 13:48:02 crc kubenswrapper[4743]: I0122 13:48:02.566738 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fqtsl" Jan 22 13:48:02 crc kubenswrapper[4743]: I0122 13:48:02.567047 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-fqtsl" Jan 22 13:48:02 crc kubenswrapper[4743]: I0122 13:48:02.909342 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wv724" event={"ID":"a76ec049-d99a-40be-9fec-f76370769aea","Type":"ContainerStarted","Data":"bfa90da4be9ea93116d5370d369de791a849b41aa07876f53706ba7b7b6104d3"} Jan 22 13:48:02 crc kubenswrapper[4743]: I0122 13:48:02.911927 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m8dtj" event={"ID":"9b61be23-4db1-4316-a840-1aaff04b664e","Type":"ContainerStarted","Data":"b37aeab634388c837bad5b241de7fe44a78ec8b7355b3b3329f1ffe2961e9197"} Jan 22 13:48:02 crc kubenswrapper[4743]: I0122 13:48:02.914816 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7bp9h" event={"ID":"fb2a0da8-a3d8-4f70-a0a8-d2f1150c40ce","Type":"ContainerStarted","Data":"b6b0fc9ed336a425a74df536915f8c124da7addbe3d25ea8f2b0c81b3280f7c6"} Jan 22 13:48:02 crc kubenswrapper[4743]: I0122 13:48:02.932437 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wv724" podStartSLOduration=3.7381938200000002 podStartE2EDuration="1m14.932419845s" podCreationTimestamp="2026-01-22 13:46:48 +0000 UTC" firstStartedPulling="2026-01-22 13:46:51.118497635 +0000 UTC m=+47.673540798" lastFinishedPulling="2026-01-22 13:48:02.31272366 +0000 UTC m=+118.867766823" observedRunningTime="2026-01-22 13:48:02.926995495 +0000 UTC m=+119.482038668" watchObservedRunningTime="2026-01-22 13:48:02.932419845 +0000 UTC m=+119.487463008" Jan 22 13:48:02 crc kubenswrapper[4743]: I0122 13:48:02.964841 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7bp9h" Jan 22 13:48:02 crc kubenswrapper[4743]: I0122 13:48:02.964888 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7bp9h" Jan 22 13:48:02 crc kubenswrapper[4743]: I0122 13:48:02.972836 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7bp9h" podStartSLOduration=2.979170375 podStartE2EDuration="1m10.97277514s" podCreationTimestamp="2026-01-22 13:46:52 +0000 UTC" firstStartedPulling="2026-01-22 13:46:54.278538614 +0000 UTC m=+50.833581777" lastFinishedPulling="2026-01-22 13:48:02.272143379 +0000 UTC m=+118.827186542" observedRunningTime="2026-01-22 13:48:02.969820399 +0000 UTC m=+119.524863562" watchObservedRunningTime="2026-01-22 13:48:02.97277514 +0000 UTC m=+119.527818303" Jan 22 13:48:02 crc kubenswrapper[4743]: I0122 13:48:02.973239 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-m8dtj" podStartSLOduration=2.971179532 podStartE2EDuration="1m12.973233603s" podCreationTimestamp="2026-01-22 13:46:50 +0000 UTC" firstStartedPulling="2026-01-22 13:46:52.164355446 +0000 UTC m=+48.719398609" lastFinishedPulling="2026-01-22 13:48:02.166409517 +0000 UTC m=+118.721452680" observedRunningTime="2026-01-22 13:48:02.953837897 +0000 UTC m=+119.508881060" watchObservedRunningTime="2026-01-22 13:48:02.973233603 +0000 UTC m=+119.528276766" Jan 22 13:48:03 crc kubenswrapper[4743]: I0122 13:48:03.609974 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fqtsl" podUID="c0e6768a-f15d-4daf-9e12-950e7d3b9552" containerName="registry-server" 
probeResult="failure" output=< Jan 22 13:48:03 crc kubenswrapper[4743]: timeout: failed to connect service ":50051" within 1s Jan 22 13:48:03 crc kubenswrapper[4743]: > Jan 22 13:48:04 crc kubenswrapper[4743]: I0122 13:48:04.001822 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7bp9h" podUID="fb2a0da8-a3d8-4f70-a0a8-d2f1150c40ce" containerName="registry-server" probeResult="failure" output=< Jan 22 13:48:04 crc kubenswrapper[4743]: timeout: failed to connect service ":50051" within 1s Jan 22 13:48:04 crc kubenswrapper[4743]: > Jan 22 13:48:09 crc kubenswrapper[4743]: I0122 13:48:09.205966 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wv724" Jan 22 13:48:09 crc kubenswrapper[4743]: I0122 13:48:09.206518 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wv724" Jan 22 13:48:09 crc kubenswrapper[4743]: I0122 13:48:09.243467 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wv724" Jan 22 13:48:09 crc kubenswrapper[4743]: I0122 13:48:09.436984 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bpk8r" Jan 22 13:48:09 crc kubenswrapper[4743]: I0122 13:48:09.437310 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bpk8r" Jan 22 13:48:09 crc kubenswrapper[4743]: I0122 13:48:09.471590 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bpk8r" Jan 22 13:48:09 crc kubenswrapper[4743]: I0122 13:48:09.645736 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7mpfs" Jan 22 13:48:09 crc kubenswrapper[4743]: I0122 13:48:09.835893 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fgj9j" Jan 22 13:48:09 crc kubenswrapper[4743]: I0122 13:48:09.981655 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bpk8r" Jan 22 13:48:09 crc kubenswrapper[4743]: I0122 13:48:09.985154 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wv724" Jan 22 13:48:10 crc kubenswrapper[4743]: I0122 13:48:10.383092 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7mpfs"] Jan 22 13:48:10 crc kubenswrapper[4743]: I0122 13:48:10.383315 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7mpfs" podUID="785dfb3f-6700-4e67-9ab4-df1d1e86efef" containerName="registry-server" containerID="cri-o://1a7f3a5f36934933a9d5b6a167c5583844db3a57e41f44a0fb381f400d9fe650" gracePeriod=2 Jan 22 13:48:11 crc kubenswrapper[4743]: I0122 13:48:11.143266 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-m8dtj" Jan 22 13:48:11 crc kubenswrapper[4743]: I0122 13:48:11.143308 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-m8dtj" Jan 22 13:48:11 crc kubenswrapper[4743]: I0122 13:48:11.195835 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-m8dtj" Jan 22 13:48:11 crc kubenswrapper[4743]: I0122 13:48:11.616989 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nmxmh" Jan 22 13:48:11 crc kubenswrapper[4743]: I0122 13:48:11.796889 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fgj9j"] Jan 22 13:48:11 crc kubenswrapper[4743]: I0122 13:48:11.797346 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fgj9j" podUID="d0eaca2f-a489-4a21-9bfc-4a8fc02ccc85" containerName="registry-server" containerID="cri-o://aded755d2d2e600a9155e87104e5a3084cf2f2ff496f45d3a199198b8103b0e4" gracePeriod=2 Jan 22 13:48:12 crc kubenswrapper[4743]: I0122 13:48:12.004416 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-m8dtj" Jan 22 13:48:12 crc kubenswrapper[4743]: I0122 13:48:12.605200 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fqtsl" Jan 22 13:48:12 crc kubenswrapper[4743]: I0122 13:48:12.657584 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fqtsl" Jan 22 13:48:13 crc kubenswrapper[4743]: I0122 13:48:13.079534 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7bp9h" Jan 22 13:48:13 crc kubenswrapper[4743]: I0122 13:48:13.124151 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7bp9h" Jan 22 13:48:14 crc kubenswrapper[4743]: I0122 13:48:14.182124 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nmxmh"] Jan 22 13:48:14 crc kubenswrapper[4743]: I0122 13:48:14.182434 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-nmxmh" podUID="128ee6cd-afe2-4c25-967b-9865fbb0ff88" containerName="registry-server" containerID="cri-o://962ff945f51c51af6f9187422534a8e59e0c62107d575ab91fd46e8908f5d4d2" gracePeriod=2 Jan 22 13:48:15 crc kubenswrapper[4743]: I0122 13:48:15.484218 4743 generic.go:334] "Generic (PLEG): container finished" podID="785dfb3f-6700-4e67-9ab4-df1d1e86efef" containerID="1a7f3a5f36934933a9d5b6a167c5583844db3a57e41f44a0fb381f400d9fe650" exitCode=0 Jan 22 13:48:15 crc kubenswrapper[4743]: I0122 13:48:15.484305 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7mpfs" event={"ID":"785dfb3f-6700-4e67-9ab4-df1d1e86efef","Type":"ContainerDied","Data":"1a7f3a5f36934933a9d5b6a167c5583844db3a57e41f44a0fb381f400d9fe650"} Jan 22 13:48:15 crc kubenswrapper[4743]: I0122 13:48:15.893296 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7mpfs" Jan 22 13:48:16 crc kubenswrapper[4743]: I0122 13:48:16.092710 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/785dfb3f-6700-4e67-9ab4-df1d1e86efef-utilities\") pod \"785dfb3f-6700-4e67-9ab4-df1d1e86efef\" (UID: \"785dfb3f-6700-4e67-9ab4-df1d1e86efef\") " Jan 22 13:48:16 crc kubenswrapper[4743]: I0122 13:48:16.093143 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/785dfb3f-6700-4e67-9ab4-df1d1e86efef-catalog-content\") pod \"785dfb3f-6700-4e67-9ab4-df1d1e86efef\" (UID: \"785dfb3f-6700-4e67-9ab4-df1d1e86efef\") " Jan 22 13:48:16 crc kubenswrapper[4743]: I0122 13:48:16.093183 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lt8d\" (UniqueName: \"kubernetes.io/projected/785dfb3f-6700-4e67-9ab4-df1d1e86efef-kube-api-access-6lt8d\") pod \"785dfb3f-6700-4e67-9ab4-df1d1e86efef\" (UID: \"785dfb3f-6700-4e67-9ab4-df1d1e86efef\") " Jan 22 13:48:16 crc kubenswrapper[4743]: I0122 13:48:16.093989 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/785dfb3f-6700-4e67-9ab4-df1d1e86efef-utilities" (OuterVolumeSpecName: "utilities") pod "785dfb3f-6700-4e67-9ab4-df1d1e86efef" (UID: "785dfb3f-6700-4e67-9ab4-df1d1e86efef"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 13:48:16 crc kubenswrapper[4743]: I0122 13:48:16.105210 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/785dfb3f-6700-4e67-9ab4-df1d1e86efef-kube-api-access-6lt8d" (OuterVolumeSpecName: "kube-api-access-6lt8d") pod "785dfb3f-6700-4e67-9ab4-df1d1e86efef" (UID: "785dfb3f-6700-4e67-9ab4-df1d1e86efef"). InnerVolumeSpecName "kube-api-access-6lt8d". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:48:16 crc kubenswrapper[4743]: I0122 13:48:16.155403 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/785dfb3f-6700-4e67-9ab4-df1d1e86efef-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "785dfb3f-6700-4e67-9ab4-df1d1e86efef" (UID: "785dfb3f-6700-4e67-9ab4-df1d1e86efef"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 13:48:16 crc kubenswrapper[4743]: I0122 13:48:16.194365 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/785dfb3f-6700-4e67-9ab4-df1d1e86efef-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 13:48:16 crc kubenswrapper[4743]: I0122 13:48:16.194397 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lt8d\" (UniqueName: \"kubernetes.io/projected/785dfb3f-6700-4e67-9ab4-df1d1e86efef-kube-api-access-6lt8d\") on node \"crc\" DevicePath \"\"" Jan 22 13:48:16 crc kubenswrapper[4743]: I0122 13:48:16.194408 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/785dfb3f-6700-4e67-9ab4-df1d1e86efef-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 13:48:16 crc kubenswrapper[4743]: I0122 13:48:16.490839 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fgj9j_d0eaca2f-a489-4a21-9bfc-4a8fc02ccc85/registry-server/0.log" Jan 22 13:48:16 crc kubenswrapper[4743]: I0122 13:48:16.491662 4743 generic.go:334] "Generic (PLEG): container finished" podID="d0eaca2f-a489-4a21-9bfc-4a8fc02ccc85" containerID="aded755d2d2e600a9155e87104e5a3084cf2f2ff496f45d3a199198b8103b0e4" exitCode=137 Jan 22 13:48:16 crc kubenswrapper[4743]: I0122 13:48:16.491697 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fgj9j" event={"ID":"d0eaca2f-a489-4a21-9bfc-4a8fc02ccc85","Type":"ContainerDied","Data":"aded755d2d2e600a9155e87104e5a3084cf2f2ff496f45d3a199198b8103b0e4"} Jan 22 13:48:16 crc kubenswrapper[4743]: I0122 13:48:16.493870 4743 generic.go:334] "Generic (PLEG): container finished" podID="128ee6cd-afe2-4c25-967b-9865fbb0ff88" containerID="962ff945f51c51af6f9187422534a8e59e0c62107d575ab91fd46e8908f5d4d2" exitCode=0 Jan 22 13:48:16 crc kubenswrapper[4743]: I0122 13:48:16.493924 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nmxmh" event={"ID":"128ee6cd-afe2-4c25-967b-9865fbb0ff88","Type":"ContainerDied","Data":"962ff945f51c51af6f9187422534a8e59e0c62107d575ab91fd46e8908f5d4d2"} Jan 22 13:48:16 crc kubenswrapper[4743]: I0122 13:48:16.496020 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7mpfs" event={"ID":"785dfb3f-6700-4e67-9ab4-df1d1e86efef","Type":"ContainerDied","Data":"555009e9bbfaa517e62321b02dfeaaab7ffc3233a60e2159e582c465bc86c339"} Jan 22 13:48:16 crc kubenswrapper[4743]: I0122 13:48:16.496051 4743 scope.go:117] "RemoveContainer" containerID="1a7f3a5f36934933a9d5b6a167c5583844db3a57e41f44a0fb381f400d9fe650" Jan 22 13:48:16 crc kubenswrapper[4743]: I0122 13:48:16.496156 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7mpfs" Jan 22 13:48:16 crc kubenswrapper[4743]: I0122 13:48:16.510536 4743 scope.go:117] "RemoveContainer" containerID="c3f7940077c0b7f80d200fcfc9f6266d95df3e8163825e472a463d412bf5ed73" Jan 22 13:48:16 crc kubenswrapper[4743]: I0122 13:48:16.519543 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7mpfs"] Jan 22 13:48:16 crc kubenswrapper[4743]: I0122 13:48:16.525721 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7mpfs"] Jan 22 13:48:16 crc kubenswrapper[4743]: I0122 13:48:16.553225 4743 scope.go:117] "RemoveContainer" containerID="ab704eeb5abde2c5b147226cafaa7789231aefddc49b52f60cb1703c2863b1e3" Jan 22 13:48:16 crc kubenswrapper[4743]: I0122 13:48:16.581853 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7bp9h"] Jan 22 13:48:16 crc kubenswrapper[4743]: I0122 13:48:16.582069 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7bp9h" podUID="fb2a0da8-a3d8-4f70-a0a8-d2f1150c40ce" containerName="registry-server" containerID="cri-o://b6b0fc9ed336a425a74df536915f8c124da7addbe3d25ea8f2b0c81b3280f7c6" gracePeriod=2 Jan 22 13:48:16 crc kubenswrapper[4743]: I0122 13:48:16.918462 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nmxmh" Jan 22 13:48:17 crc kubenswrapper[4743]: I0122 13:48:17.104928 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8x2kq\" (UniqueName: \"kubernetes.io/projected/128ee6cd-afe2-4c25-967b-9865fbb0ff88-kube-api-access-8x2kq\") pod \"128ee6cd-afe2-4c25-967b-9865fbb0ff88\" (UID: \"128ee6cd-afe2-4c25-967b-9865fbb0ff88\") " Jan 22 13:48:17 crc kubenswrapper[4743]: I0122 13:48:17.105937 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/128ee6cd-afe2-4c25-967b-9865fbb0ff88-catalog-content\") pod \"128ee6cd-afe2-4c25-967b-9865fbb0ff88\" (UID: \"128ee6cd-afe2-4c25-967b-9865fbb0ff88\") " Jan 22 13:48:17 crc kubenswrapper[4743]: I0122 13:48:17.106001 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/128ee6cd-afe2-4c25-967b-9865fbb0ff88-utilities\") pod \"128ee6cd-afe2-4c25-967b-9865fbb0ff88\" (UID: \"128ee6cd-afe2-4c25-967b-9865fbb0ff88\") " Jan 22 13:48:17 crc kubenswrapper[4743]: I0122 13:48:17.106726 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/128ee6cd-afe2-4c25-967b-9865fbb0ff88-utilities" (OuterVolumeSpecName: "utilities") pod "128ee6cd-afe2-4c25-967b-9865fbb0ff88" (UID: "128ee6cd-afe2-4c25-967b-9865fbb0ff88"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 13:48:17 crc kubenswrapper[4743]: I0122 13:48:17.112270 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/128ee6cd-afe2-4c25-967b-9865fbb0ff88-kube-api-access-8x2kq" (OuterVolumeSpecName: "kube-api-access-8x2kq") pod "128ee6cd-afe2-4c25-967b-9865fbb0ff88" (UID: "128ee6cd-afe2-4c25-967b-9865fbb0ff88"). InnerVolumeSpecName "kube-api-access-8x2kq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:48:17 crc kubenswrapper[4743]: I0122 13:48:17.132651 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/128ee6cd-afe2-4c25-967b-9865fbb0ff88-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "128ee6cd-afe2-4c25-967b-9865fbb0ff88" (UID: "128ee6cd-afe2-4c25-967b-9865fbb0ff88"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 13:48:17 crc kubenswrapper[4743]: I0122 13:48:17.207745 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/128ee6cd-afe2-4c25-967b-9865fbb0ff88-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 13:48:17 crc kubenswrapper[4743]: I0122 13:48:17.207815 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8x2kq\" (UniqueName: \"kubernetes.io/projected/128ee6cd-afe2-4c25-967b-9865fbb0ff88-kube-api-access-8x2kq\") on node \"crc\" DevicePath \"\"" Jan 22 13:48:17 crc kubenswrapper[4743]: I0122 13:48:17.207838 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/128ee6cd-afe2-4c25-967b-9865fbb0ff88-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 13:48:17 crc kubenswrapper[4743]: I0122 13:48:17.211669 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-hqfjq"] Jan 22 13:48:17 crc kubenswrapper[4743]: I0122 13:48:17.502720 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nmxmh" event={"ID":"128ee6cd-afe2-4c25-967b-9865fbb0ff88","Type":"ContainerDied","Data":"f4c6a9cd1fda0fa4b4221488c2c34eb006edacaa2ea5342870784f17111aaa71"} Jan 22 13:48:17 crc kubenswrapper[4743]: I0122 13:48:17.502780 4743 scope.go:117] "RemoveContainer" containerID="962ff945f51c51af6f9187422534a8e59e0c62107d575ab91fd46e8908f5d4d2" Jan 22 13:48:17 crc kubenswrapper[4743]: I0122 13:48:17.502898 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nmxmh" Jan 22 13:48:17 crc kubenswrapper[4743]: I0122 13:48:17.521654 4743 scope.go:117] "RemoveContainer" containerID="15c49693b99d9c753461239ab1a3dd203cbb8b774397eafbd5725a37f5d2c613" Jan 22 13:48:17 crc kubenswrapper[4743]: I0122 13:48:17.528320 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nmxmh"] Jan 22 13:48:17 crc kubenswrapper[4743]: I0122 13:48:17.533304 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-nmxmh"] Jan 22 13:48:17 crc kubenswrapper[4743]: I0122 13:48:17.563310 4743 scope.go:117] "RemoveContainer" containerID="9d89230033d8c7d055538f44ab929830e5c24ef05d9bff63616ad8f084b57170" Jan 22 13:48:17 crc kubenswrapper[4743]: I0122 13:48:17.755096 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="128ee6cd-afe2-4c25-967b-9865fbb0ff88" path="/var/lib/kubelet/pods/128ee6cd-afe2-4c25-967b-9865fbb0ff88/volumes" Jan 22 13:48:17 crc kubenswrapper[4743]: I0122 13:48:17.755996 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="785dfb3f-6700-4e67-9ab4-df1d1e86efef" path="/var/lib/kubelet/pods/785dfb3f-6700-4e67-9ab4-df1d1e86efef/volumes" Jan 22 13:48:19 crc kubenswrapper[4743]: I0122 13:48:19.334625 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fgj9j_d0eaca2f-a489-4a21-9bfc-4a8fc02ccc85/registry-server/0.log" Jan 22 13:48:19 crc kubenswrapper[4743]: I0122 13:48:19.335646 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fgj9j" Jan 22 13:48:19 crc kubenswrapper[4743]: I0122 13:48:19.527373 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0eaca2f-a489-4a21-9bfc-4a8fc02ccc85-catalog-content\") pod \"d0eaca2f-a489-4a21-9bfc-4a8fc02ccc85\" (UID: \"d0eaca2f-a489-4a21-9bfc-4a8fc02ccc85\") " Jan 22 13:48:19 crc kubenswrapper[4743]: I0122 13:48:19.527437 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0eaca2f-a489-4a21-9bfc-4a8fc02ccc85-utilities\") pod \"d0eaca2f-a489-4a21-9bfc-4a8fc02ccc85\" (UID: \"d0eaca2f-a489-4a21-9bfc-4a8fc02ccc85\") " Jan 22 13:48:19 crc kubenswrapper[4743]: I0122 13:48:19.527472 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqvnc\" (UniqueName: \"kubernetes.io/projected/d0eaca2f-a489-4a21-9bfc-4a8fc02ccc85-kube-api-access-hqvnc\") pod \"d0eaca2f-a489-4a21-9bfc-4a8fc02ccc85\" (UID: \"d0eaca2f-a489-4a21-9bfc-4a8fc02ccc85\") " Jan 22 13:48:19 crc kubenswrapper[4743]: I0122 13:48:19.528399 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0eaca2f-a489-4a21-9bfc-4a8fc02ccc85-utilities" (OuterVolumeSpecName: "utilities") pod "d0eaca2f-a489-4a21-9bfc-4a8fc02ccc85" (UID: "d0eaca2f-a489-4a21-9bfc-4a8fc02ccc85"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 13:48:19 crc kubenswrapper[4743]: I0122 13:48:19.536494 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0eaca2f-a489-4a21-9bfc-4a8fc02ccc85-kube-api-access-hqvnc" (OuterVolumeSpecName: "kube-api-access-hqvnc") pod "d0eaca2f-a489-4a21-9bfc-4a8fc02ccc85" (UID: "d0eaca2f-a489-4a21-9bfc-4a8fc02ccc85"). InnerVolumeSpecName "kube-api-access-hqvnc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:48:19 crc kubenswrapper[4743]: I0122 13:48:19.540775 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fgj9j_d0eaca2f-a489-4a21-9bfc-4a8fc02ccc85/registry-server/0.log" Jan 22 13:48:19 crc kubenswrapper[4743]: I0122 13:48:19.541534 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fgj9j" event={"ID":"d0eaca2f-a489-4a21-9bfc-4a8fc02ccc85","Type":"ContainerDied","Data":"3e53380e4d4ab595b37cf336cd056fcc1e02c6c83a56dbd855893f57923a3c35"} Jan 22 13:48:19 crc kubenswrapper[4743]: I0122 13:48:19.541662 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fgj9j" Jan 22 13:48:19 crc kubenswrapper[4743]: I0122 13:48:19.541663 4743 scope.go:117] "RemoveContainer" containerID="aded755d2d2e600a9155e87104e5a3084cf2f2ff496f45d3a199198b8103b0e4" Jan 22 13:48:19 crc kubenswrapper[4743]: I0122 13:48:19.566024 4743 scope.go:117] "RemoveContainer" containerID="87d20127d4c6748e46af071455f71334be1d087760d32bde0d68c1aae52abe9b" Jan 22 13:48:19 crc kubenswrapper[4743]: I0122 13:48:19.578931 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0eaca2f-a489-4a21-9bfc-4a8fc02ccc85-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d0eaca2f-a489-4a21-9bfc-4a8fc02ccc85" (UID: "d0eaca2f-a489-4a21-9bfc-4a8fc02ccc85"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 13:48:19 crc kubenswrapper[4743]: I0122 13:48:19.582059 4743 scope.go:117] "RemoveContainer" containerID="cf1b9e01a26bb95cce31ba7b86c4f3407c4d86e19d14ad64a621e67713ec754f" Jan 22 13:48:19 crc kubenswrapper[4743]: I0122 13:48:19.628502 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0eaca2f-a489-4a21-9bfc-4a8fc02ccc85-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 13:48:19 crc kubenswrapper[4743]: I0122 13:48:19.628555 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hqvnc\" (UniqueName: \"kubernetes.io/projected/d0eaca2f-a489-4a21-9bfc-4a8fc02ccc85-kube-api-access-hqvnc\") on node \"crc\" DevicePath \"\"" Jan 22 13:48:19 crc kubenswrapper[4743]: I0122 13:48:19.628569 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0eaca2f-a489-4a21-9bfc-4a8fc02ccc85-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 13:48:19 crc kubenswrapper[4743]: I0122 13:48:19.859629 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fgj9j"] Jan 22 13:48:19 crc kubenswrapper[4743]: I0122 13:48:19.862265 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fgj9j"] Jan 22 13:48:20 crc kubenswrapper[4743]: I0122 13:48:20.551525 4743 generic.go:334] "Generic (PLEG): container finished" podID="fb2a0da8-a3d8-4f70-a0a8-d2f1150c40ce" containerID="b6b0fc9ed336a425a74df536915f8c124da7addbe3d25ea8f2b0c81b3280f7c6" exitCode=0 Jan 22 13:48:20 crc kubenswrapper[4743]: I0122 13:48:20.551613 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7bp9h" event={"ID":"fb2a0da8-a3d8-4f70-a0a8-d2f1150c40ce","Type":"ContainerDied","Data":"b6b0fc9ed336a425a74df536915f8c124da7addbe3d25ea8f2b0c81b3280f7c6"} Jan 22 13:48:20 crc kubenswrapper[4743]: I0122 13:48:20.882425 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7bp9h" Jan 22 13:48:21 crc kubenswrapper[4743]: I0122 13:48:21.046930 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb2a0da8-a3d8-4f70-a0a8-d2f1150c40ce-utilities\") pod \"fb2a0da8-a3d8-4f70-a0a8-d2f1150c40ce\" (UID: \"fb2a0da8-a3d8-4f70-a0a8-d2f1150c40ce\") " Jan 22 13:48:21 crc kubenswrapper[4743]: I0122 13:48:21.047262 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb2a0da8-a3d8-4f70-a0a8-d2f1150c40ce-catalog-content\") pod \"fb2a0da8-a3d8-4f70-a0a8-d2f1150c40ce\" (UID: \"fb2a0da8-a3d8-4f70-a0a8-d2f1150c40ce\") " Jan 22 13:48:21 crc kubenswrapper[4743]: I0122 13:48:21.047304 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ns8zp\" (UniqueName: \"kubernetes.io/projected/fb2a0da8-a3d8-4f70-a0a8-d2f1150c40ce-kube-api-access-ns8zp\") pod \"fb2a0da8-a3d8-4f70-a0a8-d2f1150c40ce\" (UID: \"fb2a0da8-a3d8-4f70-a0a8-d2f1150c40ce\") " Jan 22 13:48:21 crc kubenswrapper[4743]: I0122 13:48:21.048044 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb2a0da8-a3d8-4f70-a0a8-d2f1150c40ce-utilities" (OuterVolumeSpecName: "utilities") pod "fb2a0da8-a3d8-4f70-a0a8-d2f1150c40ce" (UID: "fb2a0da8-a3d8-4f70-a0a8-d2f1150c40ce"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 13:48:21 crc kubenswrapper[4743]: I0122 13:48:21.051837 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb2a0da8-a3d8-4f70-a0a8-d2f1150c40ce-kube-api-access-ns8zp" (OuterVolumeSpecName: "kube-api-access-ns8zp") pod "fb2a0da8-a3d8-4f70-a0a8-d2f1150c40ce" (UID: "fb2a0da8-a3d8-4f70-a0a8-d2f1150c40ce"). InnerVolumeSpecName "kube-api-access-ns8zp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:48:21 crc kubenswrapper[4743]: I0122 13:48:21.148962 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ns8zp\" (UniqueName: \"kubernetes.io/projected/fb2a0da8-a3d8-4f70-a0a8-d2f1150c40ce-kube-api-access-ns8zp\") on node \"crc\" DevicePath \"\"" Jan 22 13:48:21 crc kubenswrapper[4743]: I0122 13:48:21.148992 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb2a0da8-a3d8-4f70-a0a8-d2f1150c40ce-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 13:48:21 crc kubenswrapper[4743]: I0122 13:48:21.162003 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb2a0da8-a3d8-4f70-a0a8-d2f1150c40ce-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fb2a0da8-a3d8-4f70-a0a8-d2f1150c40ce" (UID: "fb2a0da8-a3d8-4f70-a0a8-d2f1150c40ce"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 13:48:21 crc kubenswrapper[4743]: I0122 13:48:21.249733 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb2a0da8-a3d8-4f70-a0a8-d2f1150c40ce-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 13:48:21 crc kubenswrapper[4743]: I0122 13:48:21.561010 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7bp9h" event={"ID":"fb2a0da8-a3d8-4f70-a0a8-d2f1150c40ce","Type":"ContainerDied","Data":"85fd1a6a478c59defe07b39c2f08a24b3643f5f8a46d83cc45995b7edbc3e6c0"} Jan 22 13:48:21 crc kubenswrapper[4743]: I0122 13:48:21.561064 4743 scope.go:117] "RemoveContainer" containerID="b6b0fc9ed336a425a74df536915f8c124da7addbe3d25ea8f2b0c81b3280f7c6" Jan 22 13:48:21 crc kubenswrapper[4743]: I0122 13:48:21.561104 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7bp9h" Jan 22 13:48:21 crc kubenswrapper[4743]: I0122 13:48:21.576771 4743 scope.go:117] "RemoveContainer" containerID="92dfd1b301a92c14fdcb432e9a8f5fc6de4f24a2f580deaded4ae35184e38ac1" Jan 22 13:48:21 crc kubenswrapper[4743]: I0122 13:48:21.600891 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7bp9h"] Jan 22 13:48:21 crc kubenswrapper[4743]: I0122 13:48:21.605571 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7bp9h"] Jan 22 13:48:21 crc kubenswrapper[4743]: I0122 13:48:21.606465 4743 scope.go:117] "RemoveContainer" containerID="eb4b0aa688975b1f7c3b6b0fa23e7be4c6498faf0e66dc85c6b393dee7afbaf5" Jan 22 13:48:21 crc kubenswrapper[4743]: I0122 13:48:21.755412 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0eaca2f-a489-4a21-9bfc-4a8fc02ccc85" path="/var/lib/kubelet/pods/d0eaca2f-a489-4a21-9bfc-4a8fc02ccc85/volumes" Jan 22 13:48:21 crc kubenswrapper[4743]: I0122 13:48:21.756218 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb2a0da8-a3d8-4f70-a0a8-d2f1150c40ce" path="/var/lib/kubelet/pods/fb2a0da8-a3d8-4f70-a0a8-d2f1150c40ce/volumes" Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.273394 4743 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 22 13:48:22 crc kubenswrapper[4743]: E0122 13:48:22.273760 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2887aee0-19a7-439f-8f40-ef40970ab796" containerName="kube-multus-additional-cni-plugins" Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.273783 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="2887aee0-19a7-439f-8f40-ef40970ab796" containerName="kube-multus-additional-cni-plugins" Jan 22 13:48:22 crc kubenswrapper[4743]: E0122 13:48:22.273815 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="785dfb3f-6700-4e67-9ab4-df1d1e86efef" containerName="extract-utilities" Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.273824 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="785dfb3f-6700-4e67-9ab4-df1d1e86efef" containerName="extract-utilities" Jan 22 13:48:22 crc kubenswrapper[4743]: E0122 13:48:22.273836 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0eaca2f-a489-4a21-9bfc-4a8fc02ccc85" containerName="extract-utilities" Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.273844 4743 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="d0eaca2f-a489-4a21-9bfc-4a8fc02ccc85" containerName="extract-utilities" Jan 22 13:48:22 crc kubenswrapper[4743]: E0122 13:48:22.273854 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb2a0da8-a3d8-4f70-a0a8-d2f1150c40ce" containerName="extract-content" Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.273861 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb2a0da8-a3d8-4f70-a0a8-d2f1150c40ce" containerName="extract-content" Jan 22 13:48:22 crc kubenswrapper[4743]: E0122 13:48:22.273871 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f44aeedd-10f9-4c57-88e4-50dbaf2794de" containerName="pruner" Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.273880 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f44aeedd-10f9-4c57-88e4-50dbaf2794de" containerName="pruner" Jan 22 13:48:22 crc kubenswrapper[4743]: E0122 13:48:22.273889 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb2a0da8-a3d8-4f70-a0a8-d2f1150c40ce" containerName="extract-utilities" Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.273896 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb2a0da8-a3d8-4f70-a0a8-d2f1150c40ce" containerName="extract-utilities" Jan 22 13:48:22 crc kubenswrapper[4743]: E0122 13:48:22.273905 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="128ee6cd-afe2-4c25-967b-9865fbb0ff88" containerName="registry-server" Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.273911 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="128ee6cd-afe2-4c25-967b-9865fbb0ff88" containerName="registry-server" Jan 22 13:48:22 crc kubenswrapper[4743]: E0122 13:48:22.273921 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="128ee6cd-afe2-4c25-967b-9865fbb0ff88" containerName="extract-utilities" Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.273929 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="128ee6cd-afe2-4c25-967b-9865fbb0ff88" containerName="extract-utilities" Jan 22 13:48:22 crc kubenswrapper[4743]: E0122 13:48:22.273937 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="128ee6cd-afe2-4c25-967b-9865fbb0ff88" containerName="extract-content" Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.273944 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="128ee6cd-afe2-4c25-967b-9865fbb0ff88" containerName="extract-content" Jan 22 13:48:22 crc kubenswrapper[4743]: E0122 13:48:22.273953 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0eaca2f-a489-4a21-9bfc-4a8fc02ccc85" containerName="registry-server" Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.273960 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0eaca2f-a489-4a21-9bfc-4a8fc02ccc85" containerName="registry-server" Jan 22 13:48:22 crc kubenswrapper[4743]: E0122 13:48:22.273975 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0eaca2f-a489-4a21-9bfc-4a8fc02ccc85" containerName="extract-content" Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.273982 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0eaca2f-a489-4a21-9bfc-4a8fc02ccc85" containerName="extract-content" Jan 22 13:48:22 crc kubenswrapper[4743]: E0122 13:48:22.273991 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb2a0da8-a3d8-4f70-a0a8-d2f1150c40ce" containerName="registry-server" Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.273997 4743 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="fb2a0da8-a3d8-4f70-a0a8-d2f1150c40ce" containerName="registry-server" Jan 22 13:48:22 crc kubenswrapper[4743]: E0122 13:48:22.274012 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="785dfb3f-6700-4e67-9ab4-df1d1e86efef" containerName="extract-content" Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.274018 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="785dfb3f-6700-4e67-9ab4-df1d1e86efef" containerName="extract-content" Jan 22 13:48:22 crc kubenswrapper[4743]: E0122 13:48:22.274028 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="785dfb3f-6700-4e67-9ab4-df1d1e86efef" containerName="registry-server" Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.274036 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="785dfb3f-6700-4e67-9ab4-df1d1e86efef" containerName="registry-server" Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.274142 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb2a0da8-a3d8-4f70-a0a8-d2f1150c40ce" containerName="registry-server" Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.274155 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="128ee6cd-afe2-4c25-967b-9865fbb0ff88" containerName="registry-server" Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.274165 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="2887aee0-19a7-439f-8f40-ef40970ab796" containerName="kube-multus-additional-cni-plugins" Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.274176 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="785dfb3f-6700-4e67-9ab4-df1d1e86efef" containerName="registry-server" Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.274186 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f44aeedd-10f9-4c57-88e4-50dbaf2794de" containerName="pruner" Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.274196 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0eaca2f-a489-4a21-9bfc-4a8fc02ccc85" containerName="registry-server" Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.274676 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.275411 4743 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.275693 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://c3d207c8f7c47106e794aa561ec19fd9bba5216c729f0ee0211f124b368c4e9a" gracePeriod=15 Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.275725 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://9fa58feeb3650411bb7ae470ddb3767a9f055924ec26d29029c7284e7a590954" gracePeriod=15 Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.275755 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://70f164abc35c1a84289433b80aeba8eb6edfd0a0a4d2801a0336c01a1fd2ba5e" gracePeriod=15 Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.275829 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://79f0fd53d4d043cada398136d26691a12b62cc2765bef4f42db7af827bafaa9b" gracePeriod=15 Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.275946 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://3307aad48db4a49584f0fc9767fd8852d5902aa05bb0c6e3b58eee2bc9b7ab3d" gracePeriod=15 Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.276020 4743 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 22 13:48:22 crc kubenswrapper[4743]: E0122 13:48:22.276258 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.276352 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 22 13:48:22 crc kubenswrapper[4743]: E0122 13:48:22.276430 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.276496 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Jan 22 13:48:22 crc kubenswrapper[4743]: E0122 13:48:22.276578 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.276670 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 22 13:48:22 crc kubenswrapper[4743]: E0122 13:48:22.277920 
4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.278001 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 22 13:48:22 crc kubenswrapper[4743]: E0122 13:48:22.278071 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.278215 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 22 13:48:22 crc kubenswrapper[4743]: E0122 13:48:22.278295 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.278364 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 22 13:48:22 crc kubenswrapper[4743]: E0122 13:48:22.278507 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.278587 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.278816 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.278923 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.279006 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.279088 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.279171 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.279247 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.465334 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.465677 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.465713 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.465731 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.465752 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.465948 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.465987 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.466037 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.568866 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.569933 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.568981 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.570016 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.570782 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.570833 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.571443 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.571865 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.571704 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.571930 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.572377 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.572613 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.572835 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.573193 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.573411 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.573279 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.573479 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.574422 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.574943 4743 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9fa58feeb3650411bb7ae470ddb3767a9f055924ec26d29029c7284e7a590954" exitCode=0 Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.574968 4743 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3307aad48db4a49584f0fc9767fd8852d5902aa05bb0c6e3b58eee2bc9b7ab3d" exitCode=0 Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.574976 4743 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="70f164abc35c1a84289433b80aeba8eb6edfd0a0a4d2801a0336c01a1fd2ba5e" exitCode=0 Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.574982 4743 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="79f0fd53d4d043cada398136d26691a12b62cc2765bef4f42db7af827bafaa9b" exitCode=2 Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.575036 4743 scope.go:117] "RemoveContainer" containerID="dbf3b047cd21b7972e3e78bc51c2907693d461a6825dec6dac248b62200e63b9" Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 
13:48:22.580727 4743 generic.go:334] "Generic (PLEG): container finished" podID="258cd1b9-f4a3-482d-bc94-25cc44b757dc" containerID="78492532f1274c2907cd62b7dcdb4bca3641c0c5a347b11ffafcc8cb516e33bf" exitCode=0 Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.580753 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"258cd1b9-f4a3-482d-bc94-25cc44b757dc","Type":"ContainerDied","Data":"78492532f1274c2907cd62b7dcdb4bca3641c0c5a347b11ffafcc8cb516e33bf"} Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.581578 4743 status_manager.go:851] "Failed to get status for pod" podUID="258cd1b9-f4a3-482d-bc94-25cc44b757dc" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.53:6443: connect: connection refused" Jan 22 13:48:22 crc kubenswrapper[4743]: I0122 13:48:22.581958 4743 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.53:6443: connect: connection refused" Jan 22 13:48:23 crc kubenswrapper[4743]: I0122 13:48:23.591332 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 22 13:48:23 crc kubenswrapper[4743]: I0122 13:48:23.749448 4743 status_manager.go:851] "Failed to get status for pod" podUID="258cd1b9-f4a3-482d-bc94-25cc44b757dc" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.53:6443: connect: connection refused" Jan 22 13:48:23 crc kubenswrapper[4743]: I0122 13:48:23.749980 4743 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.53:6443: connect: connection refused" Jan 22 13:48:23 crc kubenswrapper[4743]: I0122 13:48:23.820822 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 22 13:48:23 crc kubenswrapper[4743]: I0122 13:48:23.821715 4743 status_manager.go:851] "Failed to get status for pod" podUID="258cd1b9-f4a3-482d-bc94-25cc44b757dc" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.53:6443: connect: connection refused" Jan 22 13:48:23 crc kubenswrapper[4743]: I0122 13:48:23.889180 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/258cd1b9-f4a3-482d-bc94-25cc44b757dc-kube-api-access\") pod \"258cd1b9-f4a3-482d-bc94-25cc44b757dc\" (UID: \"258cd1b9-f4a3-482d-bc94-25cc44b757dc\") " Jan 22 13:48:23 crc kubenswrapper[4743]: I0122 13:48:23.889261 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/258cd1b9-f4a3-482d-bc94-25cc44b757dc-kubelet-dir\") pod \"258cd1b9-f4a3-482d-bc94-25cc44b757dc\" (UID: \"258cd1b9-f4a3-482d-bc94-25cc44b757dc\") " Jan 22 13:48:23 crc kubenswrapper[4743]: I0122 13:48:23.889373 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/258cd1b9-f4a3-482d-bc94-25cc44b757dc-var-lock\") pod \"258cd1b9-f4a3-482d-bc94-25cc44b757dc\" (UID: \"258cd1b9-f4a3-482d-bc94-25cc44b757dc\") " Jan 22 13:48:23 crc kubenswrapper[4743]: I0122 13:48:23.889392 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/258cd1b9-f4a3-482d-bc94-25cc44b757dc-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "258cd1b9-f4a3-482d-bc94-25cc44b757dc" (UID: "258cd1b9-f4a3-482d-bc94-25cc44b757dc"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 13:48:23 crc kubenswrapper[4743]: I0122 13:48:23.889493 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/258cd1b9-f4a3-482d-bc94-25cc44b757dc-var-lock" (OuterVolumeSpecName: "var-lock") pod "258cd1b9-f4a3-482d-bc94-25cc44b757dc" (UID: "258cd1b9-f4a3-482d-bc94-25cc44b757dc"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 13:48:23 crc kubenswrapper[4743]: I0122 13:48:23.889624 4743 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/258cd1b9-f4a3-482d-bc94-25cc44b757dc-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 22 13:48:23 crc kubenswrapper[4743]: I0122 13:48:23.889642 4743 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/258cd1b9-f4a3-482d-bc94-25cc44b757dc-var-lock\") on node \"crc\" DevicePath \"\"" Jan 22 13:48:23 crc kubenswrapper[4743]: I0122 13:48:23.894715 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/258cd1b9-f4a3-482d-bc94-25cc44b757dc-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "258cd1b9-f4a3-482d-bc94-25cc44b757dc" (UID: "258cd1b9-f4a3-482d-bc94-25cc44b757dc"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:48:23 crc kubenswrapper[4743]: I0122 13:48:23.990718 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/258cd1b9-f4a3-482d-bc94-25cc44b757dc-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 22 13:48:24 crc kubenswrapper[4743]: I0122 13:48:24.605901 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"258cd1b9-f4a3-482d-bc94-25cc44b757dc","Type":"ContainerDied","Data":"ca98fa33d6357c5c9c73d857e8adcc8a3ecde14e09aafed0fab19dccfe789bdc"} Jan 22 13:48:24 crc kubenswrapper[4743]: I0122 13:48:24.606155 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca98fa33d6357c5c9c73d857e8adcc8a3ecde14e09aafed0fab19dccfe789bdc" Jan 22 13:48:24 crc kubenswrapper[4743]: I0122 13:48:24.606104 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 22 13:48:24 crc kubenswrapper[4743]: I0122 13:48:24.652828 4743 status_manager.go:851] "Failed to get status for pod" podUID="258cd1b9-f4a3-482d-bc94-25cc44b757dc" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.53:6443: connect: connection refused" Jan 22 13:48:24 crc kubenswrapper[4743]: I0122 13:48:24.657559 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 22 13:48:24 crc kubenswrapper[4743]: I0122 13:48:24.658419 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 13:48:24 crc kubenswrapper[4743]: I0122 13:48:24.658940 4743 status_manager.go:851] "Failed to get status for pod" podUID="258cd1b9-f4a3-482d-bc94-25cc44b757dc" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.53:6443: connect: connection refused" Jan 22 13:48:24 crc kubenswrapper[4743]: I0122 13:48:24.659443 4743 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.53:6443: connect: connection refused" Jan 22 13:48:24 crc kubenswrapper[4743]: I0122 13:48:24.699192 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 22 13:48:24 crc kubenswrapper[4743]: I0122 13:48:24.699241 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 22 13:48:24 crc kubenswrapper[4743]: I0122 13:48:24.699292 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 22 13:48:24 crc kubenswrapper[4743]: I0122 13:48:24.699289 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 13:48:24 crc kubenswrapper[4743]: I0122 13:48:24.699301 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 13:48:24 crc kubenswrapper[4743]: I0122 13:48:24.699407 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 13:48:24 crc kubenswrapper[4743]: I0122 13:48:24.699476 4743 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 22 13:48:24 crc kubenswrapper[4743]: I0122 13:48:24.699513 4743 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Jan 22 13:48:24 crc kubenswrapper[4743]: I0122 13:48:24.800570 4743 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 22 13:48:25 crc kubenswrapper[4743]: I0122 13:48:25.614685 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 22 13:48:25 crc kubenswrapper[4743]: I0122 13:48:25.615428 4743 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c3d207c8f7c47106e794aa561ec19fd9bba5216c729f0ee0211f124b368c4e9a" exitCode=0 Jan 22 13:48:25 crc kubenswrapper[4743]: I0122 13:48:25.615482 4743 scope.go:117] "RemoveContainer" containerID="9fa58feeb3650411bb7ae470ddb3767a9f055924ec26d29029c7284e7a590954" Jan 22 13:48:25 crc kubenswrapper[4743]: I0122 13:48:25.615515 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 13:48:25 crc kubenswrapper[4743]: I0122 13:48:25.628450 4743 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.53:6443: connect: connection refused" Jan 22 13:48:25 crc kubenswrapper[4743]: I0122 13:48:25.628938 4743 status_manager.go:851] "Failed to get status for pod" podUID="258cd1b9-f4a3-482d-bc94-25cc44b757dc" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.53:6443: connect: connection refused" Jan 22 13:48:25 crc kubenswrapper[4743]: I0122 13:48:25.631992 4743 scope.go:117] "RemoveContainer" containerID="3307aad48db4a49584f0fc9767fd8852d5902aa05bb0c6e3b58eee2bc9b7ab3d" Jan 22 13:48:25 crc kubenswrapper[4743]: I0122 13:48:25.646735 4743 scope.go:117] "RemoveContainer" containerID="70f164abc35c1a84289433b80aeba8eb6edfd0a0a4d2801a0336c01a1fd2ba5e" Jan 22 13:48:25 crc kubenswrapper[4743]: I0122 13:48:25.662754 4743 scope.go:117] "RemoveContainer" containerID="79f0fd53d4d043cada398136d26691a12b62cc2765bef4f42db7af827bafaa9b" Jan 22 13:48:25 crc kubenswrapper[4743]: I0122 13:48:25.675716 4743 scope.go:117] "RemoveContainer" containerID="c3d207c8f7c47106e794aa561ec19fd9bba5216c729f0ee0211f124b368c4e9a" Jan 22 13:48:25 crc kubenswrapper[4743]: I0122 13:48:25.689197 4743 scope.go:117] "RemoveContainer" containerID="9b518be682055721b8f28ad1a53f3ab76ea8e2b5385525d395669facbb2e6322" Jan 22 13:48:25 crc kubenswrapper[4743]: I0122 13:48:25.711575 4743 scope.go:117] "RemoveContainer" containerID="9fa58feeb3650411bb7ae470ddb3767a9f055924ec26d29029c7284e7a590954" Jan 22 13:48:25 crc 
kubenswrapper[4743]: E0122 13:48:25.711935 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fa58feeb3650411bb7ae470ddb3767a9f055924ec26d29029c7284e7a590954\": container with ID starting with 9fa58feeb3650411bb7ae470ddb3767a9f055924ec26d29029c7284e7a590954 not found: ID does not exist" containerID="9fa58feeb3650411bb7ae470ddb3767a9f055924ec26d29029c7284e7a590954" Jan 22 13:48:25 crc kubenswrapper[4743]: I0122 13:48:25.711979 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fa58feeb3650411bb7ae470ddb3767a9f055924ec26d29029c7284e7a590954"} err="failed to get container status \"9fa58feeb3650411bb7ae470ddb3767a9f055924ec26d29029c7284e7a590954\": rpc error: code = NotFound desc = could not find container \"9fa58feeb3650411bb7ae470ddb3767a9f055924ec26d29029c7284e7a590954\": container with ID starting with 9fa58feeb3650411bb7ae470ddb3767a9f055924ec26d29029c7284e7a590954 not found: ID does not exist" Jan 22 13:48:25 crc kubenswrapper[4743]: I0122 13:48:25.712031 4743 scope.go:117] "RemoveContainer" containerID="3307aad48db4a49584f0fc9767fd8852d5902aa05bb0c6e3b58eee2bc9b7ab3d" Jan 22 13:48:25 crc kubenswrapper[4743]: E0122 13:48:25.712289 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3307aad48db4a49584f0fc9767fd8852d5902aa05bb0c6e3b58eee2bc9b7ab3d\": container with ID starting with 3307aad48db4a49584f0fc9767fd8852d5902aa05bb0c6e3b58eee2bc9b7ab3d not found: ID does not exist" containerID="3307aad48db4a49584f0fc9767fd8852d5902aa05bb0c6e3b58eee2bc9b7ab3d" Jan 22 13:48:25 crc kubenswrapper[4743]: I0122 13:48:25.712313 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3307aad48db4a49584f0fc9767fd8852d5902aa05bb0c6e3b58eee2bc9b7ab3d"} err="failed to get container status \"3307aad48db4a49584f0fc9767fd8852d5902aa05bb0c6e3b58eee2bc9b7ab3d\": rpc error: code = NotFound desc = could not find container \"3307aad48db4a49584f0fc9767fd8852d5902aa05bb0c6e3b58eee2bc9b7ab3d\": container with ID starting with 3307aad48db4a49584f0fc9767fd8852d5902aa05bb0c6e3b58eee2bc9b7ab3d not found: ID does not exist" Jan 22 13:48:25 crc kubenswrapper[4743]: I0122 13:48:25.712328 4743 scope.go:117] "RemoveContainer" containerID="70f164abc35c1a84289433b80aeba8eb6edfd0a0a4d2801a0336c01a1fd2ba5e" Jan 22 13:48:25 crc kubenswrapper[4743]: E0122 13:48:25.712553 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70f164abc35c1a84289433b80aeba8eb6edfd0a0a4d2801a0336c01a1fd2ba5e\": container with ID starting with 70f164abc35c1a84289433b80aeba8eb6edfd0a0a4d2801a0336c01a1fd2ba5e not found: ID does not exist" containerID="70f164abc35c1a84289433b80aeba8eb6edfd0a0a4d2801a0336c01a1fd2ba5e" Jan 22 13:48:25 crc kubenswrapper[4743]: I0122 13:48:25.712588 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70f164abc35c1a84289433b80aeba8eb6edfd0a0a4d2801a0336c01a1fd2ba5e"} err="failed to get container status \"70f164abc35c1a84289433b80aeba8eb6edfd0a0a4d2801a0336c01a1fd2ba5e\": rpc error: code = NotFound desc = could not find container \"70f164abc35c1a84289433b80aeba8eb6edfd0a0a4d2801a0336c01a1fd2ba5e\": container with ID starting with 70f164abc35c1a84289433b80aeba8eb6edfd0a0a4d2801a0336c01a1fd2ba5e not found: ID does not exist" Jan 22 13:48:25 crc kubenswrapper[4743]: 
I0122 13:48:25.712609 4743 scope.go:117] "RemoveContainer" containerID="79f0fd53d4d043cada398136d26691a12b62cc2765bef4f42db7af827bafaa9b" Jan 22 13:48:25 crc kubenswrapper[4743]: E0122 13:48:25.713339 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79f0fd53d4d043cada398136d26691a12b62cc2765bef4f42db7af827bafaa9b\": container with ID starting with 79f0fd53d4d043cada398136d26691a12b62cc2765bef4f42db7af827bafaa9b not found: ID does not exist" containerID="79f0fd53d4d043cada398136d26691a12b62cc2765bef4f42db7af827bafaa9b" Jan 22 13:48:25 crc kubenswrapper[4743]: I0122 13:48:25.713375 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79f0fd53d4d043cada398136d26691a12b62cc2765bef4f42db7af827bafaa9b"} err="failed to get container status \"79f0fd53d4d043cada398136d26691a12b62cc2765bef4f42db7af827bafaa9b\": rpc error: code = NotFound desc = could not find container \"79f0fd53d4d043cada398136d26691a12b62cc2765bef4f42db7af827bafaa9b\": container with ID starting with 79f0fd53d4d043cada398136d26691a12b62cc2765bef4f42db7af827bafaa9b not found: ID does not exist" Jan 22 13:48:25 crc kubenswrapper[4743]: I0122 13:48:25.713397 4743 scope.go:117] "RemoveContainer" containerID="c3d207c8f7c47106e794aa561ec19fd9bba5216c729f0ee0211f124b368c4e9a" Jan 22 13:48:25 crc kubenswrapper[4743]: E0122 13:48:25.713954 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3d207c8f7c47106e794aa561ec19fd9bba5216c729f0ee0211f124b368c4e9a\": container with ID starting with c3d207c8f7c47106e794aa561ec19fd9bba5216c729f0ee0211f124b368c4e9a not found: ID does not exist" containerID="c3d207c8f7c47106e794aa561ec19fd9bba5216c729f0ee0211f124b368c4e9a" Jan 22 13:48:25 crc kubenswrapper[4743]: I0122 13:48:25.714042 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3d207c8f7c47106e794aa561ec19fd9bba5216c729f0ee0211f124b368c4e9a"} err="failed to get container status \"c3d207c8f7c47106e794aa561ec19fd9bba5216c729f0ee0211f124b368c4e9a\": rpc error: code = NotFound desc = could not find container \"c3d207c8f7c47106e794aa561ec19fd9bba5216c729f0ee0211f124b368c4e9a\": container with ID starting with c3d207c8f7c47106e794aa561ec19fd9bba5216c729f0ee0211f124b368c4e9a not found: ID does not exist" Jan 22 13:48:25 crc kubenswrapper[4743]: I0122 13:48:25.714103 4743 scope.go:117] "RemoveContainer" containerID="9b518be682055721b8f28ad1a53f3ab76ea8e2b5385525d395669facbb2e6322" Jan 22 13:48:25 crc kubenswrapper[4743]: E0122 13:48:25.714622 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b518be682055721b8f28ad1a53f3ab76ea8e2b5385525d395669facbb2e6322\": container with ID starting with 9b518be682055721b8f28ad1a53f3ab76ea8e2b5385525d395669facbb2e6322 not found: ID does not exist" containerID="9b518be682055721b8f28ad1a53f3ab76ea8e2b5385525d395669facbb2e6322" Jan 22 13:48:25 crc kubenswrapper[4743]: I0122 13:48:25.714656 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b518be682055721b8f28ad1a53f3ab76ea8e2b5385525d395669facbb2e6322"} err="failed to get container status \"9b518be682055721b8f28ad1a53f3ab76ea8e2b5385525d395669facbb2e6322\": rpc error: code = NotFound desc = could not find container \"9b518be682055721b8f28ad1a53f3ab76ea8e2b5385525d395669facbb2e6322\": container 
with ID starting with 9b518be682055721b8f28ad1a53f3ab76ea8e2b5385525d395669facbb2e6322 not found: ID does not exist" Jan 22 13:48:25 crc kubenswrapper[4743]: I0122 13:48:25.755472 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Jan 22 13:48:26 crc kubenswrapper[4743]: E0122 13:48:26.723932 4743 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.53:6443: connect: connection refused" Jan 22 13:48:26 crc kubenswrapper[4743]: E0122 13:48:26.724623 4743 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.53:6443: connect: connection refused" Jan 22 13:48:26 crc kubenswrapper[4743]: E0122 13:48:26.725193 4743 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.53:6443: connect: connection refused" Jan 22 13:48:26 crc kubenswrapper[4743]: E0122 13:48:26.725562 4743 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.53:6443: connect: connection refused" Jan 22 13:48:26 crc kubenswrapper[4743]: E0122 13:48:26.725899 4743 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.53:6443: connect: connection refused" Jan 22 13:48:26 crc kubenswrapper[4743]: I0122 13:48:26.725938 4743 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Jan 22 13:48:26 crc kubenswrapper[4743]: E0122 13:48:26.726230 4743 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.53:6443: connect: connection refused" interval="200ms" Jan 22 13:48:26 crc kubenswrapper[4743]: E0122 13:48:26.927074 4743 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.53:6443: connect: connection refused" interval="400ms" Jan 22 13:48:27 crc kubenswrapper[4743]: E0122 13:48:27.040649 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T13:48:27Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T13:48:27Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T13:48:27Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T13:48:27Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.53:6443: connect: connection refused" Jan 22 13:48:27 crc kubenswrapper[4743]: E0122 13:48:27.041062 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.53:6443: connect: connection refused" Jan 22 13:48:27 crc kubenswrapper[4743]: E0122 13:48:27.041533 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.53:6443: connect: connection refused" Jan 22 13:48:27 crc kubenswrapper[4743]: E0122 13:48:27.042082 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.53:6443: connect: connection refused" Jan 22 13:48:27 crc kubenswrapper[4743]: E0122 13:48:27.042362 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.53:6443: connect: connection refused" Jan 22 13:48:27 crc kubenswrapper[4743]: E0122 13:48:27.042385 4743 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 22 13:48:27 crc kubenswrapper[4743]: E0122 13:48:27.308893 4743 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.53:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 22 13:48:27 crc kubenswrapper[4743]: I0122 13:48:27.309481 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 22 13:48:27 crc kubenswrapper[4743]: E0122 13:48:27.328168 4743 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.53:6443: connect: connection refused" interval="800ms" Jan 22 13:48:27 crc kubenswrapper[4743]: E0122 13:48:27.334625 4743 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.53:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.188d11b73d6aa316 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-22 13:48:27.333559062 +0000 UTC m=+143.888602225,LastTimestamp:2026-01-22 13:48:27.333559062 +0000 UTC m=+143.888602225,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 22 13:48:27 crc kubenswrapper[4743]: I0122 13:48:27.634093 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"d0c9e29410a953b1d244e6634da20839e174fcf4f8f28f16fd32c23982bc0695"} Jan 22 13:48:27 crc kubenswrapper[4743]: I0122 13:48:27.634433 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"f48951beafcf8895559cc6645c8687bfe89e0de93664dc0779a829ad8ad042ce"} Jan 22 13:48:27 crc kubenswrapper[4743]: I0122 13:48:27.634993 4743 status_manager.go:851] "Failed to get status for pod" podUID="258cd1b9-f4a3-482d-bc94-25cc44b757dc" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.53:6443: connect: connection refused" Jan 22 13:48:27 crc kubenswrapper[4743]: E0122 13:48:27.635204 4743 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.53:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 22 13:48:28 crc kubenswrapper[4743]: E0122 13:48:28.129752 4743 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.53:6443: connect: connection refused" interval="1.6s" Jan 22 13:48:28 crc kubenswrapper[4743]: E0122 13:48:28.849323 4743 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC 
openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.53:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" volumeName="registry-storage" Jan 22 13:48:29 crc kubenswrapper[4743]: E0122 13:48:29.734039 4743 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.53:6443: connect: connection refused" interval="3.2s" Jan 22 13:48:30 crc kubenswrapper[4743]: I0122 13:48:30.049064 4743 patch_prober.go:28] interesting pod/machine-config-daemon-hqgk7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 13:48:30 crc kubenswrapper[4743]: I0122 13:48:30.049154 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 13:48:32 crc kubenswrapper[4743]: E0122 13:48:32.935482 4743 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.53:6443: connect: connection refused" interval="6.4s" Jan 22 13:48:33 crc kubenswrapper[4743]: I0122 13:48:33.747145 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 13:48:33 crc kubenswrapper[4743]: I0122 13:48:33.749729 4743 status_manager.go:851] "Failed to get status for pod" podUID="258cd1b9-f4a3-482d-bc94-25cc44b757dc" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.53:6443: connect: connection refused" Jan 22 13:48:33 crc kubenswrapper[4743]: I0122 13:48:33.750444 4743 status_manager.go:851] "Failed to get status for pod" podUID="258cd1b9-f4a3-482d-bc94-25cc44b757dc" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.53:6443: connect: connection refused" Jan 22 13:48:33 crc kubenswrapper[4743]: I0122 13:48:33.764228 4743 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="01f4ccb0-f73c-4886-ba33-0e37b40563fa" Jan 22 13:48:33 crc kubenswrapper[4743]: I0122 13:48:33.764263 4743 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="01f4ccb0-f73c-4886-ba33-0e37b40563fa" Jan 22 13:48:33 crc kubenswrapper[4743]: E0122 13:48:33.764616 4743 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.53:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 13:48:33 crc kubenswrapper[4743]: I0122 13:48:33.765084 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 13:48:34 crc kubenswrapper[4743]: I0122 13:48:34.680562 4743 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="aa5f7f424eff7b81b1cdf0788c2cccbb68392630f0989d33bd25bfddc67c2847" exitCode=0 Jan 22 13:48:34 crc kubenswrapper[4743]: I0122 13:48:34.680637 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"aa5f7f424eff7b81b1cdf0788c2cccbb68392630f0989d33bd25bfddc67c2847"} Jan 22 13:48:34 crc kubenswrapper[4743]: I0122 13:48:34.680859 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"094d2a03167e16b516817bb1d55b2cd7efc766ae52675e1e9856ba15db8f4972"} Jan 22 13:48:34 crc kubenswrapper[4743]: I0122 13:48:34.681118 4743 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="01f4ccb0-f73c-4886-ba33-0e37b40563fa" Jan 22 13:48:34 crc kubenswrapper[4743]: I0122 13:48:34.681130 4743 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="01f4ccb0-f73c-4886-ba33-0e37b40563fa" Jan 22 13:48:34 crc kubenswrapper[4743]: I0122 13:48:34.681927 4743 status_manager.go:851] "Failed to get status for pod" podUID="258cd1b9-f4a3-482d-bc94-25cc44b757dc" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.53:6443: connect: connection refused" Jan 22 13:48:34 crc kubenswrapper[4743]: E0122 13:48:34.681978 4743 
mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.53:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 13:48:35 crc kubenswrapper[4743]: I0122 13:48:35.698302 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"12ff8a1a2a7950ea56d0846a62e2f75c26c551b9fec8b6def2016704749bddd9"} Jan 22 13:48:35 crc kubenswrapper[4743]: I0122 13:48:35.698633 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f16b02e98b602eef0ee268346b5fb16cc64f48f7a1af2fda019b68af34679d8f"} Jan 22 13:48:35 crc kubenswrapper[4743]: I0122 13:48:35.698648 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"889b8254bb12e92c60e448314ef00519b97b19ccf6821e025250493b70c4f337"} Jan 22 13:48:36 crc kubenswrapper[4743]: I0122 13:48:36.712059 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"bc0dead687b7259e82c65a57790f7cdad089ea567fbb7d93bbfb7a019786a605"} Jan 22 13:48:36 crc kubenswrapper[4743]: I0122 13:48:36.712395 4743 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="01f4ccb0-f73c-4886-ba33-0e37b40563fa" Jan 22 13:48:36 crc kubenswrapper[4743]: I0122 13:48:36.712426 4743 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="01f4ccb0-f73c-4886-ba33-0e37b40563fa" Jan 22 13:48:36 crc kubenswrapper[4743]: I0122 13:48:36.712448 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 13:48:36 crc kubenswrapper[4743]: I0122 13:48:36.712464 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"80c97c65af8e9e251c96dfd16e05240baa8b5b3044c4aa8b56a8a4847dfced7b"} Jan 22 13:48:37 crc kubenswrapper[4743]: I0122 13:48:37.246627 4743 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Liveness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Jan 22 13:48:37 crc kubenswrapper[4743]: I0122 13:48:37.247058 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Jan 22 13:48:37 crc kubenswrapper[4743]: I0122 13:48:37.719864 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 22 13:48:37 crc kubenswrapper[4743]: I0122 13:48:37.719919 4743 
generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="bfd37a286a817f44e77fd221d3f495b76049d2cddeff485afe9290b6f9c4c859" exitCode=1 Jan 22 13:48:37 crc kubenswrapper[4743]: I0122 13:48:37.719954 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"bfd37a286a817f44e77fd221d3f495b76049d2cddeff485afe9290b6f9c4c859"} Jan 22 13:48:37 crc kubenswrapper[4743]: I0122 13:48:37.720409 4743 scope.go:117] "RemoveContainer" containerID="bfd37a286a817f44e77fd221d3f495b76049d2cddeff485afe9290b6f9c4c859" Jan 22 13:48:38 crc kubenswrapper[4743]: I0122 13:48:38.727035 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 22 13:48:38 crc kubenswrapper[4743]: I0122 13:48:38.727300 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d4282329185a84f1d1023318b64b9eb3fd1caa0ec2f57c770db14dd7a34e1aee"} Jan 22 13:48:38 crc kubenswrapper[4743]: I0122 13:48:38.766186 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 13:48:38 crc kubenswrapper[4743]: I0122 13:48:38.766250 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 13:48:38 crc kubenswrapper[4743]: I0122 13:48:38.772735 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 13:48:41 crc kubenswrapper[4743]: I0122 13:48:41.722251 4743 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 13:48:41 crc kubenswrapper[4743]: I0122 13:48:41.744946 4743 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="01f4ccb0-f73c-4886-ba33-0e37b40563fa" Jan 22 13:48:41 crc kubenswrapper[4743]: I0122 13:48:41.744974 4743 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="01f4ccb0-f73c-4886-ba33-0e37b40563fa" Jan 22 13:48:41 crc kubenswrapper[4743]: I0122 13:48:41.763644 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 13:48:42 crc kubenswrapper[4743]: I0122 13:48:42.027428 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 22 13:48:42 crc kubenswrapper[4743]: I0122 13:48:42.242393 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-hqfjq" podUID="16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52" containerName="oauth-openshift" containerID="cri-o://9bf822386f18fd85c93d838c48df3696887a1c0f2e91269637929f5d2fec8bfa" gracePeriod=15 Jan 22 13:48:42 crc kubenswrapper[4743]: I0122 13:48:42.650781 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-hqfjq" Jan 22 13:48:42 crc kubenswrapper[4743]: I0122 13:48:42.748083 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-audit-dir\") pod \"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52\" (UID: \"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52\") " Jan 22 13:48:42 crc kubenswrapper[4743]: I0122 13:48:42.748159 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-v4-0-config-user-template-error\") pod \"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52\" (UID: \"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52\") " Jan 22 13:48:42 crc kubenswrapper[4743]: I0122 13:48:42.748205 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-v4-0-config-system-ocp-branding-template\") pod \"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52\" (UID: \"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52\") " Jan 22 13:48:42 crc kubenswrapper[4743]: I0122 13:48:42.748246 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkrnb\" (UniqueName: \"kubernetes.io/projected/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-kube-api-access-mkrnb\") pod \"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52\" (UID: \"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52\") " Jan 22 13:48:42 crc kubenswrapper[4743]: I0122 13:48:42.748289 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-v4-0-config-system-serving-cert\") pod \"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52\" (UID: \"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52\") " Jan 22 13:48:42 crc kubenswrapper[4743]: I0122 13:48:42.748335 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-v4-0-config-system-router-certs\") pod \"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52\" (UID: \"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52\") " Jan 22 13:48:42 crc kubenswrapper[4743]: I0122 13:48:42.748374 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-audit-policies\") pod \"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52\" (UID: \"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52\") " Jan 22 13:48:42 crc kubenswrapper[4743]: I0122 13:48:42.748425 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-v4-0-config-user-template-login\") pod \"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52\" (UID: \"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52\") " Jan 22 13:48:42 crc kubenswrapper[4743]: I0122 13:48:42.748452 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-v4-0-config-system-cliconfig\") pod \"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52\" (UID: \"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52\") " Jan 22 13:48:42 crc kubenswrapper[4743]: I0122 13:48:42.748534 4743 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-v4-0-config-user-template-provider-selection\") pod \"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52\" (UID: \"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52\") " Jan 22 13:48:42 crc kubenswrapper[4743]: I0122 13:48:42.748562 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-v4-0-config-system-session\") pod \"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52\" (UID: \"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52\") " Jan 22 13:48:42 crc kubenswrapper[4743]: I0122 13:48:42.748595 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-v4-0-config-system-trusted-ca-bundle\") pod \"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52\" (UID: \"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52\") " Jan 22 13:48:42 crc kubenswrapper[4743]: I0122 13:48:42.748616 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-v4-0-config-system-service-ca\") pod \"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52\" (UID: \"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52\") " Jan 22 13:48:42 crc kubenswrapper[4743]: I0122 13:48:42.748650 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-v4-0-config-user-idp-0-file-data\") pod \"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52\" (UID: \"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52\") " Jan 22 13:48:42 crc kubenswrapper[4743]: I0122 13:48:42.748193 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52" (UID: "16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 13:48:42 crc kubenswrapper[4743]: I0122 13:48:42.749715 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52" (UID: "16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:48:42 crc kubenswrapper[4743]: I0122 13:48:42.750316 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52" (UID: "16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:48:42 crc kubenswrapper[4743]: I0122 13:48:42.750605 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52" (UID: "16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:48:42 crc kubenswrapper[4743]: I0122 13:48:42.750807 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52" (UID: "16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:48:42 crc kubenswrapper[4743]: I0122 13:48:42.753012 4743 generic.go:334] "Generic (PLEG): container finished" podID="16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52" containerID="9bf822386f18fd85c93d838c48df3696887a1c0f2e91269637929f5d2fec8bfa" exitCode=0 Jan 22 13:48:42 crc kubenswrapper[4743]: I0122 13:48:42.753435 4743 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="01f4ccb0-f73c-4886-ba33-0e37b40563fa" Jan 22 13:48:42 crc kubenswrapper[4743]: I0122 13:48:42.753456 4743 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="01f4ccb0-f73c-4886-ba33-0e37b40563fa" Jan 22 13:48:42 crc kubenswrapper[4743]: I0122 13:48:42.753672 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-hqfjq" event={"ID":"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52","Type":"ContainerDied","Data":"9bf822386f18fd85c93d838c48df3696887a1c0f2e91269637929f5d2fec8bfa"} Jan 22 13:48:42 crc kubenswrapper[4743]: I0122 13:48:42.753756 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-hqfjq" event={"ID":"16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52","Type":"ContainerDied","Data":"021bb78246b8b350bd61a0db115e9b49d160d027ac95e66b38cfce36eae942c3"} Jan 22 13:48:42 crc kubenswrapper[4743]: I0122 13:48:42.753909 4743 scope.go:117] "RemoveContainer" containerID="9bf822386f18fd85c93d838c48df3696887a1c0f2e91269637929f5d2fec8bfa" Jan 22 13:48:42 crc kubenswrapper[4743]: I0122 13:48:42.754480 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-hqfjq" Jan 22 13:48:42 crc kubenswrapper[4743]: I0122 13:48:42.756559 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52" (UID: "16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:48:42 crc kubenswrapper[4743]: I0122 13:48:42.757756 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-kube-api-access-mkrnb" (OuterVolumeSpecName: "kube-api-access-mkrnb") pod "16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52" (UID: "16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52"). InnerVolumeSpecName "kube-api-access-mkrnb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:48:42 crc kubenswrapper[4743]: I0122 13:48:42.766195 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52" (UID: "16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:48:42 crc kubenswrapper[4743]: I0122 13:48:42.766720 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52" (UID: "16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:48:42 crc kubenswrapper[4743]: I0122 13:48:42.766726 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52" (UID: "16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:48:42 crc kubenswrapper[4743]: I0122 13:48:42.766933 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52" (UID: "16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:48:42 crc kubenswrapper[4743]: I0122 13:48:42.769974 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52" (UID: "16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:48:42 crc kubenswrapper[4743]: I0122 13:48:42.770080 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52" (UID: "16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:48:42 crc kubenswrapper[4743]: I0122 13:48:42.770395 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52" (UID: "16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:48:42 crc kubenswrapper[4743]: I0122 13:48:42.804772 4743 scope.go:117] "RemoveContainer" containerID="9bf822386f18fd85c93d838c48df3696887a1c0f2e91269637929f5d2fec8bfa" Jan 22 13:48:42 crc kubenswrapper[4743]: E0122 13:48:42.805271 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bf822386f18fd85c93d838c48df3696887a1c0f2e91269637929f5d2fec8bfa\": container with ID starting with 9bf822386f18fd85c93d838c48df3696887a1c0f2e91269637929f5d2fec8bfa not found: ID does not exist" containerID="9bf822386f18fd85c93d838c48df3696887a1c0f2e91269637929f5d2fec8bfa" Jan 22 13:48:42 crc kubenswrapper[4743]: I0122 13:48:42.805390 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bf822386f18fd85c93d838c48df3696887a1c0f2e91269637929f5d2fec8bfa"} err="failed to get container status \"9bf822386f18fd85c93d838c48df3696887a1c0f2e91269637929f5d2fec8bfa\": rpc error: code = NotFound desc = could not find container \"9bf822386f18fd85c93d838c48df3696887a1c0f2e91269637929f5d2fec8bfa\": container with ID starting with 9bf822386f18fd85c93d838c48df3696887a1c0f2e91269637929f5d2fec8bfa not found: ID does not exist" Jan 22 13:48:42 crc kubenswrapper[4743]: I0122 13:48:42.849520 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 22 13:48:42 crc kubenswrapper[4743]: I0122 13:48:42.849558 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 22 13:48:42 crc kubenswrapper[4743]: I0122 13:48:42.849568 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 13:48:42 crc kubenswrapper[4743]: I0122 13:48:42.849580 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 22 13:48:42 crc kubenswrapper[4743]: I0122 13:48:42.849590 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 22 13:48:42 crc kubenswrapper[4743]: I0122 13:48:42.849599 4743 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-audit-dir\") on node \"crc\" DevicePath 
\"\"" Jan 22 13:48:42 crc kubenswrapper[4743]: I0122 13:48:42.849608 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 22 13:48:42 crc kubenswrapper[4743]: I0122 13:48:42.849617 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 22 13:48:42 crc kubenswrapper[4743]: I0122 13:48:42.849625 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mkrnb\" (UniqueName: \"kubernetes.io/projected/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-kube-api-access-mkrnb\") on node \"crc\" DevicePath \"\"" Jan 22 13:48:42 crc kubenswrapper[4743]: I0122 13:48:42.849633 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 13:48:42 crc kubenswrapper[4743]: I0122 13:48:42.849642 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 22 13:48:42 crc kubenswrapper[4743]: I0122 13:48:42.849650 4743 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 22 13:48:42 crc kubenswrapper[4743]: I0122 13:48:42.849658 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 22 13:48:42 crc kubenswrapper[4743]: I0122 13:48:42.849713 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 22 13:48:43 crc kubenswrapper[4743]: I0122 13:48:43.757321 4743 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="e2c86ed2-9714-44c8-ac16-6b221bccf4cc" Jan 22 13:48:45 crc kubenswrapper[4743]: I0122 13:48:45.114839 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 22 13:48:45 crc kubenswrapper[4743]: I0122 13:48:45.114998 4743 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Jan 22 13:48:45 crc kubenswrapper[4743]: I0122 13:48:45.115039 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get 
\"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Jan 22 13:48:52 crc kubenswrapper[4743]: I0122 13:48:52.122445 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Jan 22 13:48:52 crc kubenswrapper[4743]: I0122 13:48:52.175376 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Jan 22 13:48:52 crc kubenswrapper[4743]: I0122 13:48:52.518955 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Jan 22 13:48:53 crc kubenswrapper[4743]: I0122 13:48:53.035648 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Jan 22 13:48:53 crc kubenswrapper[4743]: I0122 13:48:53.485743 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Jan 22 13:48:53 crc kubenswrapper[4743]: I0122 13:48:53.874258 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Jan 22 13:48:54 crc kubenswrapper[4743]: I0122 13:48:54.275084 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Jan 22 13:48:54 crc kubenswrapper[4743]: I0122 13:48:54.635860 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Jan 22 13:48:54 crc kubenswrapper[4743]: I0122 13:48:54.661294 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Jan 22 13:48:54 crc kubenswrapper[4743]: I0122 13:48:54.680501 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Jan 22 13:48:54 crc kubenswrapper[4743]: I0122 13:48:54.698550 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Jan 22 13:48:54 crc kubenswrapper[4743]: I0122 13:48:54.707067 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Jan 22 13:48:54 crc kubenswrapper[4743]: I0122 13:48:54.794939 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Jan 22 13:48:54 crc kubenswrapper[4743]: I0122 13:48:54.810249 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Jan 22 13:48:55 crc kubenswrapper[4743]: I0122 13:48:55.014274 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Jan 22 13:48:55 crc kubenswrapper[4743]: I0122 13:48:55.115245 4743 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Jan 22 13:48:55 crc kubenswrapper[4743]: I0122 13:48:55.115306 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" 
containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Jan 22 13:48:55 crc kubenswrapper[4743]: I0122 13:48:55.244971 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 22 13:48:55 crc kubenswrapper[4743]: I0122 13:48:55.460365 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jan 22 13:48:55 crc kubenswrapper[4743]: I0122 13:48:55.882805 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Jan 22 13:48:55 crc kubenswrapper[4743]: I0122 13:48:55.910856 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Jan 22 13:48:56 crc kubenswrapper[4743]: I0122 13:48:56.151155 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 22 13:48:56 crc kubenswrapper[4743]: I0122 13:48:56.185663 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Jan 22 13:48:56 crc kubenswrapper[4743]: I0122 13:48:56.187511 4743 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Jan 22 13:48:56 crc kubenswrapper[4743]: I0122 13:48:56.315458 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Jan 22 13:48:56 crc kubenswrapper[4743]: I0122 13:48:56.401903 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Jan 22 13:48:56 crc kubenswrapper[4743]: I0122 13:48:56.427838 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Jan 22 13:48:56 crc kubenswrapper[4743]: I0122 13:48:56.530606 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jan 22 13:48:56 crc kubenswrapper[4743]: I0122 13:48:56.560076 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Jan 22 13:48:56 crc kubenswrapper[4743]: I0122 13:48:56.575378 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Jan 22 13:48:56 crc kubenswrapper[4743]: I0122 13:48:56.757585 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Jan 22 13:48:56 crc kubenswrapper[4743]: I0122 13:48:56.796721 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Jan 22 13:48:56 crc kubenswrapper[4743]: I0122 13:48:56.835895 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Jan 22 13:48:56 crc kubenswrapper[4743]: I0122 13:48:56.864153 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Jan 22 13:48:56 crc kubenswrapper[4743]: I0122 13:48:56.987599 4743 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca"/"kube-root-ca.crt" Jan 22 13:48:57 crc kubenswrapper[4743]: I0122 13:48:57.073637 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Jan 22 13:48:57 crc kubenswrapper[4743]: I0122 13:48:57.092095 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 22 13:48:57 crc kubenswrapper[4743]: I0122 13:48:57.220072 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Jan 22 13:48:57 crc kubenswrapper[4743]: I0122 13:48:57.286005 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 22 13:48:57 crc kubenswrapper[4743]: I0122 13:48:57.288303 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 22 13:48:57 crc kubenswrapper[4743]: I0122 13:48:57.291660 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 22 13:48:57 crc kubenswrapper[4743]: I0122 13:48:57.305145 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Jan 22 13:48:57 crc kubenswrapper[4743]: I0122 13:48:57.386154 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 22 13:48:57 crc kubenswrapper[4743]: I0122 13:48:57.418752 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Jan 22 13:48:57 crc kubenswrapper[4743]: I0122 13:48:57.418971 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Jan 22 13:48:57 crc kubenswrapper[4743]: I0122 13:48:57.595898 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Jan 22 13:48:57 crc kubenswrapper[4743]: I0122 13:48:57.617180 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 22 13:48:57 crc kubenswrapper[4743]: I0122 13:48:57.745339 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Jan 22 13:48:57 crc kubenswrapper[4743]: I0122 13:48:57.792839 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Jan 22 13:48:57 crc kubenswrapper[4743]: I0122 13:48:57.818337 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Jan 22 13:48:57 crc kubenswrapper[4743]: I0122 13:48:57.825691 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Jan 22 13:48:57 crc kubenswrapper[4743]: I0122 13:48:57.837897 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Jan 22 13:48:57 crc kubenswrapper[4743]: I0122 13:48:57.845264 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Jan 22 13:48:58 crc kubenswrapper[4743]: I0122 13:48:58.013551 4743 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-oauth-apiserver"/"audit-1" Jan 22 13:48:58 crc kubenswrapper[4743]: I0122 13:48:58.116480 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 22 13:48:58 crc kubenswrapper[4743]: I0122 13:48:58.118710 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Jan 22 13:48:58 crc kubenswrapper[4743]: I0122 13:48:58.151274 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 22 13:48:58 crc kubenswrapper[4743]: I0122 13:48:58.153409 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Jan 22 13:48:58 crc kubenswrapper[4743]: I0122 13:48:58.217930 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Jan 22 13:48:58 crc kubenswrapper[4743]: I0122 13:48:58.264203 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 22 13:48:58 crc kubenswrapper[4743]: I0122 13:48:58.475969 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Jan 22 13:48:58 crc kubenswrapper[4743]: I0122 13:48:58.523990 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 22 13:48:58 crc kubenswrapper[4743]: I0122 13:48:58.570462 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Jan 22 13:48:58 crc kubenswrapper[4743]: I0122 13:48:58.578328 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 22 13:48:58 crc kubenswrapper[4743]: I0122 13:48:58.584154 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Jan 22 13:48:58 crc kubenswrapper[4743]: I0122 13:48:58.687775 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Jan 22 13:48:58 crc kubenswrapper[4743]: I0122 13:48:58.848249 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 22 13:48:58 crc kubenswrapper[4743]: I0122 13:48:58.852714 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 22 13:48:58 crc kubenswrapper[4743]: I0122 13:48:58.884909 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Jan 22 13:48:58 crc kubenswrapper[4743]: I0122 13:48:58.937681 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Jan 22 13:48:59 crc kubenswrapper[4743]: I0122 13:48:59.019182 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Jan 22 13:48:59 crc kubenswrapper[4743]: I0122 13:48:59.084592 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 22 13:48:59 crc kubenswrapper[4743]: I0122 13:48:59.085194 4743 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-dns-operator"/"kube-root-ca.crt" Jan 22 13:48:59 crc kubenswrapper[4743]: I0122 13:48:59.086922 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 22 13:48:59 crc kubenswrapper[4743]: I0122 13:48:59.288461 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Jan 22 13:48:59 crc kubenswrapper[4743]: I0122 13:48:59.304675 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Jan 22 13:48:59 crc kubenswrapper[4743]: I0122 13:48:59.399425 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Jan 22 13:48:59 crc kubenswrapper[4743]: I0122 13:48:59.427091 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Jan 22 13:48:59 crc kubenswrapper[4743]: I0122 13:48:59.453809 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Jan 22 13:48:59 crc kubenswrapper[4743]: I0122 13:48:59.468587 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Jan 22 13:48:59 crc kubenswrapper[4743]: I0122 13:48:59.501071 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 22 13:48:59 crc kubenswrapper[4743]: I0122 13:48:59.501145 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Jan 22 13:48:59 crc kubenswrapper[4743]: I0122 13:48:59.569193 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Jan 22 13:48:59 crc kubenswrapper[4743]: I0122 13:48:59.574475 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Jan 22 13:48:59 crc kubenswrapper[4743]: I0122 13:48:59.603263 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 22 13:48:59 crc kubenswrapper[4743]: I0122 13:48:59.670569 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Jan 22 13:48:59 crc kubenswrapper[4743]: I0122 13:48:59.749476 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Jan 22 13:48:59 crc kubenswrapper[4743]: I0122 13:48:59.771867 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Jan 22 13:48:59 crc kubenswrapper[4743]: I0122 13:48:59.786418 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 22 13:48:59 crc kubenswrapper[4743]: I0122 13:48:59.850345 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 22 13:48:59 crc kubenswrapper[4743]: I0122 13:48:59.855593 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 22 13:48:59 crc kubenswrapper[4743]: I0122 13:48:59.883103 4743 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 22 13:48:59 crc kubenswrapper[4743]: I0122 13:48:59.893188 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Jan 22 13:48:59 crc kubenswrapper[4743]: I0122 13:48:59.970034 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 22 13:49:00 crc kubenswrapper[4743]: I0122 13:49:00.049318 4743 patch_prober.go:28] interesting pod/machine-config-daemon-hqgk7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 13:49:00 crc kubenswrapper[4743]: I0122 13:49:00.049388 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 13:49:00 crc kubenswrapper[4743]: I0122 13:49:00.088091 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jan 22 13:49:00 crc kubenswrapper[4743]: I0122 13:49:00.217781 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Jan 22 13:49:00 crc kubenswrapper[4743]: I0122 13:49:00.245688 4743 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Jan 22 13:49:00 crc kubenswrapper[4743]: I0122 13:49:00.249476 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-hqfjq","openshift-kube-apiserver/kube-apiserver-crc"] Jan 22 13:49:00 crc kubenswrapper[4743]: I0122 13:49:00.249542 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 22 13:49:00 crc kubenswrapper[4743]: I0122 13:49:00.253854 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 13:49:00 crc kubenswrapper[4743]: I0122 13:49:00.270714 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=19.270692851 podStartE2EDuration="19.270692851s" podCreationTimestamp="2026-01-22 13:48:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:49:00.26882305 +0000 UTC m=+176.823866213" watchObservedRunningTime="2026-01-22 13:49:00.270692851 +0000 UTC m=+176.825736014" Jan 22 13:49:00 crc kubenswrapper[4743]: I0122 13:49:00.272602 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 22 13:49:00 crc kubenswrapper[4743]: I0122 13:49:00.279187 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Jan 22 13:49:00 crc kubenswrapper[4743]: I0122 13:49:00.303775 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Jan 22 13:49:00 crc kubenswrapper[4743]: I0122 
13:49:00.377093 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jan 22 13:49:00 crc kubenswrapper[4743]: I0122 13:49:00.499311 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 22 13:49:00 crc kubenswrapper[4743]: I0122 13:49:00.552561 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Jan 22 13:49:00 crc kubenswrapper[4743]: I0122 13:49:00.690178 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 22 13:49:00 crc kubenswrapper[4743]: I0122 13:49:00.697811 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jan 22 13:49:00 crc kubenswrapper[4743]: I0122 13:49:00.709245 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-6cb668d466-bdv5j"] Jan 22 13:49:00 crc kubenswrapper[4743]: E0122 13:49:00.709489 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="258cd1b9-f4a3-482d-bc94-25cc44b757dc" containerName="installer" Jan 22 13:49:00 crc kubenswrapper[4743]: I0122 13:49:00.709512 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="258cd1b9-f4a3-482d-bc94-25cc44b757dc" containerName="installer" Jan 22 13:49:00 crc kubenswrapper[4743]: E0122 13:49:00.709527 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52" containerName="oauth-openshift" Jan 22 13:49:00 crc kubenswrapper[4743]: I0122 13:49:00.709536 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52" containerName="oauth-openshift" Jan 22 13:49:00 crc kubenswrapper[4743]: I0122 13:49:00.709652 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="258cd1b9-f4a3-482d-bc94-25cc44b757dc" containerName="installer" Jan 22 13:49:00 crc kubenswrapper[4743]: I0122 13:49:00.709673 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52" containerName="oauth-openshift" Jan 22 13:49:00 crc kubenswrapper[4743]: I0122 13:49:00.710097 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-6cb668d466-bdv5j" Jan 22 13:49:00 crc kubenswrapper[4743]: I0122 13:49:00.713680 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 22 13:49:00 crc kubenswrapper[4743]: I0122 13:49:00.713740 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 22 13:49:00 crc kubenswrapper[4743]: I0122 13:49:00.713804 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 22 13:49:00 crc kubenswrapper[4743]: I0122 13:49:00.714004 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 22 13:49:00 crc kubenswrapper[4743]: I0122 13:49:00.714116 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 22 13:49:00 crc kubenswrapper[4743]: I0122 13:49:00.714446 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 22 13:49:00 crc kubenswrapper[4743]: I0122 13:49:00.714617 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 22 13:49:00 crc kubenswrapper[4743]: I0122 13:49:00.714838 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 22 13:49:00 crc kubenswrapper[4743]: I0122 13:49:00.714921 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 22 13:49:00 crc kubenswrapper[4743]: I0122 13:49:00.714925 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 22 13:49:00 crc kubenswrapper[4743]: I0122 13:49:00.715109 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 22 13:49:00 crc kubenswrapper[4743]: I0122 13:49:00.715167 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 22 13:49:00 crc kubenswrapper[4743]: I0122 13:49:00.731689 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6cb668d466-bdv5j"] Jan 22 13:49:00 crc kubenswrapper[4743]: I0122 13:49:00.732065 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 22 13:49:00 crc kubenswrapper[4743]: I0122 13:49:00.740822 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 22 13:49:00 crc kubenswrapper[4743]: I0122 13:49:00.744238 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 22 13:49:00 crc kubenswrapper[4743]: I0122 13:49:00.902048 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0d49fe27-d0f1-4776-8fde-6d5f63374099-v4-0-config-system-session\") pod \"oauth-openshift-6cb668d466-bdv5j\" (UID: \"0d49fe27-d0f1-4776-8fde-6d5f63374099\") " 
pod="openshift-authentication/oauth-openshift-6cb668d466-bdv5j" Jan 22 13:49:00 crc kubenswrapper[4743]: I0122 13:49:00.902093 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0d49fe27-d0f1-4776-8fde-6d5f63374099-v4-0-config-system-service-ca\") pod \"oauth-openshift-6cb668d466-bdv5j\" (UID: \"0d49fe27-d0f1-4776-8fde-6d5f63374099\") " pod="openshift-authentication/oauth-openshift-6cb668d466-bdv5j" Jan 22 13:49:00 crc kubenswrapper[4743]: I0122 13:49:00.902116 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0d49fe27-d0f1-4776-8fde-6d5f63374099-audit-dir\") pod \"oauth-openshift-6cb668d466-bdv5j\" (UID: \"0d49fe27-d0f1-4776-8fde-6d5f63374099\") " pod="openshift-authentication/oauth-openshift-6cb668d466-bdv5j" Jan 22 13:49:00 crc kubenswrapper[4743]: I0122 13:49:00.902138 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d49fe27-d0f1-4776-8fde-6d5f63374099-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6cb668d466-bdv5j\" (UID: \"0d49fe27-d0f1-4776-8fde-6d5f63374099\") " pod="openshift-authentication/oauth-openshift-6cb668d466-bdv5j" Jan 22 13:49:00 crc kubenswrapper[4743]: I0122 13:49:00.902542 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d49fe27-d0f1-4776-8fde-6d5f63374099-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6cb668d466-bdv5j\" (UID: \"0d49fe27-d0f1-4776-8fde-6d5f63374099\") " pod="openshift-authentication/oauth-openshift-6cb668d466-bdv5j" Jan 22 13:49:00 crc kubenswrapper[4743]: I0122 13:49:00.902654 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0d49fe27-d0f1-4776-8fde-6d5f63374099-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6cb668d466-bdv5j\" (UID: \"0d49fe27-d0f1-4776-8fde-6d5f63374099\") " pod="openshift-authentication/oauth-openshift-6cb668d466-bdv5j" Jan 22 13:49:00 crc kubenswrapper[4743]: I0122 13:49:00.902722 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0d49fe27-d0f1-4776-8fde-6d5f63374099-v4-0-config-user-template-login\") pod \"oauth-openshift-6cb668d466-bdv5j\" (UID: \"0d49fe27-d0f1-4776-8fde-6d5f63374099\") " pod="openshift-authentication/oauth-openshift-6cb668d466-bdv5j" Jan 22 13:49:00 crc kubenswrapper[4743]: I0122 13:49:00.902749 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0d49fe27-d0f1-4776-8fde-6d5f63374099-v4-0-config-system-router-certs\") pod \"oauth-openshift-6cb668d466-bdv5j\" (UID: \"0d49fe27-d0f1-4776-8fde-6d5f63374099\") " pod="openshift-authentication/oauth-openshift-6cb668d466-bdv5j" Jan 22 13:49:00 crc kubenswrapper[4743]: I0122 13:49:00.902939 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/0d49fe27-d0f1-4776-8fde-6d5f63374099-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6cb668d466-bdv5j\" (UID: \"0d49fe27-d0f1-4776-8fde-6d5f63374099\") " pod="openshift-authentication/oauth-openshift-6cb668d466-bdv5j" Jan 22 13:49:00 crc kubenswrapper[4743]: I0122 13:49:00.902997 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0d49fe27-d0f1-4776-8fde-6d5f63374099-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6cb668d466-bdv5j\" (UID: \"0d49fe27-d0f1-4776-8fde-6d5f63374099\") " pod="openshift-authentication/oauth-openshift-6cb668d466-bdv5j" Jan 22 13:49:00 crc kubenswrapper[4743]: I0122 13:49:00.903035 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0d49fe27-d0f1-4776-8fde-6d5f63374099-v4-0-config-user-template-error\") pod \"oauth-openshift-6cb668d466-bdv5j\" (UID: \"0d49fe27-d0f1-4776-8fde-6d5f63374099\") " pod="openshift-authentication/oauth-openshift-6cb668d466-bdv5j" Jan 22 13:49:00 crc kubenswrapper[4743]: I0122 13:49:00.903063 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0d49fe27-d0f1-4776-8fde-6d5f63374099-audit-policies\") pod \"oauth-openshift-6cb668d466-bdv5j\" (UID: \"0d49fe27-d0f1-4776-8fde-6d5f63374099\") " pod="openshift-authentication/oauth-openshift-6cb668d466-bdv5j" Jan 22 13:49:00 crc kubenswrapper[4743]: I0122 13:49:00.903139 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfb29\" (UniqueName: \"kubernetes.io/projected/0d49fe27-d0f1-4776-8fde-6d5f63374099-kube-api-access-sfb29\") pod \"oauth-openshift-6cb668d466-bdv5j\" (UID: \"0d49fe27-d0f1-4776-8fde-6d5f63374099\") " pod="openshift-authentication/oauth-openshift-6cb668d466-bdv5j" Jan 22 13:49:00 crc kubenswrapper[4743]: I0122 13:49:00.903157 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0d49fe27-d0f1-4776-8fde-6d5f63374099-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6cb668d466-bdv5j\" (UID: \"0d49fe27-d0f1-4776-8fde-6d5f63374099\") " pod="openshift-authentication/oauth-openshift-6cb668d466-bdv5j" Jan 22 13:49:00 crc kubenswrapper[4743]: I0122 13:49:00.908659 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Jan 22 13:49:00 crc kubenswrapper[4743]: I0122 13:49:00.917607 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 22 13:49:01 crc kubenswrapper[4743]: I0122 13:49:01.003630 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0d49fe27-d0f1-4776-8fde-6d5f63374099-v4-0-config-user-template-error\") pod \"oauth-openshift-6cb668d466-bdv5j\" (UID: \"0d49fe27-d0f1-4776-8fde-6d5f63374099\") " pod="openshift-authentication/oauth-openshift-6cb668d466-bdv5j" Jan 22 13:49:01 crc kubenswrapper[4743]: I0122 13:49:01.003675 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/0d49fe27-d0f1-4776-8fde-6d5f63374099-audit-policies\") pod \"oauth-openshift-6cb668d466-bdv5j\" (UID: \"0d49fe27-d0f1-4776-8fde-6d5f63374099\") " pod="openshift-authentication/oauth-openshift-6cb668d466-bdv5j" Jan 22 13:49:01 crc kubenswrapper[4743]: I0122 13:49:01.003699 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfb29\" (UniqueName: \"kubernetes.io/projected/0d49fe27-d0f1-4776-8fde-6d5f63374099-kube-api-access-sfb29\") pod \"oauth-openshift-6cb668d466-bdv5j\" (UID: \"0d49fe27-d0f1-4776-8fde-6d5f63374099\") " pod="openshift-authentication/oauth-openshift-6cb668d466-bdv5j" Jan 22 13:49:01 crc kubenswrapper[4743]: I0122 13:49:01.003718 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0d49fe27-d0f1-4776-8fde-6d5f63374099-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6cb668d466-bdv5j\" (UID: \"0d49fe27-d0f1-4776-8fde-6d5f63374099\") " pod="openshift-authentication/oauth-openshift-6cb668d466-bdv5j" Jan 22 13:49:01 crc kubenswrapper[4743]: I0122 13:49:01.003738 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0d49fe27-d0f1-4776-8fde-6d5f63374099-v4-0-config-system-session\") pod \"oauth-openshift-6cb668d466-bdv5j\" (UID: \"0d49fe27-d0f1-4776-8fde-6d5f63374099\") " pod="openshift-authentication/oauth-openshift-6cb668d466-bdv5j" Jan 22 13:49:01 crc kubenswrapper[4743]: I0122 13:49:01.003754 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0d49fe27-d0f1-4776-8fde-6d5f63374099-v4-0-config-system-service-ca\") pod \"oauth-openshift-6cb668d466-bdv5j\" (UID: \"0d49fe27-d0f1-4776-8fde-6d5f63374099\") " pod="openshift-authentication/oauth-openshift-6cb668d466-bdv5j" Jan 22 13:49:01 crc kubenswrapper[4743]: I0122 13:49:01.003774 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0d49fe27-d0f1-4776-8fde-6d5f63374099-audit-dir\") pod \"oauth-openshift-6cb668d466-bdv5j\" (UID: \"0d49fe27-d0f1-4776-8fde-6d5f63374099\") " pod="openshift-authentication/oauth-openshift-6cb668d466-bdv5j" Jan 22 13:49:01 crc kubenswrapper[4743]: I0122 13:49:01.003808 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d49fe27-d0f1-4776-8fde-6d5f63374099-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6cb668d466-bdv5j\" (UID: \"0d49fe27-d0f1-4776-8fde-6d5f63374099\") " pod="openshift-authentication/oauth-openshift-6cb668d466-bdv5j" Jan 22 13:49:01 crc kubenswrapper[4743]: I0122 13:49:01.003824 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d49fe27-d0f1-4776-8fde-6d5f63374099-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6cb668d466-bdv5j\" (UID: \"0d49fe27-d0f1-4776-8fde-6d5f63374099\") " pod="openshift-authentication/oauth-openshift-6cb668d466-bdv5j" Jan 22 13:49:01 crc kubenswrapper[4743]: I0122 13:49:01.003845 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/0d49fe27-d0f1-4776-8fde-6d5f63374099-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6cb668d466-bdv5j\" (UID: \"0d49fe27-d0f1-4776-8fde-6d5f63374099\") " pod="openshift-authentication/oauth-openshift-6cb668d466-bdv5j" Jan 22 13:49:01 crc kubenswrapper[4743]: I0122 13:49:01.003876 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0d49fe27-d0f1-4776-8fde-6d5f63374099-v4-0-config-user-template-login\") pod \"oauth-openshift-6cb668d466-bdv5j\" (UID: \"0d49fe27-d0f1-4776-8fde-6d5f63374099\") " pod="openshift-authentication/oauth-openshift-6cb668d466-bdv5j" Jan 22 13:49:01 crc kubenswrapper[4743]: I0122 13:49:01.003895 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0d49fe27-d0f1-4776-8fde-6d5f63374099-v4-0-config-system-router-certs\") pod \"oauth-openshift-6cb668d466-bdv5j\" (UID: \"0d49fe27-d0f1-4776-8fde-6d5f63374099\") " pod="openshift-authentication/oauth-openshift-6cb668d466-bdv5j" Jan 22 13:49:01 crc kubenswrapper[4743]: I0122 13:49:01.003925 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0d49fe27-d0f1-4776-8fde-6d5f63374099-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6cb668d466-bdv5j\" (UID: \"0d49fe27-d0f1-4776-8fde-6d5f63374099\") " pod="openshift-authentication/oauth-openshift-6cb668d466-bdv5j" Jan 22 13:49:01 crc kubenswrapper[4743]: I0122 13:49:01.003945 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0d49fe27-d0f1-4776-8fde-6d5f63374099-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6cb668d466-bdv5j\" (UID: \"0d49fe27-d0f1-4776-8fde-6d5f63374099\") " pod="openshift-authentication/oauth-openshift-6cb668d466-bdv5j" Jan 22 13:49:01 crc kubenswrapper[4743]: I0122 13:49:01.003937 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0d49fe27-d0f1-4776-8fde-6d5f63374099-audit-dir\") pod \"oauth-openshift-6cb668d466-bdv5j\" (UID: \"0d49fe27-d0f1-4776-8fde-6d5f63374099\") " pod="openshift-authentication/oauth-openshift-6cb668d466-bdv5j" Jan 22 13:49:01 crc kubenswrapper[4743]: I0122 13:49:01.005366 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0d49fe27-d0f1-4776-8fde-6d5f63374099-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6cb668d466-bdv5j\" (UID: \"0d49fe27-d0f1-4776-8fde-6d5f63374099\") " pod="openshift-authentication/oauth-openshift-6cb668d466-bdv5j" Jan 22 13:49:01 crc kubenswrapper[4743]: I0122 13:49:01.005603 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0d49fe27-d0f1-4776-8fde-6d5f63374099-audit-policies\") pod \"oauth-openshift-6cb668d466-bdv5j\" (UID: \"0d49fe27-d0f1-4776-8fde-6d5f63374099\") " pod="openshift-authentication/oauth-openshift-6cb668d466-bdv5j" Jan 22 13:49:01 crc kubenswrapper[4743]: I0122 13:49:01.005685 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/0d49fe27-d0f1-4776-8fde-6d5f63374099-v4-0-config-system-service-ca\") pod \"oauth-openshift-6cb668d466-bdv5j\" (UID: \"0d49fe27-d0f1-4776-8fde-6d5f63374099\") " pod="openshift-authentication/oauth-openshift-6cb668d466-bdv5j" Jan 22 13:49:01 crc kubenswrapper[4743]: I0122 13:49:01.007197 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d49fe27-d0f1-4776-8fde-6d5f63374099-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6cb668d466-bdv5j\" (UID: \"0d49fe27-d0f1-4776-8fde-6d5f63374099\") " pod="openshift-authentication/oauth-openshift-6cb668d466-bdv5j" Jan 22 13:49:01 crc kubenswrapper[4743]: I0122 13:49:01.009234 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0d49fe27-d0f1-4776-8fde-6d5f63374099-v4-0-config-user-template-login\") pod \"oauth-openshift-6cb668d466-bdv5j\" (UID: \"0d49fe27-d0f1-4776-8fde-6d5f63374099\") " pod="openshift-authentication/oauth-openshift-6cb668d466-bdv5j" Jan 22 13:49:01 crc kubenswrapper[4743]: I0122 13:49:01.009725 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0d49fe27-d0f1-4776-8fde-6d5f63374099-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6cb668d466-bdv5j\" (UID: \"0d49fe27-d0f1-4776-8fde-6d5f63374099\") " pod="openshift-authentication/oauth-openshift-6cb668d466-bdv5j" Jan 22 13:49:01 crc kubenswrapper[4743]: I0122 13:49:01.010581 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0d49fe27-d0f1-4776-8fde-6d5f63374099-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6cb668d466-bdv5j\" (UID: \"0d49fe27-d0f1-4776-8fde-6d5f63374099\") " pod="openshift-authentication/oauth-openshift-6cb668d466-bdv5j" Jan 22 13:49:01 crc kubenswrapper[4743]: I0122 13:49:01.010667 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0d49fe27-d0f1-4776-8fde-6d5f63374099-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6cb668d466-bdv5j\" (UID: \"0d49fe27-d0f1-4776-8fde-6d5f63374099\") " pod="openshift-authentication/oauth-openshift-6cb668d466-bdv5j" Jan 22 13:49:01 crc kubenswrapper[4743]: I0122 13:49:01.010925 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0d49fe27-d0f1-4776-8fde-6d5f63374099-v4-0-config-user-template-error\") pod \"oauth-openshift-6cb668d466-bdv5j\" (UID: \"0d49fe27-d0f1-4776-8fde-6d5f63374099\") " pod="openshift-authentication/oauth-openshift-6cb668d466-bdv5j" Jan 22 13:49:01 crc kubenswrapper[4743]: I0122 13:49:01.013299 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0d49fe27-d0f1-4776-8fde-6d5f63374099-v4-0-config-system-router-certs\") pod \"oauth-openshift-6cb668d466-bdv5j\" (UID: \"0d49fe27-d0f1-4776-8fde-6d5f63374099\") " pod="openshift-authentication/oauth-openshift-6cb668d466-bdv5j" Jan 22 13:49:01 crc kubenswrapper[4743]: I0122 13:49:01.016453 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/0d49fe27-d0f1-4776-8fde-6d5f63374099-v4-0-config-system-session\") pod \"oauth-openshift-6cb668d466-bdv5j\" (UID: \"0d49fe27-d0f1-4776-8fde-6d5f63374099\") " pod="openshift-authentication/oauth-openshift-6cb668d466-bdv5j" Jan 22 13:49:01 crc kubenswrapper[4743]: I0122 13:49:01.017094 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d49fe27-d0f1-4776-8fde-6d5f63374099-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6cb668d466-bdv5j\" (UID: \"0d49fe27-d0f1-4776-8fde-6d5f63374099\") " pod="openshift-authentication/oauth-openshift-6cb668d466-bdv5j" Jan 22 13:49:01 crc kubenswrapper[4743]: I0122 13:49:01.018816 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Jan 22 13:49:01 crc kubenswrapper[4743]: I0122 13:49:01.021254 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfb29\" (UniqueName: \"kubernetes.io/projected/0d49fe27-d0f1-4776-8fde-6d5f63374099-kube-api-access-sfb29\") pod \"oauth-openshift-6cb668d466-bdv5j\" (UID: \"0d49fe27-d0f1-4776-8fde-6d5f63374099\") " pod="openshift-authentication/oauth-openshift-6cb668d466-bdv5j" Jan 22 13:49:01 crc kubenswrapper[4743]: I0122 13:49:01.032679 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6cb668d466-bdv5j" Jan 22 13:49:01 crc kubenswrapper[4743]: I0122 13:49:01.050543 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Jan 22 13:49:01 crc kubenswrapper[4743]: I0122 13:49:01.079925 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Jan 22 13:49:01 crc kubenswrapper[4743]: I0122 13:49:01.108238 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Jan 22 13:49:01 crc kubenswrapper[4743]: I0122 13:49:01.114604 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Jan 22 13:49:01 crc kubenswrapper[4743]: I0122 13:49:01.119944 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Jan 22 13:49:01 crc kubenswrapper[4743]: I0122 13:49:01.195111 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Jan 22 13:49:01 crc kubenswrapper[4743]: I0122 13:49:01.218409 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Jan 22 13:49:01 crc kubenswrapper[4743]: I0122 13:49:01.261174 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6cb668d466-bdv5j"] Jan 22 13:49:01 crc kubenswrapper[4743]: I0122 13:49:01.285724 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Jan 22 13:49:01 crc kubenswrapper[4743]: I0122 13:49:01.337903 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Jan 22 13:49:01 crc kubenswrapper[4743]: I0122 13:49:01.341995 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 22 13:49:01 crc 
kubenswrapper[4743]: I0122 13:49:01.348642 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Jan 22 13:49:01 crc kubenswrapper[4743]: I0122 13:49:01.363129 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 22 13:49:01 crc kubenswrapper[4743]: I0122 13:49:01.402232 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Jan 22 13:49:01 crc kubenswrapper[4743]: I0122 13:49:01.417543 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Jan 22 13:49:01 crc kubenswrapper[4743]: I0122 13:49:01.428984 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Jan 22 13:49:01 crc kubenswrapper[4743]: I0122 13:49:01.437489 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 22 13:49:01 crc kubenswrapper[4743]: I0122 13:49:01.464106 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Jan 22 13:49:01 crc kubenswrapper[4743]: I0122 13:49:01.530636 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Jan 22 13:49:01 crc kubenswrapper[4743]: I0122 13:49:01.541206 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Jan 22 13:49:01 crc kubenswrapper[4743]: I0122 13:49:01.551426 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Jan 22 13:49:01 crc kubenswrapper[4743]: I0122 13:49:01.562981 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 22 13:49:01 crc kubenswrapper[4743]: I0122 13:49:01.643319 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Jan 22 13:49:01 crc kubenswrapper[4743]: I0122 13:49:01.650374 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Jan 22 13:49:01 crc kubenswrapper[4743]: I0122 13:49:01.731139 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Jan 22 13:49:01 crc kubenswrapper[4743]: I0122 13:49:01.753664 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52" path="/var/lib/kubelet/pods/16a01ff3-73f3-4cf6-9f0a-94f1f1cc0a52/volumes" Jan 22 13:49:01 crc kubenswrapper[4743]: I0122 13:49:01.855588 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Jan 22 13:49:01 crc kubenswrapper[4743]: I0122 13:49:01.855902 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6cb668d466-bdv5j" event={"ID":"0d49fe27-d0f1-4776-8fde-6d5f63374099","Type":"ContainerStarted","Data":"939e520492388874c6eab74443c7a3ebc9b43fef22e1c2c60c7507ae4c93baae"} Jan 22 13:49:01 crc kubenswrapper[4743]: I0122 13:49:01.855958 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-authentication/oauth-openshift-6cb668d466-bdv5j" event={"ID":"0d49fe27-d0f1-4776-8fde-6d5f63374099","Type":"ContainerStarted","Data":"e520de5ff16096cafb9ca95cc2c64cdb347630c50921db5bed31e56840b1c3ed"} Jan 22 13:49:01 crc kubenswrapper[4743]: I0122 13:49:01.856235 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-6cb668d466-bdv5j" Jan 22 13:49:01 crc kubenswrapper[4743]: I0122 13:49:01.858864 4743 patch_prober.go:28] interesting pod/oauth-openshift-6cb668d466-bdv5j container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.56:6443/healthz\": dial tcp 10.217.0.56:6443: connect: connection refused" start-of-body= Jan 22 13:49:01 crc kubenswrapper[4743]: I0122 13:49:01.858911 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-6cb668d466-bdv5j" podUID="0d49fe27-d0f1-4776-8fde-6d5f63374099" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.56:6443/healthz\": dial tcp 10.217.0.56:6443: connect: connection refused" Jan 22 13:49:01 crc kubenswrapper[4743]: I0122 13:49:01.868633 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 22 13:49:01 crc kubenswrapper[4743]: I0122 13:49:01.890311 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Jan 22 13:49:01 crc kubenswrapper[4743]: I0122 13:49:01.891401 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Jan 22 13:49:01 crc kubenswrapper[4743]: I0122 13:49:01.914185 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 22 13:49:01 crc kubenswrapper[4743]: I0122 13:49:01.927416 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Jan 22 13:49:01 crc kubenswrapper[4743]: I0122 13:49:01.956657 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 22 13:49:02 crc kubenswrapper[4743]: I0122 13:49:02.094821 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Jan 22 13:49:02 crc kubenswrapper[4743]: I0122 13:49:02.215570 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Jan 22 13:49:02 crc kubenswrapper[4743]: I0122 13:49:02.221736 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jan 22 13:49:02 crc kubenswrapper[4743]: I0122 13:49:02.348148 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Jan 22 13:49:02 crc kubenswrapper[4743]: I0122 13:49:02.453002 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Jan 22 13:49:02 crc kubenswrapper[4743]: I0122 13:49:02.463556 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Jan 22 13:49:02 crc kubenswrapper[4743]: I0122 13:49:02.476230 4743 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"mco-proxy-tls" Jan 22 13:49:02 crc kubenswrapper[4743]: I0122 13:49:02.476773 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Jan 22 13:49:02 crc kubenswrapper[4743]: I0122 13:49:02.679635 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 22 13:49:02 crc kubenswrapper[4743]: I0122 13:49:02.691632 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Jan 22 13:49:02 crc kubenswrapper[4743]: I0122 13:49:02.735564 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Jan 22 13:49:02 crc kubenswrapper[4743]: I0122 13:49:02.818918 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Jan 22 13:49:02 crc kubenswrapper[4743]: I0122 13:49:02.863270 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-6cb668d466-bdv5j_0d49fe27-d0f1-4776-8fde-6d5f63374099/oauth-openshift/0.log" Jan 22 13:49:02 crc kubenswrapper[4743]: I0122 13:49:02.863323 4743 generic.go:334] "Generic (PLEG): container finished" podID="0d49fe27-d0f1-4776-8fde-6d5f63374099" containerID="939e520492388874c6eab74443c7a3ebc9b43fef22e1c2c60c7507ae4c93baae" exitCode=255 Jan 22 13:49:02 crc kubenswrapper[4743]: I0122 13:49:02.863353 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6cb668d466-bdv5j" event={"ID":"0d49fe27-d0f1-4776-8fde-6d5f63374099","Type":"ContainerDied","Data":"939e520492388874c6eab74443c7a3ebc9b43fef22e1c2c60c7507ae4c93baae"} Jan 22 13:49:02 crc kubenswrapper[4743]: I0122 13:49:02.863849 4743 scope.go:117] "RemoveContainer" containerID="939e520492388874c6eab74443c7a3ebc9b43fef22e1c2c60c7507ae4c93baae" Jan 22 13:49:02 crc kubenswrapper[4743]: I0122 13:49:02.875915 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 22 13:49:02 crc kubenswrapper[4743]: I0122 13:49:02.898831 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Jan 22 13:49:02 crc kubenswrapper[4743]: I0122 13:49:02.961595 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Jan 22 13:49:02 crc kubenswrapper[4743]: I0122 13:49:02.983395 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Jan 22 13:49:02 crc kubenswrapper[4743]: I0122 13:49:02.984742 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 22 13:49:03 crc kubenswrapper[4743]: I0122 13:49:03.063714 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Jan 22 13:49:03 crc kubenswrapper[4743]: I0122 13:49:03.070748 4743 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Jan 22 13:49:03 crc kubenswrapper[4743]: I0122 13:49:03.308348 4743 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Jan 22 13:49:03 crc kubenswrapper[4743]: I0122 13:49:03.380618 4743 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Jan 22 13:49:03 crc kubenswrapper[4743]: I0122 13:49:03.471796 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Jan 22 13:49:03 crc kubenswrapper[4743]: I0122 13:49:03.485293 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Jan 22 13:49:03 crc kubenswrapper[4743]: I0122 13:49:03.568189 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Jan 22 13:49:03 crc kubenswrapper[4743]: I0122 13:49:03.699230 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Jan 22 13:49:03 crc kubenswrapper[4743]: I0122 13:49:03.702150 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Jan 22 13:49:03 crc kubenswrapper[4743]: I0122 13:49:03.806116 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Jan 22 13:49:03 crc kubenswrapper[4743]: I0122 13:49:03.818503 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Jan 22 13:49:03 crc kubenswrapper[4743]: I0122 13:49:03.831181 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Jan 22 13:49:03 crc kubenswrapper[4743]: I0122 13:49:03.872207 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-6cb668d466-bdv5j_0d49fe27-d0f1-4776-8fde-6d5f63374099/oauth-openshift/1.log" Jan 22 13:49:03 crc kubenswrapper[4743]: I0122 13:49:03.872571 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-6cb668d466-bdv5j_0d49fe27-d0f1-4776-8fde-6d5f63374099/oauth-openshift/0.log" Jan 22 13:49:03 crc kubenswrapper[4743]: I0122 13:49:03.872608 4743 generic.go:334] "Generic (PLEG): container finished" podID="0d49fe27-d0f1-4776-8fde-6d5f63374099" containerID="104889abd358d43fb70d34c3a32001c00e398945fbe7f4930af4e410aff05e3e" exitCode=255 Jan 22 13:49:03 crc kubenswrapper[4743]: I0122 13:49:03.872635 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6cb668d466-bdv5j" event={"ID":"0d49fe27-d0f1-4776-8fde-6d5f63374099","Type":"ContainerDied","Data":"104889abd358d43fb70d34c3a32001c00e398945fbe7f4930af4e410aff05e3e"} Jan 22 13:49:03 crc kubenswrapper[4743]: I0122 13:49:03.872666 4743 scope.go:117] "RemoveContainer" containerID="939e520492388874c6eab74443c7a3ebc9b43fef22e1c2c60c7507ae4c93baae" Jan 22 13:49:03 crc kubenswrapper[4743]: I0122 13:49:03.873158 4743 scope.go:117] "RemoveContainer" containerID="104889abd358d43fb70d34c3a32001c00e398945fbe7f4930af4e410aff05e3e" Jan 22 13:49:03 crc kubenswrapper[4743]: E0122 13:49:03.873371 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed container=oauth-openshift pod=oauth-openshift-6cb668d466-bdv5j_openshift-authentication(0d49fe27-d0f1-4776-8fde-6d5f63374099)\"" pod="openshift-authentication/oauth-openshift-6cb668d466-bdv5j" 
podUID="0d49fe27-d0f1-4776-8fde-6d5f63374099" Jan 22 13:49:03 crc kubenswrapper[4743]: I0122 13:49:03.924609 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 22 13:49:03 crc kubenswrapper[4743]: I0122 13:49:03.936533 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jan 22 13:49:04 crc kubenswrapper[4743]: I0122 13:49:04.029200 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Jan 22 13:49:04 crc kubenswrapper[4743]: I0122 13:49:04.035849 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Jan 22 13:49:04 crc kubenswrapper[4743]: I0122 13:49:04.226566 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Jan 22 13:49:04 crc kubenswrapper[4743]: I0122 13:49:04.236088 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 22 13:49:04 crc kubenswrapper[4743]: I0122 13:49:04.327843 4743 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 22 13:49:04 crc kubenswrapper[4743]: I0122 13:49:04.328092 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://d0c9e29410a953b1d244e6634da20839e174fcf4f8f28f16fd32c23982bc0695" gracePeriod=5 Jan 22 13:49:04 crc kubenswrapper[4743]: I0122 13:49:04.460683 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Jan 22 13:49:04 crc kubenswrapper[4743]: I0122 13:49:04.522397 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Jan 22 13:49:04 crc kubenswrapper[4743]: I0122 13:49:04.551223 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Jan 22 13:49:04 crc kubenswrapper[4743]: I0122 13:49:04.597244 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Jan 22 13:49:04 crc kubenswrapper[4743]: I0122 13:49:04.673008 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Jan 22 13:49:04 crc kubenswrapper[4743]: I0122 13:49:04.677541 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 22 13:49:04 crc kubenswrapper[4743]: I0122 13:49:04.710012 4743 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Jan 22 13:49:04 crc kubenswrapper[4743]: I0122 13:49:04.768308 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Jan 22 13:49:04 crc kubenswrapper[4743]: I0122 13:49:04.784322 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Jan 22 13:49:04 crc kubenswrapper[4743]: I0122 13:49:04.839110 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 22 13:49:04 
crc kubenswrapper[4743]: I0122 13:49:04.848976 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Jan 22 13:49:04 crc kubenswrapper[4743]: I0122 13:49:04.879076 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-6cb668d466-bdv5j_0d49fe27-d0f1-4776-8fde-6d5f63374099/oauth-openshift/1.log" Jan 22 13:49:04 crc kubenswrapper[4743]: I0122 13:49:04.879610 4743 scope.go:117] "RemoveContainer" containerID="104889abd358d43fb70d34c3a32001c00e398945fbe7f4930af4e410aff05e3e" Jan 22 13:49:04 crc kubenswrapper[4743]: E0122 13:49:04.879891 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed container=oauth-openshift pod=oauth-openshift-6cb668d466-bdv5j_openshift-authentication(0d49fe27-d0f1-4776-8fde-6d5f63374099)\"" pod="openshift-authentication/oauth-openshift-6cb668d466-bdv5j" podUID="0d49fe27-d0f1-4776-8fde-6d5f63374099" Jan 22 13:49:04 crc kubenswrapper[4743]: I0122 13:49:04.980438 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 22 13:49:05 crc kubenswrapper[4743]: I0122 13:49:05.076471 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Jan 22 13:49:05 crc kubenswrapper[4743]: I0122 13:49:05.077615 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Jan 22 13:49:05 crc kubenswrapper[4743]: I0122 13:49:05.098402 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Jan 22 13:49:05 crc kubenswrapper[4743]: I0122 13:49:05.114927 4743 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Jan 22 13:49:05 crc kubenswrapper[4743]: I0122 13:49:05.114998 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Jan 22 13:49:05 crc kubenswrapper[4743]: I0122 13:49:05.115066 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 22 13:49:05 crc kubenswrapper[4743]: I0122 13:49:05.116174 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"d4282329185a84f1d1023318b64b9eb3fd1caa0ec2f57c770db14dd7a34e1aee"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Jan 22 13:49:05 crc kubenswrapper[4743]: I0122 13:49:05.116365 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" 
containerName="kube-controller-manager" containerID="cri-o://d4282329185a84f1d1023318b64b9eb3fd1caa0ec2f57c770db14dd7a34e1aee" gracePeriod=30 Jan 22 13:49:05 crc kubenswrapper[4743]: I0122 13:49:05.150065 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Jan 22 13:49:05 crc kubenswrapper[4743]: I0122 13:49:05.190850 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Jan 22 13:49:05 crc kubenswrapper[4743]: I0122 13:49:05.213160 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Jan 22 13:49:05 crc kubenswrapper[4743]: I0122 13:49:05.234053 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Jan 22 13:49:05 crc kubenswrapper[4743]: I0122 13:49:05.249381 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 22 13:49:05 crc kubenswrapper[4743]: I0122 13:49:05.351978 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Jan 22 13:49:05 crc kubenswrapper[4743]: I0122 13:49:05.419201 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Jan 22 13:49:05 crc kubenswrapper[4743]: I0122 13:49:05.434587 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Jan 22 13:49:05 crc kubenswrapper[4743]: I0122 13:49:05.532485 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Jan 22 13:49:05 crc kubenswrapper[4743]: I0122 13:49:05.589893 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Jan 22 13:49:05 crc kubenswrapper[4743]: I0122 13:49:05.656498 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 22 13:49:05 crc kubenswrapper[4743]: I0122 13:49:05.685754 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Jan 22 13:49:05 crc kubenswrapper[4743]: I0122 13:49:05.797858 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Jan 22 13:49:05 crc kubenswrapper[4743]: I0122 13:49:05.838773 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 22 13:49:05 crc kubenswrapper[4743]: I0122 13:49:05.893513 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 22 13:49:06 crc kubenswrapper[4743]: I0122 13:49:06.026419 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Jan 22 13:49:06 crc kubenswrapper[4743]: I0122 13:49:06.187822 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jan 22 13:49:06 crc kubenswrapper[4743]: I0122 13:49:06.410172 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Jan 22 13:49:06 crc kubenswrapper[4743]: I0122 13:49:06.515607 4743 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Jan 22 13:49:06 crc kubenswrapper[4743]: I0122 13:49:06.753667 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Jan 22 13:49:06 crc kubenswrapper[4743]: I0122 13:49:06.780156 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 22 13:49:06 crc kubenswrapper[4743]: I0122 13:49:06.781935 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 22 13:49:06 crc kubenswrapper[4743]: I0122 13:49:06.833254 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jan 22 13:49:06 crc kubenswrapper[4743]: I0122 13:49:06.886274 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Jan 22 13:49:07 crc kubenswrapper[4743]: I0122 13:49:07.005143 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Jan 22 13:49:07 crc kubenswrapper[4743]: I0122 13:49:07.123395 4743 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Jan 22 13:49:07 crc kubenswrapper[4743]: I0122 13:49:07.314247 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 22 13:49:07 crc kubenswrapper[4743]: I0122 13:49:07.340815 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 22 13:49:07 crc kubenswrapper[4743]: I0122 13:49:07.446184 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Jan 22 13:49:07 crc kubenswrapper[4743]: I0122 13:49:07.510505 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Jan 22 13:49:07 crc kubenswrapper[4743]: I0122 13:49:07.610580 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Jan 22 13:49:07 crc kubenswrapper[4743]: I0122 13:49:07.610725 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Jan 22 13:49:07 crc kubenswrapper[4743]: I0122 13:49:07.611157 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 22 13:49:08 crc kubenswrapper[4743]: I0122 13:49:08.157738 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Jan 22 13:49:08 crc kubenswrapper[4743]: I0122 13:49:08.177677 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Jan 22 13:49:08 crc kubenswrapper[4743]: I0122 13:49:08.460190 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 22 13:49:08 crc kubenswrapper[4743]: I0122 13:49:08.468104 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Jan 22 13:49:08 crc kubenswrapper[4743]: I0122 13:49:08.672184 4743 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 22 13:49:08 crc kubenswrapper[4743]: I0122 13:49:08.972088 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Jan 22 13:49:09 crc kubenswrapper[4743]: I0122 13:49:09.028303 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 22 13:49:09 crc kubenswrapper[4743]: I0122 13:49:09.050455 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Jan 22 13:49:09 crc kubenswrapper[4743]: I0122 13:49:09.291243 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Jan 22 13:49:09 crc kubenswrapper[4743]: I0122 13:49:09.469910 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Jan 22 13:49:09 crc kubenswrapper[4743]: I0122 13:49:09.749829 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jan 22 13:49:09 crc kubenswrapper[4743]: I0122 13:49:09.817129 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Jan 22 13:49:09 crc kubenswrapper[4743]: I0122 13:49:09.886132 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 22 13:49:09 crc kubenswrapper[4743]: I0122 13:49:09.886246 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 22 13:49:09 crc kubenswrapper[4743]: I0122 13:49:09.913176 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 22 13:49:09 crc kubenswrapper[4743]: I0122 13:49:09.913230 4743 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="d0c9e29410a953b1d244e6634da20839e174fcf4f8f28f16fd32c23982bc0695" exitCode=137 Jan 22 13:49:09 crc kubenswrapper[4743]: I0122 13:49:09.913278 4743 scope.go:117] "RemoveContainer" containerID="d0c9e29410a953b1d244e6634da20839e174fcf4f8f28f16fd32c23982bc0695" Jan 22 13:49:09 crc kubenswrapper[4743]: I0122 13:49:09.913305 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 22 13:49:09 crc kubenswrapper[4743]: I0122 13:49:09.928179 4743 scope.go:117] "RemoveContainer" containerID="d0c9e29410a953b1d244e6634da20839e174fcf4f8f28f16fd32c23982bc0695" Jan 22 13:49:09 crc kubenswrapper[4743]: E0122 13:49:09.928638 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0c9e29410a953b1d244e6634da20839e174fcf4f8f28f16fd32c23982bc0695\": container with ID starting with d0c9e29410a953b1d244e6634da20839e174fcf4f8f28f16fd32c23982bc0695 not found: ID does not exist" containerID="d0c9e29410a953b1d244e6634da20839e174fcf4f8f28f16fd32c23982bc0695" Jan 22 13:49:09 crc kubenswrapper[4743]: I0122 13:49:09.928695 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0c9e29410a953b1d244e6634da20839e174fcf4f8f28f16fd32c23982bc0695"} err="failed to get container status \"d0c9e29410a953b1d244e6634da20839e174fcf4f8f28f16fd32c23982bc0695\": rpc error: code = NotFound desc = could not find container \"d0c9e29410a953b1d244e6634da20839e174fcf4f8f28f16fd32c23982bc0695\": container with ID starting with d0c9e29410a953b1d244e6634da20839e174fcf4f8f28f16fd32c23982bc0695 not found: ID does not exist" Jan 22 13:49:10 crc kubenswrapper[4743]: I0122 13:49:10.022246 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 22 13:49:10 crc kubenswrapper[4743]: I0122 13:49:10.022375 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 22 13:49:10 crc kubenswrapper[4743]: I0122 13:49:10.022419 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 22 13:49:10 crc kubenswrapper[4743]: I0122 13:49:10.022478 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 22 13:49:10 crc kubenswrapper[4743]: I0122 13:49:10.022540 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 22 13:49:10 crc kubenswrapper[4743]: I0122 13:49:10.022745 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 13:49:10 crc kubenswrapper[4743]: I0122 13:49:10.022811 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 13:49:10 crc kubenswrapper[4743]: I0122 13:49:10.022854 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 13:49:10 crc kubenswrapper[4743]: I0122 13:49:10.022878 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 13:49:10 crc kubenswrapper[4743]: I0122 13:49:10.022940 4743 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Jan 22 13:49:10 crc kubenswrapper[4743]: I0122 13:49:10.022954 4743 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Jan 22 13:49:10 crc kubenswrapper[4743]: I0122 13:49:10.031179 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 13:49:10 crc kubenswrapper[4743]: I0122 13:49:10.119298 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jan 22 13:49:10 crc kubenswrapper[4743]: I0122 13:49:10.120927 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 22 13:49:10 crc kubenswrapper[4743]: I0122 13:49:10.123727 4743 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 22 13:49:10 crc kubenswrapper[4743]: I0122 13:49:10.123758 4743 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 22 13:49:10 crc kubenswrapper[4743]: I0122 13:49:10.123768 4743 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Jan 22 13:49:10 crc kubenswrapper[4743]: I0122 13:49:10.725424 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 22 13:49:10 crc kubenswrapper[4743]: I0122 13:49:10.812900 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jan 22 13:49:11 crc kubenswrapper[4743]: I0122 13:49:11.033882 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-6cb668d466-bdv5j" Jan 22 13:49:11 crc kubenswrapper[4743]: I0122 13:49:11.034174 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication/oauth-openshift-6cb668d466-bdv5j" Jan 22 13:49:11 crc kubenswrapper[4743]: I0122 13:49:11.034571 4743 scope.go:117] "RemoveContainer" containerID="104889abd358d43fb70d34c3a32001c00e398945fbe7f4930af4e410aff05e3e" Jan 22 13:49:11 crc kubenswrapper[4743]: E0122 13:49:11.034859 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed container=oauth-openshift pod=oauth-openshift-6cb668d466-bdv5j_openshift-authentication(0d49fe27-d0f1-4776-8fde-6d5f63374099)\"" pod="openshift-authentication/oauth-openshift-6cb668d466-bdv5j" podUID="0d49fe27-d0f1-4776-8fde-6d5f63374099" Jan 22 13:49:11 crc kubenswrapper[4743]: I0122 13:49:11.717301 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Jan 22 13:49:11 crc kubenswrapper[4743]: I0122 13:49:11.756961 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Jan 22 13:49:25 crc kubenswrapper[4743]: I0122 13:49:25.747644 4743 scope.go:117] "RemoveContainer" containerID="104889abd358d43fb70d34c3a32001c00e398945fbe7f4930af4e410aff05e3e" Jan 22 13:49:27 crc kubenswrapper[4743]: I0122 13:49:27.018629 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-6cb668d466-bdv5j_0d49fe27-d0f1-4776-8fde-6d5f63374099/oauth-openshift/1.log" Jan 22 13:49:27 crc kubenswrapper[4743]: I0122 
13:49:27.019082 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6cb668d466-bdv5j" event={"ID":"0d49fe27-d0f1-4776-8fde-6d5f63374099","Type":"ContainerStarted","Data":"62a581e75fd6e3d6a640a26ea1e6cb5a98d56537cb6d5099af1b01714662239f"} Jan 22 13:49:27 crc kubenswrapper[4743]: I0122 13:49:27.019469 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-6cb668d466-bdv5j" Jan 22 13:49:27 crc kubenswrapper[4743]: I0122 13:49:27.027415 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-6cb668d466-bdv5j" Jan 22 13:49:27 crc kubenswrapper[4743]: I0122 13:49:27.042842 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-6cb668d466-bdv5j" podStartSLOduration=70.042770088 podStartE2EDuration="1m10.042770088s" podCreationTimestamp="2026-01-22 13:48:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:49:01.89024738 +0000 UTC m=+178.445290543" watchObservedRunningTime="2026-01-22 13:49:27.042770088 +0000 UTC m=+203.597813291" Jan 22 13:49:30 crc kubenswrapper[4743]: I0122 13:49:30.048945 4743 patch_prober.go:28] interesting pod/machine-config-daemon-hqgk7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 13:49:30 crc kubenswrapper[4743]: I0122 13:49:30.049359 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 13:49:30 crc kubenswrapper[4743]: I0122 13:49:30.049438 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" Jan 22 13:49:30 crc kubenswrapper[4743]: I0122 13:49:30.050236 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0eb4f008bbd0d78e0714bf887f00c966ce6e2b4e9accca387b4a31abb51cd001"} pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 22 13:49:30 crc kubenswrapper[4743]: I0122 13:49:30.050327 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerName="machine-config-daemon" containerID="cri-o://0eb4f008bbd0d78e0714bf887f00c966ce6e2b4e9accca387b4a31abb51cd001" gracePeriod=600 Jan 22 13:49:31 crc kubenswrapper[4743]: I0122 13:49:31.039048 4743 generic.go:334] "Generic (PLEG): container finished" podID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerID="0eb4f008bbd0d78e0714bf887f00c966ce6e2b4e9accca387b4a31abb51cd001" exitCode=0 Jan 22 13:49:31 crc kubenswrapper[4743]: I0122 13:49:31.039134 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" 
event={"ID":"3aeba9ba-3a5a-4885-8540-d295aadb311b","Type":"ContainerDied","Data":"0eb4f008bbd0d78e0714bf887f00c966ce6e2b4e9accca387b4a31abb51cd001"} Jan 22 13:49:31 crc kubenswrapper[4743]: I0122 13:49:31.039647 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" event={"ID":"3aeba9ba-3a5a-4885-8540-d295aadb311b","Type":"ContainerStarted","Data":"234b865e4e7e65822fc01e8c12cd94ff6e833aa4b7477e6eafd95d6e7ee04a12"} Jan 22 13:49:36 crc kubenswrapper[4743]: I0122 13:49:36.070184 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Jan 22 13:49:36 crc kubenswrapper[4743]: I0122 13:49:36.073161 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 22 13:49:36 crc kubenswrapper[4743]: I0122 13:49:36.073219 4743 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="d4282329185a84f1d1023318b64b9eb3fd1caa0ec2f57c770db14dd7a34e1aee" exitCode=137 Jan 22 13:49:36 crc kubenswrapper[4743]: I0122 13:49:36.073256 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"d4282329185a84f1d1023318b64b9eb3fd1caa0ec2f57c770db14dd7a34e1aee"} Jan 22 13:49:36 crc kubenswrapper[4743]: I0122 13:49:36.073298 4743 scope.go:117] "RemoveContainer" containerID="bfd37a286a817f44e77fd221d3f495b76049d2cddeff485afe9290b6f9c4c859" Jan 22 13:49:37 crc kubenswrapper[4743]: I0122 13:49:37.080543 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Jan 22 13:49:37 crc kubenswrapper[4743]: I0122 13:49:37.082290 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c4be0d0e1d397816ecc33663d7e0848763da9e60f0e15017f9200567da8e80a3"} Jan 22 13:49:42 crc kubenswrapper[4743]: I0122 13:49:42.026617 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 22 13:49:45 crc kubenswrapper[4743]: I0122 13:49:45.115297 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 22 13:49:45 crc kubenswrapper[4743]: I0122 13:49:45.122758 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 22 13:49:52 crc kubenswrapper[4743]: I0122 13:49:52.032369 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 22 13:49:56 crc kubenswrapper[4743]: I0122 13:49:56.232261 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-z5dt9"] Jan 22 13:49:56 crc kubenswrapper[4743]: I0122 13:49:56.232476 4743 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-controller-manager/controller-manager-879f6c89f-z5dt9" podUID="037eda14-3c3c-4b24-bb18-dea65e3e4548" containerName="controller-manager" containerID="cri-o://e50e7c409b08056d208437bdd67bec4628d52cfbcf321da69ff0fb59e3a48249" gracePeriod=30 Jan 22 13:49:56 crc kubenswrapper[4743]: I0122 13:49:56.236192 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6j5s8"] Jan 22 13:49:56 crc kubenswrapper[4743]: I0122 13:49:56.236400 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6j5s8" podUID="10cbe2ea-3b69-40a7-88a4-86e1f57dbab0" containerName="route-controller-manager" containerID="cri-o://a8dbbfa149d5f07c97e0ba126f85bc6196706f95d71fce59c363348c8b96fd21" gracePeriod=30 Jan 22 13:49:56 crc kubenswrapper[4743]: I0122 13:49:56.601618 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6j5s8" Jan 22 13:49:56 crc kubenswrapper[4743]: I0122 13:49:56.708463 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-z5dt9" Jan 22 13:49:56 crc kubenswrapper[4743]: I0122 13:49:56.749393 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10cbe2ea-3b69-40a7-88a4-86e1f57dbab0-serving-cert\") pod \"10cbe2ea-3b69-40a7-88a4-86e1f57dbab0\" (UID: \"10cbe2ea-3b69-40a7-88a4-86e1f57dbab0\") " Jan 22 13:49:56 crc kubenswrapper[4743]: I0122 13:49:56.749504 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10cbe2ea-3b69-40a7-88a4-86e1f57dbab0-config\") pod \"10cbe2ea-3b69-40a7-88a4-86e1f57dbab0\" (UID: \"10cbe2ea-3b69-40a7-88a4-86e1f57dbab0\") " Jan 22 13:49:56 crc kubenswrapper[4743]: I0122 13:49:56.749535 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/10cbe2ea-3b69-40a7-88a4-86e1f57dbab0-client-ca\") pod \"10cbe2ea-3b69-40a7-88a4-86e1f57dbab0\" (UID: \"10cbe2ea-3b69-40a7-88a4-86e1f57dbab0\") " Jan 22 13:49:56 crc kubenswrapper[4743]: I0122 13:49:56.749569 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgqlg\" (UniqueName: \"kubernetes.io/projected/10cbe2ea-3b69-40a7-88a4-86e1f57dbab0-kube-api-access-mgqlg\") pod \"10cbe2ea-3b69-40a7-88a4-86e1f57dbab0\" (UID: \"10cbe2ea-3b69-40a7-88a4-86e1f57dbab0\") " Jan 22 13:49:56 crc kubenswrapper[4743]: I0122 13:49:56.750812 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10cbe2ea-3b69-40a7-88a4-86e1f57dbab0-client-ca" (OuterVolumeSpecName: "client-ca") pod "10cbe2ea-3b69-40a7-88a4-86e1f57dbab0" (UID: "10cbe2ea-3b69-40a7-88a4-86e1f57dbab0"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:49:56 crc kubenswrapper[4743]: I0122 13:49:56.750917 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10cbe2ea-3b69-40a7-88a4-86e1f57dbab0-config" (OuterVolumeSpecName: "config") pod "10cbe2ea-3b69-40a7-88a4-86e1f57dbab0" (UID: "10cbe2ea-3b69-40a7-88a4-86e1f57dbab0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:49:56 crc kubenswrapper[4743]: I0122 13:49:56.762329 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10cbe2ea-3b69-40a7-88a4-86e1f57dbab0-kube-api-access-mgqlg" (OuterVolumeSpecName: "kube-api-access-mgqlg") pod "10cbe2ea-3b69-40a7-88a4-86e1f57dbab0" (UID: "10cbe2ea-3b69-40a7-88a4-86e1f57dbab0"). InnerVolumeSpecName "kube-api-access-mgqlg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:49:56 crc kubenswrapper[4743]: I0122 13:49:56.762359 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10cbe2ea-3b69-40a7-88a4-86e1f57dbab0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "10cbe2ea-3b69-40a7-88a4-86e1f57dbab0" (UID: "10cbe2ea-3b69-40a7-88a4-86e1f57dbab0"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:49:56 crc kubenswrapper[4743]: I0122 13:49:56.770943 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d794b4758-vkhv7"] Jan 22 13:49:56 crc kubenswrapper[4743]: E0122 13:49:56.771243 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10cbe2ea-3b69-40a7-88a4-86e1f57dbab0" containerName="route-controller-manager" Jan 22 13:49:56 crc kubenswrapper[4743]: I0122 13:49:56.771257 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="10cbe2ea-3b69-40a7-88a4-86e1f57dbab0" containerName="route-controller-manager" Jan 22 13:49:56 crc kubenswrapper[4743]: E0122 13:49:56.771271 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 22 13:49:56 crc kubenswrapper[4743]: I0122 13:49:56.771277 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 22 13:49:56 crc kubenswrapper[4743]: E0122 13:49:56.771294 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="037eda14-3c3c-4b24-bb18-dea65e3e4548" containerName="controller-manager" Jan 22 13:49:56 crc kubenswrapper[4743]: I0122 13:49:56.771301 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="037eda14-3c3c-4b24-bb18-dea65e3e4548" containerName="controller-manager" Jan 22 13:49:56 crc kubenswrapper[4743]: I0122 13:49:56.771409 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 22 13:49:56 crc kubenswrapper[4743]: I0122 13:49:56.771419 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="037eda14-3c3c-4b24-bb18-dea65e3e4548" containerName="controller-manager" Jan 22 13:49:56 crc kubenswrapper[4743]: I0122 13:49:56.771432 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="10cbe2ea-3b69-40a7-88a4-86e1f57dbab0" containerName="route-controller-manager" Jan 22 13:49:56 crc kubenswrapper[4743]: I0122 13:49:56.771966 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-d794b4758-vkhv7" Jan 22 13:49:56 crc kubenswrapper[4743]: I0122 13:49:56.784046 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d794b4758-vkhv7"] Jan 22 13:49:56 crc kubenswrapper[4743]: I0122 13:49:56.850317 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2n9ts\" (UniqueName: \"kubernetes.io/projected/037eda14-3c3c-4b24-bb18-dea65e3e4548-kube-api-access-2n9ts\") pod \"037eda14-3c3c-4b24-bb18-dea65e3e4548\" (UID: \"037eda14-3c3c-4b24-bb18-dea65e3e4548\") " Jan 22 13:49:56 crc kubenswrapper[4743]: I0122 13:49:56.850405 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/037eda14-3c3c-4b24-bb18-dea65e3e4548-config\") pod \"037eda14-3c3c-4b24-bb18-dea65e3e4548\" (UID: \"037eda14-3c3c-4b24-bb18-dea65e3e4548\") " Jan 22 13:49:56 crc kubenswrapper[4743]: I0122 13:49:56.850449 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/037eda14-3c3c-4b24-bb18-dea65e3e4548-proxy-ca-bundles\") pod \"037eda14-3c3c-4b24-bb18-dea65e3e4548\" (UID: \"037eda14-3c3c-4b24-bb18-dea65e3e4548\") " Jan 22 13:49:56 crc kubenswrapper[4743]: I0122 13:49:56.850531 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/037eda14-3c3c-4b24-bb18-dea65e3e4548-serving-cert\") pod \"037eda14-3c3c-4b24-bb18-dea65e3e4548\" (UID: \"037eda14-3c3c-4b24-bb18-dea65e3e4548\") " Jan 22 13:49:56 crc kubenswrapper[4743]: I0122 13:49:56.850561 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/037eda14-3c3c-4b24-bb18-dea65e3e4548-client-ca\") pod \"037eda14-3c3c-4b24-bb18-dea65e3e4548\" (UID: \"037eda14-3c3c-4b24-bb18-dea65e3e4548\") " Jan 22 13:49:56 crc kubenswrapper[4743]: I0122 13:49:56.851330 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/037eda14-3c3c-4b24-bb18-dea65e3e4548-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "037eda14-3c3c-4b24-bb18-dea65e3e4548" (UID: "037eda14-3c3c-4b24-bb18-dea65e3e4548"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:49:56 crc kubenswrapper[4743]: I0122 13:49:56.851344 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/037eda14-3c3c-4b24-bb18-dea65e3e4548-config" (OuterVolumeSpecName: "config") pod "037eda14-3c3c-4b24-bb18-dea65e3e4548" (UID: "037eda14-3c3c-4b24-bb18-dea65e3e4548"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:49:56 crc kubenswrapper[4743]: I0122 13:49:56.851378 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/037eda14-3c3c-4b24-bb18-dea65e3e4548-client-ca" (OuterVolumeSpecName: "client-ca") pod "037eda14-3c3c-4b24-bb18-dea65e3e4548" (UID: "037eda14-3c3c-4b24-bb18-dea65e3e4548"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:49:56 crc kubenswrapper[4743]: I0122 13:49:56.851724 4743 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/10cbe2ea-3b69-40a7-88a4-86e1f57dbab0-client-ca\") on node \"crc\" DevicePath \"\"" Jan 22 13:49:56 crc kubenswrapper[4743]: I0122 13:49:56.851761 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mgqlg\" (UniqueName: \"kubernetes.io/projected/10cbe2ea-3b69-40a7-88a4-86e1f57dbab0-kube-api-access-mgqlg\") on node \"crc\" DevicePath \"\"" Jan 22 13:49:56 crc kubenswrapper[4743]: I0122 13:49:56.851777 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10cbe2ea-3b69-40a7-88a4-86e1f57dbab0-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 13:49:56 crc kubenswrapper[4743]: I0122 13:49:56.851806 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10cbe2ea-3b69-40a7-88a4-86e1f57dbab0-config\") on node \"crc\" DevicePath \"\"" Jan 22 13:49:56 crc kubenswrapper[4743]: I0122 13:49:56.853528 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/037eda14-3c3c-4b24-bb18-dea65e3e4548-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "037eda14-3c3c-4b24-bb18-dea65e3e4548" (UID: "037eda14-3c3c-4b24-bb18-dea65e3e4548"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:49:56 crc kubenswrapper[4743]: I0122 13:49:56.853567 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/037eda14-3c3c-4b24-bb18-dea65e3e4548-kube-api-access-2n9ts" (OuterVolumeSpecName: "kube-api-access-2n9ts") pod "037eda14-3c3c-4b24-bb18-dea65e3e4548" (UID: "037eda14-3c3c-4b24-bb18-dea65e3e4548"). InnerVolumeSpecName "kube-api-access-2n9ts". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:49:56 crc kubenswrapper[4743]: I0122 13:49:56.952444 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8xs5\" (UniqueName: \"kubernetes.io/projected/cf9cdaec-7d8f-4958-bc45-bdd843e96b90-kube-api-access-t8xs5\") pod \"route-controller-manager-d794b4758-vkhv7\" (UID: \"cf9cdaec-7d8f-4958-bc45-bdd843e96b90\") " pod="openshift-route-controller-manager/route-controller-manager-d794b4758-vkhv7" Jan 22 13:49:56 crc kubenswrapper[4743]: I0122 13:49:56.953103 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf9cdaec-7d8f-4958-bc45-bdd843e96b90-config\") pod \"route-controller-manager-d794b4758-vkhv7\" (UID: \"cf9cdaec-7d8f-4958-bc45-bdd843e96b90\") " pod="openshift-route-controller-manager/route-controller-manager-d794b4758-vkhv7" Jan 22 13:49:56 crc kubenswrapper[4743]: I0122 13:49:56.953305 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf9cdaec-7d8f-4958-bc45-bdd843e96b90-serving-cert\") pod \"route-controller-manager-d794b4758-vkhv7\" (UID: \"cf9cdaec-7d8f-4958-bc45-bdd843e96b90\") " pod="openshift-route-controller-manager/route-controller-manager-d794b4758-vkhv7" Jan 22 13:49:56 crc kubenswrapper[4743]: I0122 13:49:56.953352 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cf9cdaec-7d8f-4958-bc45-bdd843e96b90-client-ca\") pod \"route-controller-manager-d794b4758-vkhv7\" (UID: \"cf9cdaec-7d8f-4958-bc45-bdd843e96b90\") " pod="openshift-route-controller-manager/route-controller-manager-d794b4758-vkhv7" Jan 22 13:49:56 crc kubenswrapper[4743]: I0122 13:49:56.953479 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/037eda14-3c3c-4b24-bb18-dea65e3e4548-config\") on node \"crc\" DevicePath \"\"" Jan 22 13:49:56 crc kubenswrapper[4743]: I0122 13:49:56.953503 4743 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/037eda14-3c3c-4b24-bb18-dea65e3e4548-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 22 13:49:56 crc kubenswrapper[4743]: I0122 13:49:56.953517 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/037eda14-3c3c-4b24-bb18-dea65e3e4548-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 13:49:56 crc kubenswrapper[4743]: I0122 13:49:56.953529 4743 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/037eda14-3c3c-4b24-bb18-dea65e3e4548-client-ca\") on node \"crc\" DevicePath \"\"" Jan 22 13:49:56 crc kubenswrapper[4743]: I0122 13:49:56.953540 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2n9ts\" (UniqueName: \"kubernetes.io/projected/037eda14-3c3c-4b24-bb18-dea65e3e4548-kube-api-access-2n9ts\") on node \"crc\" DevicePath \"\"" Jan 22 13:49:57 crc kubenswrapper[4743]: I0122 13:49:57.054691 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf9cdaec-7d8f-4958-bc45-bdd843e96b90-serving-cert\") pod \"route-controller-manager-d794b4758-vkhv7\" (UID: \"cf9cdaec-7d8f-4958-bc45-bdd843e96b90\") " 
pod="openshift-route-controller-manager/route-controller-manager-d794b4758-vkhv7" Jan 22 13:49:57 crc kubenswrapper[4743]: I0122 13:49:57.054742 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cf9cdaec-7d8f-4958-bc45-bdd843e96b90-client-ca\") pod \"route-controller-manager-d794b4758-vkhv7\" (UID: \"cf9cdaec-7d8f-4958-bc45-bdd843e96b90\") " pod="openshift-route-controller-manager/route-controller-manager-d794b4758-vkhv7" Jan 22 13:49:57 crc kubenswrapper[4743]: I0122 13:49:57.054818 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8xs5\" (UniqueName: \"kubernetes.io/projected/cf9cdaec-7d8f-4958-bc45-bdd843e96b90-kube-api-access-t8xs5\") pod \"route-controller-manager-d794b4758-vkhv7\" (UID: \"cf9cdaec-7d8f-4958-bc45-bdd843e96b90\") " pod="openshift-route-controller-manager/route-controller-manager-d794b4758-vkhv7" Jan 22 13:49:57 crc kubenswrapper[4743]: I0122 13:49:57.054864 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf9cdaec-7d8f-4958-bc45-bdd843e96b90-config\") pod \"route-controller-manager-d794b4758-vkhv7\" (UID: \"cf9cdaec-7d8f-4958-bc45-bdd843e96b90\") " pod="openshift-route-controller-manager/route-controller-manager-d794b4758-vkhv7" Jan 22 13:49:57 crc kubenswrapper[4743]: I0122 13:49:57.055937 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cf9cdaec-7d8f-4958-bc45-bdd843e96b90-client-ca\") pod \"route-controller-manager-d794b4758-vkhv7\" (UID: \"cf9cdaec-7d8f-4958-bc45-bdd843e96b90\") " pod="openshift-route-controller-manager/route-controller-manager-d794b4758-vkhv7" Jan 22 13:49:57 crc kubenswrapper[4743]: I0122 13:49:57.056212 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf9cdaec-7d8f-4958-bc45-bdd843e96b90-config\") pod \"route-controller-manager-d794b4758-vkhv7\" (UID: \"cf9cdaec-7d8f-4958-bc45-bdd843e96b90\") " pod="openshift-route-controller-manager/route-controller-manager-d794b4758-vkhv7" Jan 22 13:49:57 crc kubenswrapper[4743]: I0122 13:49:57.058430 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf9cdaec-7d8f-4958-bc45-bdd843e96b90-serving-cert\") pod \"route-controller-manager-d794b4758-vkhv7\" (UID: \"cf9cdaec-7d8f-4958-bc45-bdd843e96b90\") " pod="openshift-route-controller-manager/route-controller-manager-d794b4758-vkhv7" Jan 22 13:49:57 crc kubenswrapper[4743]: I0122 13:49:57.070772 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8xs5\" (UniqueName: \"kubernetes.io/projected/cf9cdaec-7d8f-4958-bc45-bdd843e96b90-kube-api-access-t8xs5\") pod \"route-controller-manager-d794b4758-vkhv7\" (UID: \"cf9cdaec-7d8f-4958-bc45-bdd843e96b90\") " pod="openshift-route-controller-manager/route-controller-manager-d794b4758-vkhv7" Jan 22 13:49:57 crc kubenswrapper[4743]: I0122 13:49:57.100838 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-d794b4758-vkhv7" Jan 22 13:49:57 crc kubenswrapper[4743]: I0122 13:49:57.195522 4743 generic.go:334] "Generic (PLEG): container finished" podID="037eda14-3c3c-4b24-bb18-dea65e3e4548" containerID="e50e7c409b08056d208437bdd67bec4628d52cfbcf321da69ff0fb59e3a48249" exitCode=0 Jan 22 13:49:57 crc kubenswrapper[4743]: I0122 13:49:57.195585 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-z5dt9" event={"ID":"037eda14-3c3c-4b24-bb18-dea65e3e4548","Type":"ContainerDied","Data":"e50e7c409b08056d208437bdd67bec4628d52cfbcf321da69ff0fb59e3a48249"} Jan 22 13:49:57 crc kubenswrapper[4743]: I0122 13:49:57.195617 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-z5dt9" event={"ID":"037eda14-3c3c-4b24-bb18-dea65e3e4548","Type":"ContainerDied","Data":"60308cb3fa1a115371e63b1a33bd016d486bfabbd2247edbdbd5a88523bbbbe9"} Jan 22 13:49:57 crc kubenswrapper[4743]: I0122 13:49:57.195640 4743 scope.go:117] "RemoveContainer" containerID="e50e7c409b08056d208437bdd67bec4628d52cfbcf321da69ff0fb59e3a48249" Jan 22 13:49:57 crc kubenswrapper[4743]: I0122 13:49:57.195743 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-z5dt9" Jan 22 13:49:57 crc kubenswrapper[4743]: I0122 13:49:57.199293 4743 generic.go:334] "Generic (PLEG): container finished" podID="10cbe2ea-3b69-40a7-88a4-86e1f57dbab0" containerID="a8dbbfa149d5f07c97e0ba126f85bc6196706f95d71fce59c363348c8b96fd21" exitCode=0 Jan 22 13:49:57 crc kubenswrapper[4743]: I0122 13:49:57.199337 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6j5s8" event={"ID":"10cbe2ea-3b69-40a7-88a4-86e1f57dbab0","Type":"ContainerDied","Data":"a8dbbfa149d5f07c97e0ba126f85bc6196706f95d71fce59c363348c8b96fd21"} Jan 22 13:49:57 crc kubenswrapper[4743]: I0122 13:49:57.199363 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6j5s8" event={"ID":"10cbe2ea-3b69-40a7-88a4-86e1f57dbab0","Type":"ContainerDied","Data":"d27f4c357a548776262ff4d3d0cf4c3a00e29cae40acdc5d4ead360ac77d1702"} Jan 22 13:49:57 crc kubenswrapper[4743]: I0122 13:49:57.199415 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6j5s8" Jan 22 13:49:57 crc kubenswrapper[4743]: I0122 13:49:57.226365 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-z5dt9"] Jan 22 13:49:57 crc kubenswrapper[4743]: I0122 13:49:57.237249 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-z5dt9"] Jan 22 13:49:57 crc kubenswrapper[4743]: I0122 13:49:57.242489 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6j5s8"] Jan 22 13:49:57 crc kubenswrapper[4743]: I0122 13:49:57.242621 4743 scope.go:117] "RemoveContainer" containerID="e50e7c409b08056d208437bdd67bec4628d52cfbcf321da69ff0fb59e3a48249" Jan 22 13:49:57 crc kubenswrapper[4743]: E0122 13:49:57.243382 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e50e7c409b08056d208437bdd67bec4628d52cfbcf321da69ff0fb59e3a48249\": container with ID starting with e50e7c409b08056d208437bdd67bec4628d52cfbcf321da69ff0fb59e3a48249 not found: ID does not exist" containerID="e50e7c409b08056d208437bdd67bec4628d52cfbcf321da69ff0fb59e3a48249" Jan 22 13:49:57 crc kubenswrapper[4743]: I0122 13:49:57.243416 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e50e7c409b08056d208437bdd67bec4628d52cfbcf321da69ff0fb59e3a48249"} err="failed to get container status \"e50e7c409b08056d208437bdd67bec4628d52cfbcf321da69ff0fb59e3a48249\": rpc error: code = NotFound desc = could not find container \"e50e7c409b08056d208437bdd67bec4628d52cfbcf321da69ff0fb59e3a48249\": container with ID starting with e50e7c409b08056d208437bdd67bec4628d52cfbcf321da69ff0fb59e3a48249 not found: ID does not exist" Jan 22 13:49:57 crc kubenswrapper[4743]: I0122 13:49:57.243437 4743 scope.go:117] "RemoveContainer" containerID="a8dbbfa149d5f07c97e0ba126f85bc6196706f95d71fce59c363348c8b96fd21" Jan 22 13:49:57 crc kubenswrapper[4743]: I0122 13:49:57.247710 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6j5s8"] Jan 22 13:49:57 crc kubenswrapper[4743]: I0122 13:49:57.258826 4743 scope.go:117] "RemoveContainer" containerID="a8dbbfa149d5f07c97e0ba126f85bc6196706f95d71fce59c363348c8b96fd21" Jan 22 13:49:57 crc kubenswrapper[4743]: E0122 13:49:57.259465 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8dbbfa149d5f07c97e0ba126f85bc6196706f95d71fce59c363348c8b96fd21\": container with ID starting with a8dbbfa149d5f07c97e0ba126f85bc6196706f95d71fce59c363348c8b96fd21 not found: ID does not exist" containerID="a8dbbfa149d5f07c97e0ba126f85bc6196706f95d71fce59c363348c8b96fd21" Jan 22 13:49:57 crc kubenswrapper[4743]: I0122 13:49:57.259505 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8dbbfa149d5f07c97e0ba126f85bc6196706f95d71fce59c363348c8b96fd21"} err="failed to get container status \"a8dbbfa149d5f07c97e0ba126f85bc6196706f95d71fce59c363348c8b96fd21\": rpc error: code = NotFound desc = could not find container \"a8dbbfa149d5f07c97e0ba126f85bc6196706f95d71fce59c363348c8b96fd21\": container with ID starting with a8dbbfa149d5f07c97e0ba126f85bc6196706f95d71fce59c363348c8b96fd21 not found: ID does not exist" Jan 22 
13:49:57 crc kubenswrapper[4743]: I0122 13:49:57.297673 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d794b4758-vkhv7"] Jan 22 13:49:57 crc kubenswrapper[4743]: I0122 13:49:57.753221 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="037eda14-3c3c-4b24-bb18-dea65e3e4548" path="/var/lib/kubelet/pods/037eda14-3c3c-4b24-bb18-dea65e3e4548/volumes" Jan 22 13:49:57 crc kubenswrapper[4743]: I0122 13:49:57.753849 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10cbe2ea-3b69-40a7-88a4-86e1f57dbab0" path="/var/lib/kubelet/pods/10cbe2ea-3b69-40a7-88a4-86e1f57dbab0/volumes" Jan 22 13:49:58 crc kubenswrapper[4743]: I0122 13:49:58.205025 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-d794b4758-vkhv7" event={"ID":"cf9cdaec-7d8f-4958-bc45-bdd843e96b90","Type":"ContainerStarted","Data":"3054f2f8a3740ff06bc7194c48bcf1c1c3c2a9988c678acd923a008ac1680463"} Jan 22 13:49:58 crc kubenswrapper[4743]: I0122 13:49:58.205059 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-d794b4758-vkhv7" event={"ID":"cf9cdaec-7d8f-4958-bc45-bdd843e96b90","Type":"ContainerStarted","Data":"f98c446ab971fbc5193c9dc4e70d8e7f0e000694b89f408f6aeecf8e15e55ed1"} Jan 22 13:49:58 crc kubenswrapper[4743]: I0122 13:49:58.205220 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-d794b4758-vkhv7" Jan 22 13:49:58 crc kubenswrapper[4743]: I0122 13:49:58.209666 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-d794b4758-vkhv7" Jan 22 13:49:58 crc kubenswrapper[4743]: I0122 13:49:58.224669 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-d794b4758-vkhv7" podStartSLOduration=2.224652032 podStartE2EDuration="2.224652032s" podCreationTimestamp="2026-01-22 13:49:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:49:58.220626001 +0000 UTC m=+234.775669164" watchObservedRunningTime="2026-01-22 13:49:58.224652032 +0000 UTC m=+234.779695195" Jan 22 13:49:58 crc kubenswrapper[4743]: I0122 13:49:58.501445 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-dfd546d69-mj272"] Jan 22 13:49:58 crc kubenswrapper[4743]: I0122 13:49:58.502509 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-dfd546d69-mj272" Jan 22 13:49:58 crc kubenswrapper[4743]: I0122 13:49:58.505200 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 22 13:49:58 crc kubenswrapper[4743]: I0122 13:49:58.505415 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 22 13:49:58 crc kubenswrapper[4743]: I0122 13:49:58.505651 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 22 13:49:58 crc kubenswrapper[4743]: I0122 13:49:58.505656 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 22 13:49:58 crc kubenswrapper[4743]: I0122 13:49:58.505754 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 22 13:49:58 crc kubenswrapper[4743]: I0122 13:49:58.505815 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 22 13:49:58 crc kubenswrapper[4743]: I0122 13:49:58.513421 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-dfd546d69-mj272"] Jan 22 13:49:58 crc kubenswrapper[4743]: I0122 13:49:58.514152 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 22 13:49:58 crc kubenswrapper[4743]: I0122 13:49:58.673063 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2k6hj\" (UniqueName: \"kubernetes.io/projected/e2055418-5648-45f2-9b7a-90977d621c4f-kube-api-access-2k6hj\") pod \"controller-manager-dfd546d69-mj272\" (UID: \"e2055418-5648-45f2-9b7a-90977d621c4f\") " pod="openshift-controller-manager/controller-manager-dfd546d69-mj272" Jan 22 13:49:58 crc kubenswrapper[4743]: I0122 13:49:58.673127 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e2055418-5648-45f2-9b7a-90977d621c4f-serving-cert\") pod \"controller-manager-dfd546d69-mj272\" (UID: \"e2055418-5648-45f2-9b7a-90977d621c4f\") " pod="openshift-controller-manager/controller-manager-dfd546d69-mj272" Jan 22 13:49:58 crc kubenswrapper[4743]: I0122 13:49:58.673192 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2055418-5648-45f2-9b7a-90977d621c4f-config\") pod \"controller-manager-dfd546d69-mj272\" (UID: \"e2055418-5648-45f2-9b7a-90977d621c4f\") " pod="openshift-controller-manager/controller-manager-dfd546d69-mj272" Jan 22 13:49:58 crc kubenswrapper[4743]: I0122 13:49:58.673217 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e2055418-5648-45f2-9b7a-90977d621c4f-client-ca\") pod \"controller-manager-dfd546d69-mj272\" (UID: \"e2055418-5648-45f2-9b7a-90977d621c4f\") " pod="openshift-controller-manager/controller-manager-dfd546d69-mj272" Jan 22 13:49:58 crc kubenswrapper[4743]: I0122 13:49:58.673382 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/e2055418-5648-45f2-9b7a-90977d621c4f-proxy-ca-bundles\") pod \"controller-manager-dfd546d69-mj272\" (UID: \"e2055418-5648-45f2-9b7a-90977d621c4f\") " pod="openshift-controller-manager/controller-manager-dfd546d69-mj272" Jan 22 13:49:58 crc kubenswrapper[4743]: I0122 13:49:58.774381 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e2055418-5648-45f2-9b7a-90977d621c4f-proxy-ca-bundles\") pod \"controller-manager-dfd546d69-mj272\" (UID: \"e2055418-5648-45f2-9b7a-90977d621c4f\") " pod="openshift-controller-manager/controller-manager-dfd546d69-mj272" Jan 22 13:49:58 crc kubenswrapper[4743]: I0122 13:49:58.774441 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2k6hj\" (UniqueName: \"kubernetes.io/projected/e2055418-5648-45f2-9b7a-90977d621c4f-kube-api-access-2k6hj\") pod \"controller-manager-dfd546d69-mj272\" (UID: \"e2055418-5648-45f2-9b7a-90977d621c4f\") " pod="openshift-controller-manager/controller-manager-dfd546d69-mj272" Jan 22 13:49:58 crc kubenswrapper[4743]: I0122 13:49:58.774879 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e2055418-5648-45f2-9b7a-90977d621c4f-serving-cert\") pod \"controller-manager-dfd546d69-mj272\" (UID: \"e2055418-5648-45f2-9b7a-90977d621c4f\") " pod="openshift-controller-manager/controller-manager-dfd546d69-mj272" Jan 22 13:49:58 crc kubenswrapper[4743]: I0122 13:49:58.775017 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2055418-5648-45f2-9b7a-90977d621c4f-config\") pod \"controller-manager-dfd546d69-mj272\" (UID: \"e2055418-5648-45f2-9b7a-90977d621c4f\") " pod="openshift-controller-manager/controller-manager-dfd546d69-mj272" Jan 22 13:49:58 crc kubenswrapper[4743]: I0122 13:49:58.775047 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e2055418-5648-45f2-9b7a-90977d621c4f-client-ca\") pod \"controller-manager-dfd546d69-mj272\" (UID: \"e2055418-5648-45f2-9b7a-90977d621c4f\") " pod="openshift-controller-manager/controller-manager-dfd546d69-mj272" Jan 22 13:49:58 crc kubenswrapper[4743]: I0122 13:49:58.776310 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e2055418-5648-45f2-9b7a-90977d621c4f-client-ca\") pod \"controller-manager-dfd546d69-mj272\" (UID: \"e2055418-5648-45f2-9b7a-90977d621c4f\") " pod="openshift-controller-manager/controller-manager-dfd546d69-mj272" Jan 22 13:49:58 crc kubenswrapper[4743]: I0122 13:49:58.776843 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2055418-5648-45f2-9b7a-90977d621c4f-config\") pod \"controller-manager-dfd546d69-mj272\" (UID: \"e2055418-5648-45f2-9b7a-90977d621c4f\") " pod="openshift-controller-manager/controller-manager-dfd546d69-mj272" Jan 22 13:49:58 crc kubenswrapper[4743]: I0122 13:49:58.777110 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e2055418-5648-45f2-9b7a-90977d621c4f-proxy-ca-bundles\") pod \"controller-manager-dfd546d69-mj272\" (UID: \"e2055418-5648-45f2-9b7a-90977d621c4f\") " pod="openshift-controller-manager/controller-manager-dfd546d69-mj272" Jan 22 
13:49:58 crc kubenswrapper[4743]: I0122 13:49:58.780143 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e2055418-5648-45f2-9b7a-90977d621c4f-serving-cert\") pod \"controller-manager-dfd546d69-mj272\" (UID: \"e2055418-5648-45f2-9b7a-90977d621c4f\") " pod="openshift-controller-manager/controller-manager-dfd546d69-mj272" Jan 22 13:49:58 crc kubenswrapper[4743]: I0122 13:49:58.790322 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2k6hj\" (UniqueName: \"kubernetes.io/projected/e2055418-5648-45f2-9b7a-90977d621c4f-kube-api-access-2k6hj\") pod \"controller-manager-dfd546d69-mj272\" (UID: \"e2055418-5648-45f2-9b7a-90977d621c4f\") " pod="openshift-controller-manager/controller-manager-dfd546d69-mj272" Jan 22 13:49:58 crc kubenswrapper[4743]: I0122 13:49:58.820203 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-dfd546d69-mj272" Jan 22 13:49:58 crc kubenswrapper[4743]: I0122 13:49:58.989003 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-dfd546d69-mj272"] Jan 22 13:49:58 crc kubenswrapper[4743]: W0122 13:49:58.992228 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2055418_5648_45f2_9b7a_90977d621c4f.slice/crio-583478344b0e263ce4b7c3336634c379eee28f62f7a108e3c98509dadea90bf0 WatchSource:0}: Error finding container 583478344b0e263ce4b7c3336634c379eee28f62f7a108e3c98509dadea90bf0: Status 404 returned error can't find the container with id 583478344b0e263ce4b7c3336634c379eee28f62f7a108e3c98509dadea90bf0 Jan 22 13:49:59 crc kubenswrapper[4743]: I0122 13:49:59.211538 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-dfd546d69-mj272" event={"ID":"e2055418-5648-45f2-9b7a-90977d621c4f","Type":"ContainerStarted","Data":"0fe89211efa2a4dbd1dae01aa0a51f11b257776e628f56e96d6bc8722aae5b0f"} Jan 22 13:49:59 crc kubenswrapper[4743]: I0122 13:49:59.211846 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-dfd546d69-mj272" event={"ID":"e2055418-5648-45f2-9b7a-90977d621c4f","Type":"ContainerStarted","Data":"583478344b0e263ce4b7c3336634c379eee28f62f7a108e3c98509dadea90bf0"} Jan 22 13:49:59 crc kubenswrapper[4743]: I0122 13:49:59.211982 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-dfd546d69-mj272" Jan 22 13:49:59 crc kubenswrapper[4743]: I0122 13:49:59.215907 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-dfd546d69-mj272" Jan 22 13:49:59 crc kubenswrapper[4743]: I0122 13:49:59.225539 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-dfd546d69-mj272" podStartSLOduration=3.225524621 podStartE2EDuration="3.225524621s" podCreationTimestamp="2026-01-22 13:49:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:49:59.225007337 +0000 UTC m=+235.780050510" watchObservedRunningTime="2026-01-22 13:49:59.225524621 +0000 UTC m=+235.780567774" Jan 22 13:50:04 crc kubenswrapper[4743]: I0122 13:50:04.289865 4743 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/certified-operators-wv724"] Jan 22 13:50:04 crc kubenswrapper[4743]: I0122 13:50:04.294159 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bpk8r"] Jan 22 13:50:04 crc kubenswrapper[4743]: I0122 13:50:04.294660 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bpk8r" podUID="a660baca-6ead-4a0f-959b-24b3badc4a7c" containerName="registry-server" containerID="cri-o://e23689989ec3a71246d088d20d3b38708cf5958859ee7c5db105b7eb708456ae" gracePeriod=30 Jan 22 13:50:04 crc kubenswrapper[4743]: I0122 13:50:04.295042 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wv724" podUID="a76ec049-d99a-40be-9fec-f76370769aea" containerName="registry-server" containerID="cri-o://bfa90da4be9ea93116d5370d369de791a849b41aa07876f53706ba7b7b6104d3" gracePeriod=30 Jan 22 13:50:04 crc kubenswrapper[4743]: I0122 13:50:04.303119 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-fjjf5"] Jan 22 13:50:04 crc kubenswrapper[4743]: I0122 13:50:04.303362 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-fjjf5" podUID="6f60519c-a85e-483e-ac46-8cde2dbbd166" containerName="marketplace-operator" containerID="cri-o://448aaf6d03c3a796a48f3e69b960acd11e9a7c7a814dcefbc4d34fe480f3a41c" gracePeriod=30 Jan 22 13:50:04 crc kubenswrapper[4743]: I0122 13:50:04.318846 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m8dtj"] Jan 22 13:50:04 crc kubenswrapper[4743]: I0122 13:50:04.319416 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-m8dtj" podUID="9b61be23-4db1-4316-a840-1aaff04b664e" containerName="registry-server" containerID="cri-o://b37aeab634388c837bad5b241de7fe44a78ec8b7355b3b3329f1ffe2961e9197" gracePeriod=30 Jan 22 13:50:04 crc kubenswrapper[4743]: I0122 13:50:04.323490 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6sdp5"] Jan 22 13:50:04 crc kubenswrapper[4743]: I0122 13:50:04.324700 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6sdp5" Jan 22 13:50:04 crc kubenswrapper[4743]: I0122 13:50:04.327428 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fqtsl"] Jan 22 13:50:04 crc kubenswrapper[4743]: I0122 13:50:04.327649 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fqtsl" podUID="c0e6768a-f15d-4daf-9e12-950e7d3b9552" containerName="registry-server" containerID="cri-o://79ef234f4083f3ab861db4363f2a1c3087be779c063eae70cc3bfd40331bbdf7" gracePeriod=30 Jan 22 13:50:04 crc kubenswrapper[4743]: I0122 13:50:04.333963 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6sdp5"] Jan 22 13:50:04 crc kubenswrapper[4743]: I0122 13:50:04.344594 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ac4d223b-b4ca-485a-aa22-1fbdb0a3228e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-6sdp5\" (UID: \"ac4d223b-b4ca-485a-aa22-1fbdb0a3228e\") " pod="openshift-marketplace/marketplace-operator-79b997595-6sdp5" Jan 22 13:50:04 crc kubenswrapper[4743]: I0122 13:50:04.344777 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ac4d223b-b4ca-485a-aa22-1fbdb0a3228e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-6sdp5\" (UID: \"ac4d223b-b4ca-485a-aa22-1fbdb0a3228e\") " pod="openshift-marketplace/marketplace-operator-79b997595-6sdp5" Jan 22 13:50:04 crc kubenswrapper[4743]: I0122 13:50:04.344931 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljpmg\" (UniqueName: \"kubernetes.io/projected/ac4d223b-b4ca-485a-aa22-1fbdb0a3228e-kube-api-access-ljpmg\") pod \"marketplace-operator-79b997595-6sdp5\" (UID: \"ac4d223b-b4ca-485a-aa22-1fbdb0a3228e\") " pod="openshift-marketplace/marketplace-operator-79b997595-6sdp5" Jan 22 13:50:04 crc kubenswrapper[4743]: I0122 13:50:04.447102 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ac4d223b-b4ca-485a-aa22-1fbdb0a3228e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-6sdp5\" (UID: \"ac4d223b-b4ca-485a-aa22-1fbdb0a3228e\") " pod="openshift-marketplace/marketplace-operator-79b997595-6sdp5" Jan 22 13:50:04 crc kubenswrapper[4743]: I0122 13:50:04.447206 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ac4d223b-b4ca-485a-aa22-1fbdb0a3228e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-6sdp5\" (UID: \"ac4d223b-b4ca-485a-aa22-1fbdb0a3228e\") " pod="openshift-marketplace/marketplace-operator-79b997595-6sdp5" Jan 22 13:50:04 crc kubenswrapper[4743]: I0122 13:50:04.447234 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljpmg\" (UniqueName: \"kubernetes.io/projected/ac4d223b-b4ca-485a-aa22-1fbdb0a3228e-kube-api-access-ljpmg\") pod \"marketplace-operator-79b997595-6sdp5\" (UID: \"ac4d223b-b4ca-485a-aa22-1fbdb0a3228e\") " pod="openshift-marketplace/marketplace-operator-79b997595-6sdp5" Jan 22 13:50:04 crc kubenswrapper[4743]: I0122 13:50:04.448751 4743 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ac4d223b-b4ca-485a-aa22-1fbdb0a3228e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-6sdp5\" (UID: \"ac4d223b-b4ca-485a-aa22-1fbdb0a3228e\") " pod="openshift-marketplace/marketplace-operator-79b997595-6sdp5" Jan 22 13:50:04 crc kubenswrapper[4743]: I0122 13:50:04.465224 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ac4d223b-b4ca-485a-aa22-1fbdb0a3228e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-6sdp5\" (UID: \"ac4d223b-b4ca-485a-aa22-1fbdb0a3228e\") " pod="openshift-marketplace/marketplace-operator-79b997595-6sdp5" Jan 22 13:50:04 crc kubenswrapper[4743]: I0122 13:50:04.467898 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljpmg\" (UniqueName: \"kubernetes.io/projected/ac4d223b-b4ca-485a-aa22-1fbdb0a3228e-kube-api-access-ljpmg\") pod \"marketplace-operator-79b997595-6sdp5\" (UID: \"ac4d223b-b4ca-485a-aa22-1fbdb0a3228e\") " pod="openshift-marketplace/marketplace-operator-79b997595-6sdp5" Jan 22 13:50:04 crc kubenswrapper[4743]: I0122 13:50:04.642237 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6sdp5" Jan 22 13:50:04 crc kubenswrapper[4743]: I0122 13:50:04.806549 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-fjjf5" Jan 22 13:50:04 crc kubenswrapper[4743]: I0122 13:50:04.855437 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngkjj\" (UniqueName: \"kubernetes.io/projected/6f60519c-a85e-483e-ac46-8cde2dbbd166-kube-api-access-ngkjj\") pod \"6f60519c-a85e-483e-ac46-8cde2dbbd166\" (UID: \"6f60519c-a85e-483e-ac46-8cde2dbbd166\") " Jan 22 13:50:04 crc kubenswrapper[4743]: I0122 13:50:04.855490 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6f60519c-a85e-483e-ac46-8cde2dbbd166-marketplace-trusted-ca\") pod \"6f60519c-a85e-483e-ac46-8cde2dbbd166\" (UID: \"6f60519c-a85e-483e-ac46-8cde2dbbd166\") " Jan 22 13:50:04 crc kubenswrapper[4743]: I0122 13:50:04.855519 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6f60519c-a85e-483e-ac46-8cde2dbbd166-marketplace-operator-metrics\") pod \"6f60519c-a85e-483e-ac46-8cde2dbbd166\" (UID: \"6f60519c-a85e-483e-ac46-8cde2dbbd166\") " Jan 22 13:50:04 crc kubenswrapper[4743]: I0122 13:50:04.856949 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f60519c-a85e-483e-ac46-8cde2dbbd166-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "6f60519c-a85e-483e-ac46-8cde2dbbd166" (UID: "6f60519c-a85e-483e-ac46-8cde2dbbd166"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:50:04 crc kubenswrapper[4743]: I0122 13:50:04.865771 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f60519c-a85e-483e-ac46-8cde2dbbd166-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "6f60519c-a85e-483e-ac46-8cde2dbbd166" (UID: "6f60519c-a85e-483e-ac46-8cde2dbbd166"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:50:04 crc kubenswrapper[4743]: I0122 13:50:04.868771 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f60519c-a85e-483e-ac46-8cde2dbbd166-kube-api-access-ngkjj" (OuterVolumeSpecName: "kube-api-access-ngkjj") pod "6f60519c-a85e-483e-ac46-8cde2dbbd166" (UID: "6f60519c-a85e-483e-ac46-8cde2dbbd166"). InnerVolumeSpecName "kube-api-access-ngkjj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:50:04 crc kubenswrapper[4743]: I0122 13:50:04.885891 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bpk8r" Jan 22 13:50:04 crc kubenswrapper[4743]: I0122 13:50:04.900740 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m8dtj" Jan 22 13:50:04 crc kubenswrapper[4743]: I0122 13:50:04.932402 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fqtsl" Jan 22 13:50:04 crc kubenswrapper[4743]: I0122 13:50:04.938772 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wv724" Jan 22 13:50:04 crc kubenswrapper[4743]: I0122 13:50:04.956725 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a660baca-6ead-4a0f-959b-24b3badc4a7c-catalog-content\") pod \"a660baca-6ead-4a0f-959b-24b3badc4a7c\" (UID: \"a660baca-6ead-4a0f-959b-24b3badc4a7c\") " Jan 22 13:50:04 crc kubenswrapper[4743]: I0122 13:50:04.956806 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a76ec049-d99a-40be-9fec-f76370769aea-catalog-content\") pod \"a76ec049-d99a-40be-9fec-f76370769aea\" (UID: \"a76ec049-d99a-40be-9fec-f76370769aea\") " Jan 22 13:50:04 crc kubenswrapper[4743]: I0122 13:50:04.956835 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a660baca-6ead-4a0f-959b-24b3badc4a7c-utilities\") pod \"a660baca-6ead-4a0f-959b-24b3badc4a7c\" (UID: \"a660baca-6ead-4a0f-959b-24b3badc4a7c\") " Jan 22 13:50:04 crc kubenswrapper[4743]: I0122 13:50:04.956858 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kknd6\" (UniqueName: \"kubernetes.io/projected/a76ec049-d99a-40be-9fec-f76370769aea-kube-api-access-kknd6\") pod \"a76ec049-d99a-40be-9fec-f76370769aea\" (UID: \"a76ec049-d99a-40be-9fec-f76370769aea\") " Jan 22 13:50:04 crc kubenswrapper[4743]: I0122 13:50:04.956884 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zvzf\" (UniqueName: \"kubernetes.io/projected/c0e6768a-f15d-4daf-9e12-950e7d3b9552-kube-api-access-2zvzf\") pod \"c0e6768a-f15d-4daf-9e12-950e7d3b9552\" (UID: 
\"c0e6768a-f15d-4daf-9e12-950e7d3b9552\") " Jan 22 13:50:04 crc kubenswrapper[4743]: I0122 13:50:04.956900 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a76ec049-d99a-40be-9fec-f76370769aea-utilities\") pod \"a76ec049-d99a-40be-9fec-f76370769aea\" (UID: \"a76ec049-d99a-40be-9fec-f76370769aea\") " Jan 22 13:50:04 crc kubenswrapper[4743]: I0122 13:50:04.957001 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0e6768a-f15d-4daf-9e12-950e7d3b9552-utilities\") pod \"c0e6768a-f15d-4daf-9e12-950e7d3b9552\" (UID: \"c0e6768a-f15d-4daf-9e12-950e7d3b9552\") " Jan 22 13:50:04 crc kubenswrapper[4743]: I0122 13:50:04.957024 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0e6768a-f15d-4daf-9e12-950e7d3b9552-catalog-content\") pod \"c0e6768a-f15d-4daf-9e12-950e7d3b9552\" (UID: \"c0e6768a-f15d-4daf-9e12-950e7d3b9552\") " Jan 22 13:50:04 crc kubenswrapper[4743]: I0122 13:50:04.957045 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b61be23-4db1-4316-a840-1aaff04b664e-utilities\") pod \"9b61be23-4db1-4316-a840-1aaff04b664e\" (UID: \"9b61be23-4db1-4316-a840-1aaff04b664e\") " Jan 22 13:50:04 crc kubenswrapper[4743]: I0122 13:50:04.957062 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xzwx\" (UniqueName: \"kubernetes.io/projected/9b61be23-4db1-4316-a840-1aaff04b664e-kube-api-access-6xzwx\") pod \"9b61be23-4db1-4316-a840-1aaff04b664e\" (UID: \"9b61be23-4db1-4316-a840-1aaff04b664e\") " Jan 22 13:50:04 crc kubenswrapper[4743]: I0122 13:50:04.957086 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b61be23-4db1-4316-a840-1aaff04b664e-catalog-content\") pod \"9b61be23-4db1-4316-a840-1aaff04b664e\" (UID: \"9b61be23-4db1-4316-a840-1aaff04b664e\") " Jan 22 13:50:04 crc kubenswrapper[4743]: I0122 13:50:04.957130 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x92k2\" (UniqueName: \"kubernetes.io/projected/a660baca-6ead-4a0f-959b-24b3badc4a7c-kube-api-access-x92k2\") pod \"a660baca-6ead-4a0f-959b-24b3badc4a7c\" (UID: \"a660baca-6ead-4a0f-959b-24b3badc4a7c\") " Jan 22 13:50:04 crc kubenswrapper[4743]: I0122 13:50:04.957246 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngkjj\" (UniqueName: \"kubernetes.io/projected/6f60519c-a85e-483e-ac46-8cde2dbbd166-kube-api-access-ngkjj\") on node \"crc\" DevicePath \"\"" Jan 22 13:50:04 crc kubenswrapper[4743]: I0122 13:50:04.957259 4743 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6f60519c-a85e-483e-ac46-8cde2dbbd166-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 22 13:50:04 crc kubenswrapper[4743]: I0122 13:50:04.957268 4743 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6f60519c-a85e-483e-ac46-8cde2dbbd166-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 22 13:50:04 crc kubenswrapper[4743]: I0122 13:50:04.957528 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/a660baca-6ead-4a0f-959b-24b3badc4a7c-utilities" (OuterVolumeSpecName: "utilities") pod "a660baca-6ead-4a0f-959b-24b3badc4a7c" (UID: "a660baca-6ead-4a0f-959b-24b3badc4a7c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 13:50:04 crc kubenswrapper[4743]: I0122 13:50:04.958308 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b61be23-4db1-4316-a840-1aaff04b664e-utilities" (OuterVolumeSpecName: "utilities") pod "9b61be23-4db1-4316-a840-1aaff04b664e" (UID: "9b61be23-4db1-4316-a840-1aaff04b664e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 13:50:04 crc kubenswrapper[4743]: I0122 13:50:04.959644 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0e6768a-f15d-4daf-9e12-950e7d3b9552-utilities" (OuterVolumeSpecName: "utilities") pod "c0e6768a-f15d-4daf-9e12-950e7d3b9552" (UID: "c0e6768a-f15d-4daf-9e12-950e7d3b9552"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 13:50:04 crc kubenswrapper[4743]: I0122 13:50:04.959817 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a76ec049-d99a-40be-9fec-f76370769aea-utilities" (OuterVolumeSpecName: "utilities") pod "a76ec049-d99a-40be-9fec-f76370769aea" (UID: "a76ec049-d99a-40be-9fec-f76370769aea"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 13:50:04 crc kubenswrapper[4743]: I0122 13:50:04.967331 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a76ec049-d99a-40be-9fec-f76370769aea-kube-api-access-kknd6" (OuterVolumeSpecName: "kube-api-access-kknd6") pod "a76ec049-d99a-40be-9fec-f76370769aea" (UID: "a76ec049-d99a-40be-9fec-f76370769aea"). InnerVolumeSpecName "kube-api-access-kknd6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:50:04 crc kubenswrapper[4743]: I0122 13:50:04.968204 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a660baca-6ead-4a0f-959b-24b3badc4a7c-kube-api-access-x92k2" (OuterVolumeSpecName: "kube-api-access-x92k2") pod "a660baca-6ead-4a0f-959b-24b3badc4a7c" (UID: "a660baca-6ead-4a0f-959b-24b3badc4a7c"). InnerVolumeSpecName "kube-api-access-x92k2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:50:04 crc kubenswrapper[4743]: I0122 13:50:04.971018 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b61be23-4db1-4316-a840-1aaff04b664e-kube-api-access-6xzwx" (OuterVolumeSpecName: "kube-api-access-6xzwx") pod "9b61be23-4db1-4316-a840-1aaff04b664e" (UID: "9b61be23-4db1-4316-a840-1aaff04b664e"). InnerVolumeSpecName "kube-api-access-6xzwx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:50:04 crc kubenswrapper[4743]: I0122 13:50:04.975142 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0e6768a-f15d-4daf-9e12-950e7d3b9552-kube-api-access-2zvzf" (OuterVolumeSpecName: "kube-api-access-2zvzf") pod "c0e6768a-f15d-4daf-9e12-950e7d3b9552" (UID: "c0e6768a-f15d-4daf-9e12-950e7d3b9552"). InnerVolumeSpecName "kube-api-access-2zvzf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.015099 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b61be23-4db1-4316-a840-1aaff04b664e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9b61be23-4db1-4316-a840-1aaff04b664e" (UID: "9b61be23-4db1-4316-a840-1aaff04b664e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.048964 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a76ec049-d99a-40be-9fec-f76370769aea-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a76ec049-d99a-40be-9fec-f76370769aea" (UID: "a76ec049-d99a-40be-9fec-f76370769aea"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.059319 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x92k2\" (UniqueName: \"kubernetes.io/projected/a660baca-6ead-4a0f-959b-24b3badc4a7c-kube-api-access-x92k2\") on node \"crc\" DevicePath \"\"" Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.059353 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a76ec049-d99a-40be-9fec-f76370769aea-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.059362 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a660baca-6ead-4a0f-959b-24b3badc4a7c-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.059372 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kknd6\" (UniqueName: \"kubernetes.io/projected/a76ec049-d99a-40be-9fec-f76370769aea-kube-api-access-kknd6\") on node \"crc\" DevicePath \"\"" Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.059380 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2zvzf\" (UniqueName: \"kubernetes.io/projected/c0e6768a-f15d-4daf-9e12-950e7d3b9552-kube-api-access-2zvzf\") on node \"crc\" DevicePath \"\"" Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.059390 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a76ec049-d99a-40be-9fec-f76370769aea-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.059399 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0e6768a-f15d-4daf-9e12-950e7d3b9552-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.059407 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b61be23-4db1-4316-a840-1aaff04b664e-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.059415 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xzwx\" (UniqueName: \"kubernetes.io/projected/9b61be23-4db1-4316-a840-1aaff04b664e-kube-api-access-6xzwx\") on node \"crc\" DevicePath \"\"" Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.059423 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/9b61be23-4db1-4316-a840-1aaff04b664e-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.070979 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a660baca-6ead-4a0f-959b-24b3badc4a7c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a660baca-6ead-4a0f-959b-24b3badc4a7c" (UID: "a660baca-6ead-4a0f-959b-24b3badc4a7c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.121621 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0e6768a-f15d-4daf-9e12-950e7d3b9552-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c0e6768a-f15d-4daf-9e12-950e7d3b9552" (UID: "c0e6768a-f15d-4daf-9e12-950e7d3b9552"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.160250 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0e6768a-f15d-4daf-9e12-950e7d3b9552-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.160513 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a660baca-6ead-4a0f-959b-24b3badc4a7c-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.225802 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6sdp5"] Jan 22 13:50:05 crc kubenswrapper[4743]: W0122 13:50:05.229835 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac4d223b_b4ca_485a_aa22_1fbdb0a3228e.slice/crio-2b40a3835c2ab26ec3496285fee6f7c7de83eee655a4ba0bbc4b61e23bc65157 WatchSource:0}: Error finding container 2b40a3835c2ab26ec3496285fee6f7c7de83eee655a4ba0bbc4b61e23bc65157: Status 404 returned error can't find the container with id 2b40a3835c2ab26ec3496285fee6f7c7de83eee655a4ba0bbc4b61e23bc65157 Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.245509 4743 generic.go:334] "Generic (PLEG): container finished" podID="a660baca-6ead-4a0f-959b-24b3badc4a7c" containerID="e23689989ec3a71246d088d20d3b38708cf5958859ee7c5db105b7eb708456ae" exitCode=0 Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.245703 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bpk8r" Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.245688 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bpk8r" event={"ID":"a660baca-6ead-4a0f-959b-24b3badc4a7c","Type":"ContainerDied","Data":"e23689989ec3a71246d088d20d3b38708cf5958859ee7c5db105b7eb708456ae"} Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.246089 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bpk8r" event={"ID":"a660baca-6ead-4a0f-959b-24b3badc4a7c","Type":"ContainerDied","Data":"a56a2cd6a0de824ee5728ab9e57416e1650607ebc5ac41bd345c491957ad1273"} Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.246128 4743 scope.go:117] "RemoveContainer" containerID="e23689989ec3a71246d088d20d3b38708cf5958859ee7c5db105b7eb708456ae" Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.250212 4743 generic.go:334] "Generic (PLEG): container finished" podID="6f60519c-a85e-483e-ac46-8cde2dbbd166" containerID="448aaf6d03c3a796a48f3e69b960acd11e9a7c7a814dcefbc4d34fe480f3a41c" exitCode=0 Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.250268 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-fjjf5" Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.250313 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-fjjf5" event={"ID":"6f60519c-a85e-483e-ac46-8cde2dbbd166","Type":"ContainerDied","Data":"448aaf6d03c3a796a48f3e69b960acd11e9a7c7a814dcefbc4d34fe480f3a41c"} Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.250382 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-fjjf5" event={"ID":"6f60519c-a85e-483e-ac46-8cde2dbbd166","Type":"ContainerDied","Data":"f4959a17324562f8b08ccb0d6a28c333701263083efea53c73ae3c1fb1034b13"} Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.254119 4743 generic.go:334] "Generic (PLEG): container finished" podID="c0e6768a-f15d-4daf-9e12-950e7d3b9552" containerID="79ef234f4083f3ab861db4363f2a1c3087be779c063eae70cc3bfd40331bbdf7" exitCode=0 Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.254157 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fqtsl" Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.254169 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fqtsl" event={"ID":"c0e6768a-f15d-4daf-9e12-950e7d3b9552","Type":"ContainerDied","Data":"79ef234f4083f3ab861db4363f2a1c3087be779c063eae70cc3bfd40331bbdf7"} Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.254946 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fqtsl" event={"ID":"c0e6768a-f15d-4daf-9e12-950e7d3b9552","Type":"ContainerDied","Data":"e47982e7a73e6cd216e87f7bb30bb0d5ebb62086988813f245c3c09be4f625bb"} Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.260518 4743 generic.go:334] "Generic (PLEG): container finished" podID="a76ec049-d99a-40be-9fec-f76370769aea" containerID="bfa90da4be9ea93116d5370d369de791a849b41aa07876f53706ba7b7b6104d3" exitCode=0 Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.260590 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wv724" event={"ID":"a76ec049-d99a-40be-9fec-f76370769aea","Type":"ContainerDied","Data":"bfa90da4be9ea93116d5370d369de791a849b41aa07876f53706ba7b7b6104d3"} Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.260614 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wv724" event={"ID":"a76ec049-d99a-40be-9fec-f76370769aea","Type":"ContainerDied","Data":"22199ecb6471a38e9629e3062982ae9e97c1e964f6990a137bfde127b6a9fd25"} Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.260618 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wv724" Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.262072 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6sdp5" event={"ID":"ac4d223b-b4ca-485a-aa22-1fbdb0a3228e","Type":"ContainerStarted","Data":"2b40a3835c2ab26ec3496285fee6f7c7de83eee655a4ba0bbc4b61e23bc65157"} Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.263934 4743 scope.go:117] "RemoveContainer" containerID="44969f48526c216e4e5a776650fd0c12643c853832082371e5a795f9d05bac57" Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.268872 4743 generic.go:334] "Generic (PLEG): container finished" podID="9b61be23-4db1-4316-a840-1aaff04b664e" containerID="b37aeab634388c837bad5b241de7fe44a78ec8b7355b3b3329f1ffe2961e9197" exitCode=0 Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.268929 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m8dtj" event={"ID":"9b61be23-4db1-4316-a840-1aaff04b664e","Type":"ContainerDied","Data":"b37aeab634388c837bad5b241de7fe44a78ec8b7355b3b3329f1ffe2961e9197"} Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.268981 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m8dtj" event={"ID":"9b61be23-4db1-4316-a840-1aaff04b664e","Type":"ContainerDied","Data":"cf6613bc3f279ff6dc9247da3dd2e62d9118eecc1be3eca21c67014d997f8bf1"} Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.269088 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m8dtj" Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.286008 4743 scope.go:117] "RemoveContainer" containerID="b31162247bc35d0410fca82d8e557e75789bc46b0e63796578eae8bebf428ed2" Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.310306 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wv724"] Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.313933 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wv724"] Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.326771 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bpk8r"] Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.332674 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bpk8r"] Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.335951 4743 scope.go:117] "RemoveContainer" containerID="e23689989ec3a71246d088d20d3b38708cf5958859ee7c5db105b7eb708456ae" Jan 22 13:50:05 crc kubenswrapper[4743]: E0122 13:50:05.336509 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e23689989ec3a71246d088d20d3b38708cf5958859ee7c5db105b7eb708456ae\": container with ID starting with e23689989ec3a71246d088d20d3b38708cf5958859ee7c5db105b7eb708456ae not found: ID does not exist" containerID="e23689989ec3a71246d088d20d3b38708cf5958859ee7c5db105b7eb708456ae" Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.336545 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e23689989ec3a71246d088d20d3b38708cf5958859ee7c5db105b7eb708456ae"} err="failed to get container status \"e23689989ec3a71246d088d20d3b38708cf5958859ee7c5db105b7eb708456ae\": rpc error: code = NotFound desc = could not find container \"e23689989ec3a71246d088d20d3b38708cf5958859ee7c5db105b7eb708456ae\": container with ID starting with e23689989ec3a71246d088d20d3b38708cf5958859ee7c5db105b7eb708456ae not found: ID does not exist" Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.336583 4743 scope.go:117] "RemoveContainer" containerID="44969f48526c216e4e5a776650fd0c12643c853832082371e5a795f9d05bac57" Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.337602 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fqtsl"] Jan 22 13:50:05 crc kubenswrapper[4743]: E0122 13:50:05.338123 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44969f48526c216e4e5a776650fd0c12643c853832082371e5a795f9d05bac57\": container with ID starting with 44969f48526c216e4e5a776650fd0c12643c853832082371e5a795f9d05bac57 not found: ID does not exist" containerID="44969f48526c216e4e5a776650fd0c12643c853832082371e5a795f9d05bac57" Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.338170 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44969f48526c216e4e5a776650fd0c12643c853832082371e5a795f9d05bac57"} err="failed to get container status \"44969f48526c216e4e5a776650fd0c12643c853832082371e5a795f9d05bac57\": rpc error: code = NotFound desc = could not find container \"44969f48526c216e4e5a776650fd0c12643c853832082371e5a795f9d05bac57\": container with ID starting with 
44969f48526c216e4e5a776650fd0c12643c853832082371e5a795f9d05bac57 not found: ID does not exist" Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.338199 4743 scope.go:117] "RemoveContainer" containerID="b31162247bc35d0410fca82d8e557e75789bc46b0e63796578eae8bebf428ed2" Jan 22 13:50:05 crc kubenswrapper[4743]: E0122 13:50:05.339368 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b31162247bc35d0410fca82d8e557e75789bc46b0e63796578eae8bebf428ed2\": container with ID starting with b31162247bc35d0410fca82d8e557e75789bc46b0e63796578eae8bebf428ed2 not found: ID does not exist" containerID="b31162247bc35d0410fca82d8e557e75789bc46b0e63796578eae8bebf428ed2" Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.339392 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b31162247bc35d0410fca82d8e557e75789bc46b0e63796578eae8bebf428ed2"} err="failed to get container status \"b31162247bc35d0410fca82d8e557e75789bc46b0e63796578eae8bebf428ed2\": rpc error: code = NotFound desc = could not find container \"b31162247bc35d0410fca82d8e557e75789bc46b0e63796578eae8bebf428ed2\": container with ID starting with b31162247bc35d0410fca82d8e557e75789bc46b0e63796578eae8bebf428ed2 not found: ID does not exist" Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.339408 4743 scope.go:117] "RemoveContainer" containerID="448aaf6d03c3a796a48f3e69b960acd11e9a7c7a814dcefbc4d34fe480f3a41c" Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.347843 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fqtsl"] Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.357125 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-fjjf5"] Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.358027 4743 scope.go:117] "RemoveContainer" containerID="448aaf6d03c3a796a48f3e69b960acd11e9a7c7a814dcefbc4d34fe480f3a41c" Jan 22 13:50:05 crc kubenswrapper[4743]: E0122 13:50:05.358372 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"448aaf6d03c3a796a48f3e69b960acd11e9a7c7a814dcefbc4d34fe480f3a41c\": container with ID starting with 448aaf6d03c3a796a48f3e69b960acd11e9a7c7a814dcefbc4d34fe480f3a41c not found: ID does not exist" containerID="448aaf6d03c3a796a48f3e69b960acd11e9a7c7a814dcefbc4d34fe480f3a41c" Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.358428 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"448aaf6d03c3a796a48f3e69b960acd11e9a7c7a814dcefbc4d34fe480f3a41c"} err="failed to get container status \"448aaf6d03c3a796a48f3e69b960acd11e9a7c7a814dcefbc4d34fe480f3a41c\": rpc error: code = NotFound desc = could not find container \"448aaf6d03c3a796a48f3e69b960acd11e9a7c7a814dcefbc4d34fe480f3a41c\": container with ID starting with 448aaf6d03c3a796a48f3e69b960acd11e9a7c7a814dcefbc4d34fe480f3a41c not found: ID does not exist" Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.358455 4743 scope.go:117] "RemoveContainer" containerID="79ef234f4083f3ab861db4363f2a1c3087be779c063eae70cc3bfd40331bbdf7" Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.361097 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-fjjf5"] Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.364406 4743 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m8dtj"] Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.368433 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-m8dtj"] Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.372295 4743 scope.go:117] "RemoveContainer" containerID="daec478d4db742250ac5a25a830440287c1522fa9984f5ce20aa817186173bc7" Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.387879 4743 scope.go:117] "RemoveContainer" containerID="660a2d34c0d590911d2758ee8e0940bba8a334386e3c5bf766958dc9c7f6a6d8" Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.408648 4743 scope.go:117] "RemoveContainer" containerID="79ef234f4083f3ab861db4363f2a1c3087be779c063eae70cc3bfd40331bbdf7" Jan 22 13:50:05 crc kubenswrapper[4743]: E0122 13:50:05.409196 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79ef234f4083f3ab861db4363f2a1c3087be779c063eae70cc3bfd40331bbdf7\": container with ID starting with 79ef234f4083f3ab861db4363f2a1c3087be779c063eae70cc3bfd40331bbdf7 not found: ID does not exist" containerID="79ef234f4083f3ab861db4363f2a1c3087be779c063eae70cc3bfd40331bbdf7" Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.409227 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79ef234f4083f3ab861db4363f2a1c3087be779c063eae70cc3bfd40331bbdf7"} err="failed to get container status \"79ef234f4083f3ab861db4363f2a1c3087be779c063eae70cc3bfd40331bbdf7\": rpc error: code = NotFound desc = could not find container \"79ef234f4083f3ab861db4363f2a1c3087be779c063eae70cc3bfd40331bbdf7\": container with ID starting with 79ef234f4083f3ab861db4363f2a1c3087be779c063eae70cc3bfd40331bbdf7 not found: ID does not exist" Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.409255 4743 scope.go:117] "RemoveContainer" containerID="daec478d4db742250ac5a25a830440287c1522fa9984f5ce20aa817186173bc7" Jan 22 13:50:05 crc kubenswrapper[4743]: E0122 13:50:05.409602 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"daec478d4db742250ac5a25a830440287c1522fa9984f5ce20aa817186173bc7\": container with ID starting with daec478d4db742250ac5a25a830440287c1522fa9984f5ce20aa817186173bc7 not found: ID does not exist" containerID="daec478d4db742250ac5a25a830440287c1522fa9984f5ce20aa817186173bc7" Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.409619 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"daec478d4db742250ac5a25a830440287c1522fa9984f5ce20aa817186173bc7"} err="failed to get container status \"daec478d4db742250ac5a25a830440287c1522fa9984f5ce20aa817186173bc7\": rpc error: code = NotFound desc = could not find container \"daec478d4db742250ac5a25a830440287c1522fa9984f5ce20aa817186173bc7\": container with ID starting with daec478d4db742250ac5a25a830440287c1522fa9984f5ce20aa817186173bc7 not found: ID does not exist" Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.409635 4743 scope.go:117] "RemoveContainer" containerID="660a2d34c0d590911d2758ee8e0940bba8a334386e3c5bf766958dc9c7f6a6d8" Jan 22 13:50:05 crc kubenswrapper[4743]: E0122 13:50:05.409914 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"660a2d34c0d590911d2758ee8e0940bba8a334386e3c5bf766958dc9c7f6a6d8\": container with ID 
starting with 660a2d34c0d590911d2758ee8e0940bba8a334386e3c5bf766958dc9c7f6a6d8 not found: ID does not exist" containerID="660a2d34c0d590911d2758ee8e0940bba8a334386e3c5bf766958dc9c7f6a6d8" Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.409934 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"660a2d34c0d590911d2758ee8e0940bba8a334386e3c5bf766958dc9c7f6a6d8"} err="failed to get container status \"660a2d34c0d590911d2758ee8e0940bba8a334386e3c5bf766958dc9c7f6a6d8\": rpc error: code = NotFound desc = could not find container \"660a2d34c0d590911d2758ee8e0940bba8a334386e3c5bf766958dc9c7f6a6d8\": container with ID starting with 660a2d34c0d590911d2758ee8e0940bba8a334386e3c5bf766958dc9c7f6a6d8 not found: ID does not exist" Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.409985 4743 scope.go:117] "RemoveContainer" containerID="bfa90da4be9ea93116d5370d369de791a849b41aa07876f53706ba7b7b6104d3" Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.454609 4743 scope.go:117] "RemoveContainer" containerID="87f5477118c7f54202f9a5aaffae47d6481ae51bd91d8b7024aab19cfd608346" Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.472440 4743 scope.go:117] "RemoveContainer" containerID="769a7722b947988dc951ecb832dbc39381fb0227e5b0a1a5cbd7bd0b134dc066" Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.490062 4743 scope.go:117] "RemoveContainer" containerID="bfa90da4be9ea93116d5370d369de791a849b41aa07876f53706ba7b7b6104d3" Jan 22 13:50:05 crc kubenswrapper[4743]: E0122 13:50:05.490496 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfa90da4be9ea93116d5370d369de791a849b41aa07876f53706ba7b7b6104d3\": container with ID starting with bfa90da4be9ea93116d5370d369de791a849b41aa07876f53706ba7b7b6104d3 not found: ID does not exist" containerID="bfa90da4be9ea93116d5370d369de791a849b41aa07876f53706ba7b7b6104d3" Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.490540 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfa90da4be9ea93116d5370d369de791a849b41aa07876f53706ba7b7b6104d3"} err="failed to get container status \"bfa90da4be9ea93116d5370d369de791a849b41aa07876f53706ba7b7b6104d3\": rpc error: code = NotFound desc = could not find container \"bfa90da4be9ea93116d5370d369de791a849b41aa07876f53706ba7b7b6104d3\": container with ID starting with bfa90da4be9ea93116d5370d369de791a849b41aa07876f53706ba7b7b6104d3 not found: ID does not exist" Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.490569 4743 scope.go:117] "RemoveContainer" containerID="87f5477118c7f54202f9a5aaffae47d6481ae51bd91d8b7024aab19cfd608346" Jan 22 13:50:05 crc kubenswrapper[4743]: E0122 13:50:05.490895 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87f5477118c7f54202f9a5aaffae47d6481ae51bd91d8b7024aab19cfd608346\": container with ID starting with 87f5477118c7f54202f9a5aaffae47d6481ae51bd91d8b7024aab19cfd608346 not found: ID does not exist" containerID="87f5477118c7f54202f9a5aaffae47d6481ae51bd91d8b7024aab19cfd608346" Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.490917 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87f5477118c7f54202f9a5aaffae47d6481ae51bd91d8b7024aab19cfd608346"} err="failed to get container status \"87f5477118c7f54202f9a5aaffae47d6481ae51bd91d8b7024aab19cfd608346\": rpc error: code = 
NotFound desc = could not find container \"87f5477118c7f54202f9a5aaffae47d6481ae51bd91d8b7024aab19cfd608346\": container with ID starting with 87f5477118c7f54202f9a5aaffae47d6481ae51bd91d8b7024aab19cfd608346 not found: ID does not exist" Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.490972 4743 scope.go:117] "RemoveContainer" containerID="769a7722b947988dc951ecb832dbc39381fb0227e5b0a1a5cbd7bd0b134dc066" Jan 22 13:50:05 crc kubenswrapper[4743]: E0122 13:50:05.491176 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"769a7722b947988dc951ecb832dbc39381fb0227e5b0a1a5cbd7bd0b134dc066\": container with ID starting with 769a7722b947988dc951ecb832dbc39381fb0227e5b0a1a5cbd7bd0b134dc066 not found: ID does not exist" containerID="769a7722b947988dc951ecb832dbc39381fb0227e5b0a1a5cbd7bd0b134dc066" Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.491199 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"769a7722b947988dc951ecb832dbc39381fb0227e5b0a1a5cbd7bd0b134dc066"} err="failed to get container status \"769a7722b947988dc951ecb832dbc39381fb0227e5b0a1a5cbd7bd0b134dc066\": rpc error: code = NotFound desc = could not find container \"769a7722b947988dc951ecb832dbc39381fb0227e5b0a1a5cbd7bd0b134dc066\": container with ID starting with 769a7722b947988dc951ecb832dbc39381fb0227e5b0a1a5cbd7bd0b134dc066 not found: ID does not exist" Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.491211 4743 scope.go:117] "RemoveContainer" containerID="b37aeab634388c837bad5b241de7fe44a78ec8b7355b3b3329f1ffe2961e9197" Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.506077 4743 scope.go:117] "RemoveContainer" containerID="3bf93aa6cd8df69662b96a010a60709ab834ebbde2081eae13007790a4becf55" Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.519573 4743 scope.go:117] "RemoveContainer" containerID="708794e9f4c91154e4f9aab8e25cf5d017532bdcdbe725852bc144106be62711" Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.533098 4743 scope.go:117] "RemoveContainer" containerID="b37aeab634388c837bad5b241de7fe44a78ec8b7355b3b3329f1ffe2961e9197" Jan 22 13:50:05 crc kubenswrapper[4743]: E0122 13:50:05.534182 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b37aeab634388c837bad5b241de7fe44a78ec8b7355b3b3329f1ffe2961e9197\": container with ID starting with b37aeab634388c837bad5b241de7fe44a78ec8b7355b3b3329f1ffe2961e9197 not found: ID does not exist" containerID="b37aeab634388c837bad5b241de7fe44a78ec8b7355b3b3329f1ffe2961e9197" Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.534296 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b37aeab634388c837bad5b241de7fe44a78ec8b7355b3b3329f1ffe2961e9197"} err="failed to get container status \"b37aeab634388c837bad5b241de7fe44a78ec8b7355b3b3329f1ffe2961e9197\": rpc error: code = NotFound desc = could not find container \"b37aeab634388c837bad5b241de7fe44a78ec8b7355b3b3329f1ffe2961e9197\": container with ID starting with b37aeab634388c837bad5b241de7fe44a78ec8b7355b3b3329f1ffe2961e9197 not found: ID does not exist" Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.534358 4743 scope.go:117] "RemoveContainer" containerID="3bf93aa6cd8df69662b96a010a60709ab834ebbde2081eae13007790a4becf55" Jan 22 13:50:05 crc kubenswrapper[4743]: E0122 13:50:05.534964 4743 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"3bf93aa6cd8df69662b96a010a60709ab834ebbde2081eae13007790a4becf55\": container with ID starting with 3bf93aa6cd8df69662b96a010a60709ab834ebbde2081eae13007790a4becf55 not found: ID does not exist" containerID="3bf93aa6cd8df69662b96a010a60709ab834ebbde2081eae13007790a4becf55" Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.535031 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bf93aa6cd8df69662b96a010a60709ab834ebbde2081eae13007790a4becf55"} err="failed to get container status \"3bf93aa6cd8df69662b96a010a60709ab834ebbde2081eae13007790a4becf55\": rpc error: code = NotFound desc = could not find container \"3bf93aa6cd8df69662b96a010a60709ab834ebbde2081eae13007790a4becf55\": container with ID starting with 3bf93aa6cd8df69662b96a010a60709ab834ebbde2081eae13007790a4becf55 not found: ID does not exist" Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.535087 4743 scope.go:117] "RemoveContainer" containerID="708794e9f4c91154e4f9aab8e25cf5d017532bdcdbe725852bc144106be62711" Jan 22 13:50:05 crc kubenswrapper[4743]: E0122 13:50:05.535649 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"708794e9f4c91154e4f9aab8e25cf5d017532bdcdbe725852bc144106be62711\": container with ID starting with 708794e9f4c91154e4f9aab8e25cf5d017532bdcdbe725852bc144106be62711 not found: ID does not exist" containerID="708794e9f4c91154e4f9aab8e25cf5d017532bdcdbe725852bc144106be62711" Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.535675 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"708794e9f4c91154e4f9aab8e25cf5d017532bdcdbe725852bc144106be62711"} err="failed to get container status \"708794e9f4c91154e4f9aab8e25cf5d017532bdcdbe725852bc144106be62711\": rpc error: code = NotFound desc = could not find container \"708794e9f4c91154e4f9aab8e25cf5d017532bdcdbe725852bc144106be62711\": container with ID starting with 708794e9f4c91154e4f9aab8e25cf5d017532bdcdbe725852bc144106be62711 not found: ID does not exist" Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.753540 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f60519c-a85e-483e-ac46-8cde2dbbd166" path="/var/lib/kubelet/pods/6f60519c-a85e-483e-ac46-8cde2dbbd166/volumes" Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.754008 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b61be23-4db1-4316-a840-1aaff04b664e" path="/var/lib/kubelet/pods/9b61be23-4db1-4316-a840-1aaff04b664e/volumes" Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.754539 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a660baca-6ead-4a0f-959b-24b3badc4a7c" path="/var/lib/kubelet/pods/a660baca-6ead-4a0f-959b-24b3badc4a7c/volumes" Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.755140 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a76ec049-d99a-40be-9fec-f76370769aea" path="/var/lib/kubelet/pods/a76ec049-d99a-40be-9fec-f76370769aea/volumes" Jan 22 13:50:05 crc kubenswrapper[4743]: I0122 13:50:05.755681 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0e6768a-f15d-4daf-9e12-950e7d3b9552" path="/var/lib/kubelet/pods/c0e6768a-f15d-4daf-9e12-950e7d3b9552/volumes" Jan 22 13:50:06 crc kubenswrapper[4743]: I0122 13:50:06.277858 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/marketplace-operator-79b997595-6sdp5" event={"ID":"ac4d223b-b4ca-485a-aa22-1fbdb0a3228e","Type":"ContainerStarted","Data":"50a5cf48df9e3e39132770161410a7c3182b5d5971b5ba7190d85628f382b7c5"} Jan 22 13:50:06 crc kubenswrapper[4743]: I0122 13:50:06.278078 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-6sdp5" Jan 22 13:50:06 crc kubenswrapper[4743]: I0122 13:50:06.280858 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-6sdp5" Jan 22 13:50:06 crc kubenswrapper[4743]: I0122 13:50:06.300426 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-6sdp5" podStartSLOduration=2.300382994 podStartE2EDuration="2.300382994s" podCreationTimestamp="2026-01-22 13:50:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:50:06.296014314 +0000 UTC m=+242.851057507" watchObservedRunningTime="2026-01-22 13:50:06.300382994 +0000 UTC m=+242.855426167" Jan 22 13:50:06 crc kubenswrapper[4743]: I0122 13:50:06.703252 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vsg7v"] Jan 22 13:50:06 crc kubenswrapper[4743]: E0122 13:50:06.703707 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a76ec049-d99a-40be-9fec-f76370769aea" containerName="extract-content" Jan 22 13:50:06 crc kubenswrapper[4743]: I0122 13:50:06.703721 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="a76ec049-d99a-40be-9fec-f76370769aea" containerName="extract-content" Jan 22 13:50:06 crc kubenswrapper[4743]: E0122 13:50:06.703731 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0e6768a-f15d-4daf-9e12-950e7d3b9552" containerName="registry-server" Jan 22 13:50:06 crc kubenswrapper[4743]: I0122 13:50:06.703739 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0e6768a-f15d-4daf-9e12-950e7d3b9552" containerName="registry-server" Jan 22 13:50:06 crc kubenswrapper[4743]: E0122 13:50:06.703751 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a660baca-6ead-4a0f-959b-24b3badc4a7c" containerName="registry-server" Jan 22 13:50:06 crc kubenswrapper[4743]: I0122 13:50:06.703758 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="a660baca-6ead-4a0f-959b-24b3badc4a7c" containerName="registry-server" Jan 22 13:50:06 crc kubenswrapper[4743]: E0122 13:50:06.703768 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b61be23-4db1-4316-a840-1aaff04b664e" containerName="extract-utilities" Jan 22 13:50:06 crc kubenswrapper[4743]: I0122 13:50:06.703775 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b61be23-4db1-4316-a840-1aaff04b664e" containerName="extract-utilities" Jan 22 13:50:06 crc kubenswrapper[4743]: E0122 13:50:06.703807 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a76ec049-d99a-40be-9fec-f76370769aea" containerName="extract-utilities" Jan 22 13:50:06 crc kubenswrapper[4743]: I0122 13:50:06.703816 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="a76ec049-d99a-40be-9fec-f76370769aea" containerName="extract-utilities" Jan 22 13:50:06 crc kubenswrapper[4743]: E0122 13:50:06.703824 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a76ec049-d99a-40be-9fec-f76370769aea" 
containerName="registry-server" Jan 22 13:50:06 crc kubenswrapper[4743]: I0122 13:50:06.703831 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="a76ec049-d99a-40be-9fec-f76370769aea" containerName="registry-server" Jan 22 13:50:06 crc kubenswrapper[4743]: E0122 13:50:06.703840 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f60519c-a85e-483e-ac46-8cde2dbbd166" containerName="marketplace-operator" Jan 22 13:50:06 crc kubenswrapper[4743]: I0122 13:50:06.703897 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f60519c-a85e-483e-ac46-8cde2dbbd166" containerName="marketplace-operator" Jan 22 13:50:06 crc kubenswrapper[4743]: E0122 13:50:06.703909 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a660baca-6ead-4a0f-959b-24b3badc4a7c" containerName="extract-utilities" Jan 22 13:50:06 crc kubenswrapper[4743]: I0122 13:50:06.703916 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="a660baca-6ead-4a0f-959b-24b3badc4a7c" containerName="extract-utilities" Jan 22 13:50:06 crc kubenswrapper[4743]: E0122 13:50:06.703924 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0e6768a-f15d-4daf-9e12-950e7d3b9552" containerName="extract-content" Jan 22 13:50:06 crc kubenswrapper[4743]: I0122 13:50:06.703931 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0e6768a-f15d-4daf-9e12-950e7d3b9552" containerName="extract-content" Jan 22 13:50:06 crc kubenswrapper[4743]: E0122 13:50:06.703945 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0e6768a-f15d-4daf-9e12-950e7d3b9552" containerName="extract-utilities" Jan 22 13:50:06 crc kubenswrapper[4743]: I0122 13:50:06.703952 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0e6768a-f15d-4daf-9e12-950e7d3b9552" containerName="extract-utilities" Jan 22 13:50:06 crc kubenswrapper[4743]: E0122 13:50:06.703962 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a660baca-6ead-4a0f-959b-24b3badc4a7c" containerName="extract-content" Jan 22 13:50:06 crc kubenswrapper[4743]: I0122 13:50:06.703969 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="a660baca-6ead-4a0f-959b-24b3badc4a7c" containerName="extract-content" Jan 22 13:50:06 crc kubenswrapper[4743]: E0122 13:50:06.703980 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b61be23-4db1-4316-a840-1aaff04b664e" containerName="registry-server" Jan 22 13:50:06 crc kubenswrapper[4743]: I0122 13:50:06.703987 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b61be23-4db1-4316-a840-1aaff04b664e" containerName="registry-server" Jan 22 13:50:06 crc kubenswrapper[4743]: E0122 13:50:06.703996 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b61be23-4db1-4316-a840-1aaff04b664e" containerName="extract-content" Jan 22 13:50:06 crc kubenswrapper[4743]: I0122 13:50:06.704003 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b61be23-4db1-4316-a840-1aaff04b664e" containerName="extract-content" Jan 22 13:50:06 crc kubenswrapper[4743]: I0122 13:50:06.704110 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f60519c-a85e-483e-ac46-8cde2dbbd166" containerName="marketplace-operator" Jan 22 13:50:06 crc kubenswrapper[4743]: I0122 13:50:06.704122 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0e6768a-f15d-4daf-9e12-950e7d3b9552" containerName="registry-server" Jan 22 13:50:06 crc kubenswrapper[4743]: I0122 13:50:06.704132 4743 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="a660baca-6ead-4a0f-959b-24b3badc4a7c" containerName="registry-server" Jan 22 13:50:06 crc kubenswrapper[4743]: I0122 13:50:06.704143 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="a76ec049-d99a-40be-9fec-f76370769aea" containerName="registry-server" Jan 22 13:50:06 crc kubenswrapper[4743]: I0122 13:50:06.704156 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b61be23-4db1-4316-a840-1aaff04b664e" containerName="registry-server" Jan 22 13:50:06 crc kubenswrapper[4743]: I0122 13:50:06.705353 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vsg7v" Jan 22 13:50:06 crc kubenswrapper[4743]: I0122 13:50:06.707370 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 22 13:50:06 crc kubenswrapper[4743]: I0122 13:50:06.714011 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vsg7v"] Jan 22 13:50:06 crc kubenswrapper[4743]: I0122 13:50:06.875466 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvvfw\" (UniqueName: \"kubernetes.io/projected/edde6004-a1f7-4818-b66a-02137d1f3749-kube-api-access-xvvfw\") pod \"certified-operators-vsg7v\" (UID: \"edde6004-a1f7-4818-b66a-02137d1f3749\") " pod="openshift-marketplace/certified-operators-vsg7v" Jan 22 13:50:06 crc kubenswrapper[4743]: I0122 13:50:06.875536 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edde6004-a1f7-4818-b66a-02137d1f3749-catalog-content\") pod \"certified-operators-vsg7v\" (UID: \"edde6004-a1f7-4818-b66a-02137d1f3749\") " pod="openshift-marketplace/certified-operators-vsg7v" Jan 22 13:50:06 crc kubenswrapper[4743]: I0122 13:50:06.875560 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edde6004-a1f7-4818-b66a-02137d1f3749-utilities\") pod \"certified-operators-vsg7v\" (UID: \"edde6004-a1f7-4818-b66a-02137d1f3749\") " pod="openshift-marketplace/certified-operators-vsg7v" Jan 22 13:50:06 crc kubenswrapper[4743]: I0122 13:50:06.900246 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-slrnr"] Jan 22 13:50:06 crc kubenswrapper[4743]: I0122 13:50:06.901331 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-slrnr" Jan 22 13:50:06 crc kubenswrapper[4743]: I0122 13:50:06.903960 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 22 13:50:06 crc kubenswrapper[4743]: I0122 13:50:06.914914 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-slrnr"] Jan 22 13:50:06 crc kubenswrapper[4743]: I0122 13:50:06.977052 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edde6004-a1f7-4818-b66a-02137d1f3749-catalog-content\") pod \"certified-operators-vsg7v\" (UID: \"edde6004-a1f7-4818-b66a-02137d1f3749\") " pod="openshift-marketplace/certified-operators-vsg7v" Jan 22 13:50:06 crc kubenswrapper[4743]: I0122 13:50:06.977412 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edde6004-a1f7-4818-b66a-02137d1f3749-utilities\") pod \"certified-operators-vsg7v\" (UID: \"edde6004-a1f7-4818-b66a-02137d1f3749\") " pod="openshift-marketplace/certified-operators-vsg7v" Jan 22 13:50:06 crc kubenswrapper[4743]: I0122 13:50:06.977529 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d37a61b7-0edd-4c4d-8fe5-f1cc4f9ba472-utilities\") pod \"community-operators-slrnr\" (UID: \"d37a61b7-0edd-4c4d-8fe5-f1cc4f9ba472\") " pod="openshift-marketplace/community-operators-slrnr" Jan 22 13:50:06 crc kubenswrapper[4743]: I0122 13:50:06.977609 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmlt8\" (UniqueName: \"kubernetes.io/projected/d37a61b7-0edd-4c4d-8fe5-f1cc4f9ba472-kube-api-access-cmlt8\") pod \"community-operators-slrnr\" (UID: \"d37a61b7-0edd-4c4d-8fe5-f1cc4f9ba472\") " pod="openshift-marketplace/community-operators-slrnr" Jan 22 13:50:06 crc kubenswrapper[4743]: I0122 13:50:06.977670 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edde6004-a1f7-4818-b66a-02137d1f3749-catalog-content\") pod \"certified-operators-vsg7v\" (UID: \"edde6004-a1f7-4818-b66a-02137d1f3749\") " pod="openshift-marketplace/certified-operators-vsg7v" Jan 22 13:50:06 crc kubenswrapper[4743]: I0122 13:50:06.977831 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d37a61b7-0edd-4c4d-8fe5-f1cc4f9ba472-catalog-content\") pod \"community-operators-slrnr\" (UID: \"d37a61b7-0edd-4c4d-8fe5-f1cc4f9ba472\") " pod="openshift-marketplace/community-operators-slrnr" Jan 22 13:50:06 crc kubenswrapper[4743]: I0122 13:50:06.977944 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edde6004-a1f7-4818-b66a-02137d1f3749-utilities\") pod \"certified-operators-vsg7v\" (UID: \"edde6004-a1f7-4818-b66a-02137d1f3749\") " pod="openshift-marketplace/certified-operators-vsg7v" Jan 22 13:50:06 crc kubenswrapper[4743]: I0122 13:50:06.978046 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvvfw\" (UniqueName: \"kubernetes.io/projected/edde6004-a1f7-4818-b66a-02137d1f3749-kube-api-access-xvvfw\") pod \"certified-operators-vsg7v\" (UID: 
\"edde6004-a1f7-4818-b66a-02137d1f3749\") " pod="openshift-marketplace/certified-operators-vsg7v" Jan 22 13:50:06 crc kubenswrapper[4743]: I0122 13:50:06.997297 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvvfw\" (UniqueName: \"kubernetes.io/projected/edde6004-a1f7-4818-b66a-02137d1f3749-kube-api-access-xvvfw\") pod \"certified-operators-vsg7v\" (UID: \"edde6004-a1f7-4818-b66a-02137d1f3749\") " pod="openshift-marketplace/certified-operators-vsg7v" Jan 22 13:50:07 crc kubenswrapper[4743]: I0122 13:50:07.020862 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vsg7v" Jan 22 13:50:07 crc kubenswrapper[4743]: I0122 13:50:07.079032 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d37a61b7-0edd-4c4d-8fe5-f1cc4f9ba472-utilities\") pod \"community-operators-slrnr\" (UID: \"d37a61b7-0edd-4c4d-8fe5-f1cc4f9ba472\") " pod="openshift-marketplace/community-operators-slrnr" Jan 22 13:50:07 crc kubenswrapper[4743]: I0122 13:50:07.079085 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmlt8\" (UniqueName: \"kubernetes.io/projected/d37a61b7-0edd-4c4d-8fe5-f1cc4f9ba472-kube-api-access-cmlt8\") pod \"community-operators-slrnr\" (UID: \"d37a61b7-0edd-4c4d-8fe5-f1cc4f9ba472\") " pod="openshift-marketplace/community-operators-slrnr" Jan 22 13:50:07 crc kubenswrapper[4743]: I0122 13:50:07.079165 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d37a61b7-0edd-4c4d-8fe5-f1cc4f9ba472-catalog-content\") pod \"community-operators-slrnr\" (UID: \"d37a61b7-0edd-4c4d-8fe5-f1cc4f9ba472\") " pod="openshift-marketplace/community-operators-slrnr" Jan 22 13:50:07 crc kubenswrapper[4743]: I0122 13:50:07.080371 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d37a61b7-0edd-4c4d-8fe5-f1cc4f9ba472-catalog-content\") pod \"community-operators-slrnr\" (UID: \"d37a61b7-0edd-4c4d-8fe5-f1cc4f9ba472\") " pod="openshift-marketplace/community-operators-slrnr" Jan 22 13:50:07 crc kubenswrapper[4743]: I0122 13:50:07.080625 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d37a61b7-0edd-4c4d-8fe5-f1cc4f9ba472-utilities\") pod \"community-operators-slrnr\" (UID: \"d37a61b7-0edd-4c4d-8fe5-f1cc4f9ba472\") " pod="openshift-marketplace/community-operators-slrnr" Jan 22 13:50:07 crc kubenswrapper[4743]: I0122 13:50:07.102033 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmlt8\" (UniqueName: \"kubernetes.io/projected/d37a61b7-0edd-4c4d-8fe5-f1cc4f9ba472-kube-api-access-cmlt8\") pod \"community-operators-slrnr\" (UID: \"d37a61b7-0edd-4c4d-8fe5-f1cc4f9ba472\") " pod="openshift-marketplace/community-operators-slrnr" Jan 22 13:50:07 crc kubenswrapper[4743]: I0122 13:50:07.223325 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-slrnr" Jan 22 13:50:07 crc kubenswrapper[4743]: I0122 13:50:07.402762 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vsg7v"] Jan 22 13:50:07 crc kubenswrapper[4743]: W0122 13:50:07.410477 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podedde6004_a1f7_4818_b66a_02137d1f3749.slice/crio-bc67eb9296f6e16917d3e33a89543d0f70950423fc3aaeffb41765686982908a WatchSource:0}: Error finding container bc67eb9296f6e16917d3e33a89543d0f70950423fc3aaeffb41765686982908a: Status 404 returned error can't find the container with id bc67eb9296f6e16917d3e33a89543d0f70950423fc3aaeffb41765686982908a Jan 22 13:50:07 crc kubenswrapper[4743]: I0122 13:50:07.603069 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-slrnr"] Jan 22 13:50:07 crc kubenswrapper[4743]: W0122 13:50:07.610669 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd37a61b7_0edd_4c4d_8fe5_f1cc4f9ba472.slice/crio-db94eac009faae1f2c1b1ea4936698062254296554f14902bfb1201d938ac614 WatchSource:0}: Error finding container db94eac009faae1f2c1b1ea4936698062254296554f14902bfb1201d938ac614: Status 404 returned error can't find the container with id db94eac009faae1f2c1b1ea4936698062254296554f14902bfb1201d938ac614 Jan 22 13:50:08 crc kubenswrapper[4743]: I0122 13:50:08.293413 4743 generic.go:334] "Generic (PLEG): container finished" podID="edde6004-a1f7-4818-b66a-02137d1f3749" containerID="87a66a93abce709f55104db4d67c2b938613dec676ba11f0769a6ade5350e456" exitCode=0 Jan 22 13:50:08 crc kubenswrapper[4743]: I0122 13:50:08.293501 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vsg7v" event={"ID":"edde6004-a1f7-4818-b66a-02137d1f3749","Type":"ContainerDied","Data":"87a66a93abce709f55104db4d67c2b938613dec676ba11f0769a6ade5350e456"} Jan 22 13:50:08 crc kubenswrapper[4743]: I0122 13:50:08.293532 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vsg7v" event={"ID":"edde6004-a1f7-4818-b66a-02137d1f3749","Type":"ContainerStarted","Data":"bc67eb9296f6e16917d3e33a89543d0f70950423fc3aaeffb41765686982908a"} Jan 22 13:50:08 crc kubenswrapper[4743]: I0122 13:50:08.294842 4743 generic.go:334] "Generic (PLEG): container finished" podID="d37a61b7-0edd-4c4d-8fe5-f1cc4f9ba472" containerID="f0b6cfd79ddff408dab323022311db9ecad0bde2b62c12289ab9a2ee1d5b03f5" exitCode=0 Jan 22 13:50:08 crc kubenswrapper[4743]: I0122 13:50:08.294905 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-slrnr" event={"ID":"d37a61b7-0edd-4c4d-8fe5-f1cc4f9ba472","Type":"ContainerDied","Data":"f0b6cfd79ddff408dab323022311db9ecad0bde2b62c12289ab9a2ee1d5b03f5"} Jan 22 13:50:08 crc kubenswrapper[4743]: I0122 13:50:08.294931 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-slrnr" event={"ID":"d37a61b7-0edd-4c4d-8fe5-f1cc4f9ba472","Type":"ContainerStarted","Data":"db94eac009faae1f2c1b1ea4936698062254296554f14902bfb1201d938ac614"} Jan 22 13:50:08 crc kubenswrapper[4743]: I0122 13:50:08.525027 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d794b4758-vkhv7"] Jan 22 13:50:08 crc kubenswrapper[4743]: I0122 
13:50:08.525595 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-d794b4758-vkhv7" podUID="cf9cdaec-7d8f-4958-bc45-bdd843e96b90" containerName="route-controller-manager" containerID="cri-o://3054f2f8a3740ff06bc7194c48bcf1c1c3c2a9988c678acd923a008ac1680463" gracePeriod=30 Jan 22 13:50:08 crc kubenswrapper[4743]: I0122 13:50:08.939110 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-d794b4758-vkhv7" Jan 22 13:50:09 crc kubenswrapper[4743]: I0122 13:50:09.015493 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8xs5\" (UniqueName: \"kubernetes.io/projected/cf9cdaec-7d8f-4958-bc45-bdd843e96b90-kube-api-access-t8xs5\") pod \"cf9cdaec-7d8f-4958-bc45-bdd843e96b90\" (UID: \"cf9cdaec-7d8f-4958-bc45-bdd843e96b90\") " Jan 22 13:50:09 crc kubenswrapper[4743]: I0122 13:50:09.015565 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cf9cdaec-7d8f-4958-bc45-bdd843e96b90-client-ca\") pod \"cf9cdaec-7d8f-4958-bc45-bdd843e96b90\" (UID: \"cf9cdaec-7d8f-4958-bc45-bdd843e96b90\") " Jan 22 13:50:09 crc kubenswrapper[4743]: I0122 13:50:09.015613 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf9cdaec-7d8f-4958-bc45-bdd843e96b90-serving-cert\") pod \"cf9cdaec-7d8f-4958-bc45-bdd843e96b90\" (UID: \"cf9cdaec-7d8f-4958-bc45-bdd843e96b90\") " Jan 22 13:50:09 crc kubenswrapper[4743]: I0122 13:50:09.015664 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf9cdaec-7d8f-4958-bc45-bdd843e96b90-config\") pod \"cf9cdaec-7d8f-4958-bc45-bdd843e96b90\" (UID: \"cf9cdaec-7d8f-4958-bc45-bdd843e96b90\") " Jan 22 13:50:09 crc kubenswrapper[4743]: I0122 13:50:09.016257 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf9cdaec-7d8f-4958-bc45-bdd843e96b90-client-ca" (OuterVolumeSpecName: "client-ca") pod "cf9cdaec-7d8f-4958-bc45-bdd843e96b90" (UID: "cf9cdaec-7d8f-4958-bc45-bdd843e96b90"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:50:09 crc kubenswrapper[4743]: I0122 13:50:09.016622 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf9cdaec-7d8f-4958-bc45-bdd843e96b90-config" (OuterVolumeSpecName: "config") pod "cf9cdaec-7d8f-4958-bc45-bdd843e96b90" (UID: "cf9cdaec-7d8f-4958-bc45-bdd843e96b90"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:50:09 crc kubenswrapper[4743]: I0122 13:50:09.020430 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf9cdaec-7d8f-4958-bc45-bdd843e96b90-kube-api-access-t8xs5" (OuterVolumeSpecName: "kube-api-access-t8xs5") pod "cf9cdaec-7d8f-4958-bc45-bdd843e96b90" (UID: "cf9cdaec-7d8f-4958-bc45-bdd843e96b90"). InnerVolumeSpecName "kube-api-access-t8xs5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:50:09 crc kubenswrapper[4743]: I0122 13:50:09.020449 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf9cdaec-7d8f-4958-bc45-bdd843e96b90-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "cf9cdaec-7d8f-4958-bc45-bdd843e96b90" (UID: "cf9cdaec-7d8f-4958-bc45-bdd843e96b90"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:50:09 crc kubenswrapper[4743]: I0122 13:50:09.110435 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lzgcl"] Jan 22 13:50:09 crc kubenswrapper[4743]: E0122 13:50:09.112826 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf9cdaec-7d8f-4958-bc45-bdd843e96b90" containerName="route-controller-manager" Jan 22 13:50:09 crc kubenswrapper[4743]: I0122 13:50:09.112847 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf9cdaec-7d8f-4958-bc45-bdd843e96b90" containerName="route-controller-manager" Jan 22 13:50:09 crc kubenswrapper[4743]: I0122 13:50:09.112988 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf9cdaec-7d8f-4958-bc45-bdd843e96b90" containerName="route-controller-manager" Jan 22 13:50:09 crc kubenswrapper[4743]: I0122 13:50:09.113986 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lzgcl" Jan 22 13:50:09 crc kubenswrapper[4743]: I0122 13:50:09.116634 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf9cdaec-7d8f-4958-bc45-bdd843e96b90-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 13:50:09 crc kubenswrapper[4743]: I0122 13:50:09.116660 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf9cdaec-7d8f-4958-bc45-bdd843e96b90-config\") on node \"crc\" DevicePath \"\"" Jan 22 13:50:09 crc kubenswrapper[4743]: I0122 13:50:09.116675 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8xs5\" (UniqueName: \"kubernetes.io/projected/cf9cdaec-7d8f-4958-bc45-bdd843e96b90-kube-api-access-t8xs5\") on node \"crc\" DevicePath \"\"" Jan 22 13:50:09 crc kubenswrapper[4743]: I0122 13:50:09.116688 4743 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cf9cdaec-7d8f-4958-bc45-bdd843e96b90-client-ca\") on node \"crc\" DevicePath \"\"" Jan 22 13:50:09 crc kubenswrapper[4743]: I0122 13:50:09.116981 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 22 13:50:09 crc kubenswrapper[4743]: I0122 13:50:09.117327 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lzgcl"] Jan 22 13:50:09 crc kubenswrapper[4743]: I0122 13:50:09.217949 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/474b968c-8a07-4c97-b0a8-1595e2e91317-catalog-content\") pod \"redhat-marketplace-lzgcl\" (UID: \"474b968c-8a07-4c97-b0a8-1595e2e91317\") " pod="openshift-marketplace/redhat-marketplace-lzgcl" Jan 22 13:50:09 crc kubenswrapper[4743]: I0122 13:50:09.218035 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/474b968c-8a07-4c97-b0a8-1595e2e91317-utilities\") pod \"redhat-marketplace-lzgcl\" (UID: \"474b968c-8a07-4c97-b0a8-1595e2e91317\") " pod="openshift-marketplace/redhat-marketplace-lzgcl" Jan 22 13:50:09 crc kubenswrapper[4743]: I0122 13:50:09.218061 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qt96h\" (UniqueName: \"kubernetes.io/projected/474b968c-8a07-4c97-b0a8-1595e2e91317-kube-api-access-qt96h\") pod \"redhat-marketplace-lzgcl\" (UID: \"474b968c-8a07-4c97-b0a8-1595e2e91317\") " pod="openshift-marketplace/redhat-marketplace-lzgcl" Jan 22 13:50:09 crc kubenswrapper[4743]: I0122 13:50:09.301585 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lmrjc"] Jan 22 13:50:09 crc kubenswrapper[4743]: I0122 13:50:09.304886 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-slrnr" event={"ID":"d37a61b7-0edd-4c4d-8fe5-f1cc4f9ba472","Type":"ContainerStarted","Data":"81d9907b089d9dbfb3da5c4323bf1901a1ce909395722e316859014193192c3b"} Jan 22 13:50:09 crc kubenswrapper[4743]: I0122 13:50:09.305045 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vsg7v" event={"ID":"edde6004-a1f7-4818-b66a-02137d1f3749","Type":"ContainerStarted","Data":"383242d20f39f85881b931588621c7e8ce0396fe06e7a25b68c498b8c4865db8"} Jan 22 13:50:09 crc kubenswrapper[4743]: I0122 13:50:09.305008 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lmrjc" Jan 22 13:50:09 crc kubenswrapper[4743]: I0122 13:50:09.306321 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-d794b4758-vkhv7" Jan 22 13:50:09 crc kubenswrapper[4743]: I0122 13:50:09.306322 4743 generic.go:334] "Generic (PLEG): container finished" podID="cf9cdaec-7d8f-4958-bc45-bdd843e96b90" containerID="3054f2f8a3740ff06bc7194c48bcf1c1c3c2a9988c678acd923a008ac1680463" exitCode=0 Jan 22 13:50:09 crc kubenswrapper[4743]: I0122 13:50:09.306347 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-d794b4758-vkhv7" event={"ID":"cf9cdaec-7d8f-4958-bc45-bdd843e96b90","Type":"ContainerDied","Data":"3054f2f8a3740ff06bc7194c48bcf1c1c3c2a9988c678acd923a008ac1680463"} Jan 22 13:50:09 crc kubenswrapper[4743]: I0122 13:50:09.306547 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-d794b4758-vkhv7" event={"ID":"cf9cdaec-7d8f-4958-bc45-bdd843e96b90","Type":"ContainerDied","Data":"f98c446ab971fbc5193c9dc4e70d8e7f0e000694b89f408f6aeecf8e15e55ed1"} Jan 22 13:50:09 crc kubenswrapper[4743]: I0122 13:50:09.306567 4743 scope.go:117] "RemoveContainer" containerID="3054f2f8a3740ff06bc7194c48bcf1c1c3c2a9988c678acd923a008ac1680463" Jan 22 13:50:09 crc kubenswrapper[4743]: I0122 13:50:09.307609 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 22 13:50:09 crc kubenswrapper[4743]: I0122 13:50:09.315159 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lmrjc"] Jan 22 13:50:09 crc kubenswrapper[4743]: I0122 13:50:09.320019 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/49986793-49f7-49ae-bcb8-699016d9d894-utilities\") pod \"redhat-operators-lmrjc\" (UID: \"49986793-49f7-49ae-bcb8-699016d9d894\") " pod="openshift-marketplace/redhat-operators-lmrjc" Jan 22 13:50:09 crc kubenswrapper[4743]: I0122 13:50:09.320062 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49986793-49f7-49ae-bcb8-699016d9d894-catalog-content\") pod \"redhat-operators-lmrjc\" (UID: \"49986793-49f7-49ae-bcb8-699016d9d894\") " pod="openshift-marketplace/redhat-operators-lmrjc" Jan 22 13:50:09 crc kubenswrapper[4743]: I0122 13:50:09.320149 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/474b968c-8a07-4c97-b0a8-1595e2e91317-catalog-content\") pod \"redhat-marketplace-lzgcl\" (UID: \"474b968c-8a07-4c97-b0a8-1595e2e91317\") " pod="openshift-marketplace/redhat-marketplace-lzgcl" Jan 22 13:50:09 crc kubenswrapper[4743]: I0122 13:50:09.320218 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hshmh\" (UniqueName: \"kubernetes.io/projected/49986793-49f7-49ae-bcb8-699016d9d894-kube-api-access-hshmh\") pod \"redhat-operators-lmrjc\" (UID: \"49986793-49f7-49ae-bcb8-699016d9d894\") " pod="openshift-marketplace/redhat-operators-lmrjc" Jan 22 13:50:09 crc kubenswrapper[4743]: I0122 13:50:09.320241 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/474b968c-8a07-4c97-b0a8-1595e2e91317-utilities\") pod \"redhat-marketplace-lzgcl\" (UID: \"474b968c-8a07-4c97-b0a8-1595e2e91317\") " pod="openshift-marketplace/redhat-marketplace-lzgcl" Jan 22 13:50:09 crc kubenswrapper[4743]: I0122 13:50:09.320256 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qt96h\" (UniqueName: \"kubernetes.io/projected/474b968c-8a07-4c97-b0a8-1595e2e91317-kube-api-access-qt96h\") pod \"redhat-marketplace-lzgcl\" (UID: \"474b968c-8a07-4c97-b0a8-1595e2e91317\") " pod="openshift-marketplace/redhat-marketplace-lzgcl" Jan 22 13:50:09 crc kubenswrapper[4743]: I0122 13:50:09.321450 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/474b968c-8a07-4c97-b0a8-1595e2e91317-utilities\") pod \"redhat-marketplace-lzgcl\" (UID: \"474b968c-8a07-4c97-b0a8-1595e2e91317\") " pod="openshift-marketplace/redhat-marketplace-lzgcl" Jan 22 13:50:09 crc kubenswrapper[4743]: I0122 13:50:09.321701 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/474b968c-8a07-4c97-b0a8-1595e2e91317-catalog-content\") pod \"redhat-marketplace-lzgcl\" (UID: \"474b968c-8a07-4c97-b0a8-1595e2e91317\") " pod="openshift-marketplace/redhat-marketplace-lzgcl" Jan 22 13:50:09 crc kubenswrapper[4743]: I0122 13:50:09.323332 4743 scope.go:117] "RemoveContainer" containerID="3054f2f8a3740ff06bc7194c48bcf1c1c3c2a9988c678acd923a008ac1680463" Jan 22 13:50:09 crc kubenswrapper[4743]: E0122 13:50:09.325059 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3054f2f8a3740ff06bc7194c48bcf1c1c3c2a9988c678acd923a008ac1680463\": container with ID starting with 3054f2f8a3740ff06bc7194c48bcf1c1c3c2a9988c678acd923a008ac1680463 not found: ID does not exist" 
containerID="3054f2f8a3740ff06bc7194c48bcf1c1c3c2a9988c678acd923a008ac1680463" Jan 22 13:50:09 crc kubenswrapper[4743]: I0122 13:50:09.325106 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3054f2f8a3740ff06bc7194c48bcf1c1c3c2a9988c678acd923a008ac1680463"} err="failed to get container status \"3054f2f8a3740ff06bc7194c48bcf1c1c3c2a9988c678acd923a008ac1680463\": rpc error: code = NotFound desc = could not find container \"3054f2f8a3740ff06bc7194c48bcf1c1c3c2a9988c678acd923a008ac1680463\": container with ID starting with 3054f2f8a3740ff06bc7194c48bcf1c1c3c2a9988c678acd923a008ac1680463 not found: ID does not exist" Jan 22 13:50:09 crc kubenswrapper[4743]: I0122 13:50:09.340483 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qt96h\" (UniqueName: \"kubernetes.io/projected/474b968c-8a07-4c97-b0a8-1595e2e91317-kube-api-access-qt96h\") pod \"redhat-marketplace-lzgcl\" (UID: \"474b968c-8a07-4c97-b0a8-1595e2e91317\") " pod="openshift-marketplace/redhat-marketplace-lzgcl" Jan 22 13:50:09 crc kubenswrapper[4743]: I0122 13:50:09.416030 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d794b4758-vkhv7"] Jan 22 13:50:09 crc kubenswrapper[4743]: I0122 13:50:09.419385 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d794b4758-vkhv7"] Jan 22 13:50:09 crc kubenswrapper[4743]: I0122 13:50:09.421633 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hshmh\" (UniqueName: \"kubernetes.io/projected/49986793-49f7-49ae-bcb8-699016d9d894-kube-api-access-hshmh\") pod \"redhat-operators-lmrjc\" (UID: \"49986793-49f7-49ae-bcb8-699016d9d894\") " pod="openshift-marketplace/redhat-operators-lmrjc" Jan 22 13:50:09 crc kubenswrapper[4743]: I0122 13:50:09.421701 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49986793-49f7-49ae-bcb8-699016d9d894-catalog-content\") pod \"redhat-operators-lmrjc\" (UID: \"49986793-49f7-49ae-bcb8-699016d9d894\") " pod="openshift-marketplace/redhat-operators-lmrjc" Jan 22 13:50:09 crc kubenswrapper[4743]: I0122 13:50:09.421728 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49986793-49f7-49ae-bcb8-699016d9d894-utilities\") pod \"redhat-operators-lmrjc\" (UID: \"49986793-49f7-49ae-bcb8-699016d9d894\") " pod="openshift-marketplace/redhat-operators-lmrjc" Jan 22 13:50:09 crc kubenswrapper[4743]: I0122 13:50:09.422331 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49986793-49f7-49ae-bcb8-699016d9d894-utilities\") pod \"redhat-operators-lmrjc\" (UID: \"49986793-49f7-49ae-bcb8-699016d9d894\") " pod="openshift-marketplace/redhat-operators-lmrjc" Jan 22 13:50:09 crc kubenswrapper[4743]: I0122 13:50:09.422362 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49986793-49f7-49ae-bcb8-699016d9d894-catalog-content\") pod \"redhat-operators-lmrjc\" (UID: \"49986793-49f7-49ae-bcb8-699016d9d894\") " pod="openshift-marketplace/redhat-operators-lmrjc" Jan 22 13:50:09 crc kubenswrapper[4743]: I0122 13:50:09.439410 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-hshmh\" (UniqueName: \"kubernetes.io/projected/49986793-49f7-49ae-bcb8-699016d9d894-kube-api-access-hshmh\") pod \"redhat-operators-lmrjc\" (UID: \"49986793-49f7-49ae-bcb8-699016d9d894\") " pod="openshift-marketplace/redhat-operators-lmrjc" Jan 22 13:50:09 crc kubenswrapper[4743]: I0122 13:50:09.475487 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lzgcl" Jan 22 13:50:09 crc kubenswrapper[4743]: I0122 13:50:09.704828 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lmrjc" Jan 22 13:50:09 crc kubenswrapper[4743]: I0122 13:50:09.754003 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf9cdaec-7d8f-4958-bc45-bdd843e96b90" path="/var/lib/kubelet/pods/cf9cdaec-7d8f-4958-bc45-bdd843e96b90/volumes" Jan 22 13:50:09 crc kubenswrapper[4743]: I0122 13:50:09.852983 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lzgcl"] Jan 22 13:50:09 crc kubenswrapper[4743]: W0122 13:50:09.861597 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod474b968c_8a07_4c97_b0a8_1595e2e91317.slice/crio-dfbf69f8834bdb6d09ec7ed10777f67c6ccb707e5f43789aacc2c0726e0cc3c7 WatchSource:0}: Error finding container dfbf69f8834bdb6d09ec7ed10777f67c6ccb707e5f43789aacc2c0726e0cc3c7: Status 404 returned error can't find the container with id dfbf69f8834bdb6d09ec7ed10777f67c6ccb707e5f43789aacc2c0726e0cc3c7 Jan 22 13:50:10 crc kubenswrapper[4743]: I0122 13:50:10.129752 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lmrjc"] Jan 22 13:50:10 crc kubenswrapper[4743]: I0122 13:50:10.328522 4743 generic.go:334] "Generic (PLEG): container finished" podID="edde6004-a1f7-4818-b66a-02137d1f3749" containerID="383242d20f39f85881b931588621c7e8ce0396fe06e7a25b68c498b8c4865db8" exitCode=0 Jan 22 13:50:10 crc kubenswrapper[4743]: I0122 13:50:10.328573 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vsg7v" event={"ID":"edde6004-a1f7-4818-b66a-02137d1f3749","Type":"ContainerDied","Data":"383242d20f39f85881b931588621c7e8ce0396fe06e7a25b68c498b8c4865db8"} Jan 22 13:50:10 crc kubenswrapper[4743]: I0122 13:50:10.334201 4743 generic.go:334] "Generic (PLEG): container finished" podID="474b968c-8a07-4c97-b0a8-1595e2e91317" containerID="d07a1b126d3736e8562add167d156c5f9cfbda4a6076a6d07f106f8e5de97ef5" exitCode=0 Jan 22 13:50:10 crc kubenswrapper[4743]: I0122 13:50:10.334279 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lzgcl" event={"ID":"474b968c-8a07-4c97-b0a8-1595e2e91317","Type":"ContainerDied","Data":"d07a1b126d3736e8562add167d156c5f9cfbda4a6076a6d07f106f8e5de97ef5"} Jan 22 13:50:10 crc kubenswrapper[4743]: I0122 13:50:10.334304 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lzgcl" event={"ID":"474b968c-8a07-4c97-b0a8-1595e2e91317","Type":"ContainerStarted","Data":"dfbf69f8834bdb6d09ec7ed10777f67c6ccb707e5f43789aacc2c0726e0cc3c7"} Jan 22 13:50:10 crc kubenswrapper[4743]: I0122 13:50:10.338216 4743 generic.go:334] "Generic (PLEG): container finished" podID="49986793-49f7-49ae-bcb8-699016d9d894" containerID="06d06aa3f8451900f2ed344635960e8533442e33a095981b2213066c2489cfcb" exitCode=0 Jan 22 13:50:10 crc kubenswrapper[4743]: I0122 
13:50:10.338296 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lmrjc" event={"ID":"49986793-49f7-49ae-bcb8-699016d9d894","Type":"ContainerDied","Data":"06d06aa3f8451900f2ed344635960e8533442e33a095981b2213066c2489cfcb"} Jan 22 13:50:10 crc kubenswrapper[4743]: I0122 13:50:10.338319 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lmrjc" event={"ID":"49986793-49f7-49ae-bcb8-699016d9d894","Type":"ContainerStarted","Data":"e0ea5f429e06f7dff1bac61c71b0f26717c38fe9e4c5a6358251ce7bd299e836"} Jan 22 13:50:10 crc kubenswrapper[4743]: I0122 13:50:10.347699 4743 generic.go:334] "Generic (PLEG): container finished" podID="d37a61b7-0edd-4c4d-8fe5-f1cc4f9ba472" containerID="81d9907b089d9dbfb3da5c4323bf1901a1ce909395722e316859014193192c3b" exitCode=0 Jan 22 13:50:10 crc kubenswrapper[4743]: I0122 13:50:10.347744 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-slrnr" event={"ID":"d37a61b7-0edd-4c4d-8fe5-f1cc4f9ba472","Type":"ContainerDied","Data":"81d9907b089d9dbfb3da5c4323bf1901a1ce909395722e316859014193192c3b"} Jan 22 13:50:10 crc kubenswrapper[4743]: I0122 13:50:10.510888 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7779f9c67b-pm8cr"] Jan 22 13:50:10 crc kubenswrapper[4743]: I0122 13:50:10.511606 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7779f9c67b-pm8cr" Jan 22 13:50:10 crc kubenswrapper[4743]: I0122 13:50:10.514005 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 22 13:50:10 crc kubenswrapper[4743]: I0122 13:50:10.514394 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 22 13:50:10 crc kubenswrapper[4743]: I0122 13:50:10.514603 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 22 13:50:10 crc kubenswrapper[4743]: I0122 13:50:10.514756 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 22 13:50:10 crc kubenswrapper[4743]: I0122 13:50:10.515642 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 22 13:50:10 crc kubenswrapper[4743]: I0122 13:50:10.515907 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 22 13:50:10 crc kubenswrapper[4743]: I0122 13:50:10.528609 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7779f9c67b-pm8cr"] Jan 22 13:50:10 crc kubenswrapper[4743]: I0122 13:50:10.637994 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e50eb8f2-676f-4ddf-a504-4a7630c2b9d1-config\") pod \"route-controller-manager-7779f9c67b-pm8cr\" (UID: \"e50eb8f2-676f-4ddf-a504-4a7630c2b9d1\") " pod="openshift-route-controller-manager/route-controller-manager-7779f9c67b-pm8cr" Jan 22 13:50:10 crc kubenswrapper[4743]: I0122 13:50:10.638224 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-l7kjd\" (UniqueName: \"kubernetes.io/projected/e50eb8f2-676f-4ddf-a504-4a7630c2b9d1-kube-api-access-l7kjd\") pod \"route-controller-manager-7779f9c67b-pm8cr\" (UID: \"e50eb8f2-676f-4ddf-a504-4a7630c2b9d1\") " pod="openshift-route-controller-manager/route-controller-manager-7779f9c67b-pm8cr" Jan 22 13:50:10 crc kubenswrapper[4743]: I0122 13:50:10.638304 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e50eb8f2-676f-4ddf-a504-4a7630c2b9d1-serving-cert\") pod \"route-controller-manager-7779f9c67b-pm8cr\" (UID: \"e50eb8f2-676f-4ddf-a504-4a7630c2b9d1\") " pod="openshift-route-controller-manager/route-controller-manager-7779f9c67b-pm8cr" Jan 22 13:50:10 crc kubenswrapper[4743]: I0122 13:50:10.638395 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e50eb8f2-676f-4ddf-a504-4a7630c2b9d1-client-ca\") pod \"route-controller-manager-7779f9c67b-pm8cr\" (UID: \"e50eb8f2-676f-4ddf-a504-4a7630c2b9d1\") " pod="openshift-route-controller-manager/route-controller-manager-7779f9c67b-pm8cr" Jan 22 13:50:10 crc kubenswrapper[4743]: I0122 13:50:10.739281 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e50eb8f2-676f-4ddf-a504-4a7630c2b9d1-config\") pod \"route-controller-manager-7779f9c67b-pm8cr\" (UID: \"e50eb8f2-676f-4ddf-a504-4a7630c2b9d1\") " pod="openshift-route-controller-manager/route-controller-manager-7779f9c67b-pm8cr" Jan 22 13:50:10 crc kubenswrapper[4743]: I0122 13:50:10.739371 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7kjd\" (UniqueName: \"kubernetes.io/projected/e50eb8f2-676f-4ddf-a504-4a7630c2b9d1-kube-api-access-l7kjd\") pod \"route-controller-manager-7779f9c67b-pm8cr\" (UID: \"e50eb8f2-676f-4ddf-a504-4a7630c2b9d1\") " pod="openshift-route-controller-manager/route-controller-manager-7779f9c67b-pm8cr" Jan 22 13:50:10 crc kubenswrapper[4743]: I0122 13:50:10.739399 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e50eb8f2-676f-4ddf-a504-4a7630c2b9d1-serving-cert\") pod \"route-controller-manager-7779f9c67b-pm8cr\" (UID: \"e50eb8f2-676f-4ddf-a504-4a7630c2b9d1\") " pod="openshift-route-controller-manager/route-controller-manager-7779f9c67b-pm8cr" Jan 22 13:50:10 crc kubenswrapper[4743]: I0122 13:50:10.739431 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e50eb8f2-676f-4ddf-a504-4a7630c2b9d1-client-ca\") pod \"route-controller-manager-7779f9c67b-pm8cr\" (UID: \"e50eb8f2-676f-4ddf-a504-4a7630c2b9d1\") " pod="openshift-route-controller-manager/route-controller-manager-7779f9c67b-pm8cr" Jan 22 13:50:10 crc kubenswrapper[4743]: I0122 13:50:10.740505 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e50eb8f2-676f-4ddf-a504-4a7630c2b9d1-client-ca\") pod \"route-controller-manager-7779f9c67b-pm8cr\" (UID: \"e50eb8f2-676f-4ddf-a504-4a7630c2b9d1\") " pod="openshift-route-controller-manager/route-controller-manager-7779f9c67b-pm8cr" Jan 22 13:50:10 crc kubenswrapper[4743]: I0122 13:50:10.740807 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e50eb8f2-676f-4ddf-a504-4a7630c2b9d1-config\") pod \"route-controller-manager-7779f9c67b-pm8cr\" (UID: \"e50eb8f2-676f-4ddf-a504-4a7630c2b9d1\") " pod="openshift-route-controller-manager/route-controller-manager-7779f9c67b-pm8cr" Jan 22 13:50:10 crc kubenswrapper[4743]: I0122 13:50:10.745001 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e50eb8f2-676f-4ddf-a504-4a7630c2b9d1-serving-cert\") pod \"route-controller-manager-7779f9c67b-pm8cr\" (UID: \"e50eb8f2-676f-4ddf-a504-4a7630c2b9d1\") " pod="openshift-route-controller-manager/route-controller-manager-7779f9c67b-pm8cr" Jan 22 13:50:10 crc kubenswrapper[4743]: I0122 13:50:10.758645 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7kjd\" (UniqueName: \"kubernetes.io/projected/e50eb8f2-676f-4ddf-a504-4a7630c2b9d1-kube-api-access-l7kjd\") pod \"route-controller-manager-7779f9c67b-pm8cr\" (UID: \"e50eb8f2-676f-4ddf-a504-4a7630c2b9d1\") " pod="openshift-route-controller-manager/route-controller-manager-7779f9c67b-pm8cr" Jan 22 13:50:10 crc kubenswrapper[4743]: I0122 13:50:10.828516 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7779f9c67b-pm8cr" Jan 22 13:50:11 crc kubenswrapper[4743]: I0122 13:50:11.248426 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7779f9c67b-pm8cr"] Jan 22 13:50:11 crc kubenswrapper[4743]: W0122 13:50:11.305758 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode50eb8f2_676f_4ddf_a504_4a7630c2b9d1.slice/crio-9c1b78aa0fb5b10c325328a511263a20f2114aa1d7adba237e976543d9f780dc WatchSource:0}: Error finding container 9c1b78aa0fb5b10c325328a511263a20f2114aa1d7adba237e976543d9f780dc: Status 404 returned error can't find the container with id 9c1b78aa0fb5b10c325328a511263a20f2114aa1d7adba237e976543d9f780dc Jan 22 13:50:11 crc kubenswrapper[4743]: I0122 13:50:11.353561 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-slrnr" event={"ID":"d37a61b7-0edd-4c4d-8fe5-f1cc4f9ba472","Type":"ContainerStarted","Data":"502fac0c16be9ac7a4cd4a7bf6fcbc6fe5086dfe8efca0c5798e5f9c63caaaea"} Jan 22 13:50:11 crc kubenswrapper[4743]: I0122 13:50:11.355123 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7779f9c67b-pm8cr" event={"ID":"e50eb8f2-676f-4ddf-a504-4a7630c2b9d1","Type":"ContainerStarted","Data":"9c1b78aa0fb5b10c325328a511263a20f2114aa1d7adba237e976543d9f780dc"} Jan 22 13:50:11 crc kubenswrapper[4743]: I0122 13:50:11.357482 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vsg7v" event={"ID":"edde6004-a1f7-4818-b66a-02137d1f3749","Type":"ContainerStarted","Data":"0ae01179e39d1f384a25ea70a71caec242122b687a0fdb0c11c1ffdcc88b203c"} Jan 22 13:50:11 crc kubenswrapper[4743]: I0122 13:50:11.359170 4743 generic.go:334] "Generic (PLEG): container finished" podID="474b968c-8a07-4c97-b0a8-1595e2e91317" containerID="161553f61331f9055e3cb23bc9cf8357a22fcba0abf776124c1e46bdcdc97db8" exitCode=0 Jan 22 13:50:11 crc kubenswrapper[4743]: I0122 13:50:11.359192 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lzgcl" 
event={"ID":"474b968c-8a07-4c97-b0a8-1595e2e91317","Type":"ContainerDied","Data":"161553f61331f9055e3cb23bc9cf8357a22fcba0abf776124c1e46bdcdc97db8"} Jan 22 13:50:11 crc kubenswrapper[4743]: I0122 13:50:11.368931 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-slrnr" podStartSLOduration=2.856255877 podStartE2EDuration="5.368907211s" podCreationTimestamp="2026-01-22 13:50:06 +0000 UTC" firstStartedPulling="2026-01-22 13:50:08.296295244 +0000 UTC m=+244.851338407" lastFinishedPulling="2026-01-22 13:50:10.808946568 +0000 UTC m=+247.363989741" observedRunningTime="2026-01-22 13:50:11.36849959 +0000 UTC m=+247.923542753" watchObservedRunningTime="2026-01-22 13:50:11.368907211 +0000 UTC m=+247.923950374" Jan 22 13:50:11 crc kubenswrapper[4743]: I0122 13:50:11.402464 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vsg7v" podStartSLOduration=2.931235531 podStartE2EDuration="5.402446994s" podCreationTimestamp="2026-01-22 13:50:06 +0000 UTC" firstStartedPulling="2026-01-22 13:50:08.295016809 +0000 UTC m=+244.850059972" lastFinishedPulling="2026-01-22 13:50:10.766228272 +0000 UTC m=+247.321271435" observedRunningTime="2026-01-22 13:50:11.398906057 +0000 UTC m=+247.953949220" watchObservedRunningTime="2026-01-22 13:50:11.402446994 +0000 UTC m=+247.957490157" Jan 22 13:50:12 crc kubenswrapper[4743]: I0122 13:50:12.366305 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lzgcl" event={"ID":"474b968c-8a07-4c97-b0a8-1595e2e91317","Type":"ContainerStarted","Data":"7f24ed469e1908bda29f060c435a794e7a5f7bc254a09da53594087c17efec20"} Jan 22 13:50:12 crc kubenswrapper[4743]: I0122 13:50:12.367765 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lmrjc" event={"ID":"49986793-49f7-49ae-bcb8-699016d9d894","Type":"ContainerStarted","Data":"5a46c0d6fb5ec2e59176af82b54777750fb858a2b76b1589bbe90169d4ba344b"} Jan 22 13:50:12 crc kubenswrapper[4743]: I0122 13:50:12.369216 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7779f9c67b-pm8cr" event={"ID":"e50eb8f2-676f-4ddf-a504-4a7630c2b9d1","Type":"ContainerStarted","Data":"1e3c6e75186c3bdc08fba4aa0f2abb4aa262a724265f6318dd90cd25474d0ffe"} Jan 22 13:50:12 crc kubenswrapper[4743]: I0122 13:50:12.369589 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7779f9c67b-pm8cr" Jan 22 13:50:12 crc kubenswrapper[4743]: I0122 13:50:12.373814 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7779f9c67b-pm8cr" Jan 22 13:50:12 crc kubenswrapper[4743]: I0122 13:50:12.391378 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lzgcl" podStartSLOduration=1.637340783 podStartE2EDuration="3.391358505s" podCreationTimestamp="2026-01-22 13:50:09 +0000 UTC" firstStartedPulling="2026-01-22 13:50:10.335579177 +0000 UTC m=+246.890622340" lastFinishedPulling="2026-01-22 13:50:12.089596899 +0000 UTC m=+248.644640062" observedRunningTime="2026-01-22 13:50:12.387564131 +0000 UTC m=+248.942607294" watchObservedRunningTime="2026-01-22 13:50:12.391358505 +0000 UTC m=+248.946401668" Jan 22 13:50:12 crc kubenswrapper[4743]: I0122 13:50:12.408925 4743 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7779f9c67b-pm8cr" podStartSLOduration=4.408907749 podStartE2EDuration="4.408907749s" podCreationTimestamp="2026-01-22 13:50:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:50:12.408708043 +0000 UTC m=+248.963751206" watchObservedRunningTime="2026-01-22 13:50:12.408907749 +0000 UTC m=+248.963950912" Jan 22 13:50:13 crc kubenswrapper[4743]: I0122 13:50:13.375173 4743 generic.go:334] "Generic (PLEG): container finished" podID="49986793-49f7-49ae-bcb8-699016d9d894" containerID="5a46c0d6fb5ec2e59176af82b54777750fb858a2b76b1589bbe90169d4ba344b" exitCode=0 Jan 22 13:50:13 crc kubenswrapper[4743]: I0122 13:50:13.375223 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lmrjc" event={"ID":"49986793-49f7-49ae-bcb8-699016d9d894","Type":"ContainerDied","Data":"5a46c0d6fb5ec2e59176af82b54777750fb858a2b76b1589bbe90169d4ba344b"} Jan 22 13:50:14 crc kubenswrapper[4743]: I0122 13:50:14.381823 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lmrjc" event={"ID":"49986793-49f7-49ae-bcb8-699016d9d894","Type":"ContainerStarted","Data":"15a721e85bf680ef2cf22d25b3d5ca135d8c6c7d5a3b66642bfe82e22a5bda3c"} Jan 22 13:50:14 crc kubenswrapper[4743]: I0122 13:50:14.400536 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lmrjc" podStartSLOduration=1.909609338 podStartE2EDuration="5.400518229s" podCreationTimestamp="2026-01-22 13:50:09 +0000 UTC" firstStartedPulling="2026-01-22 13:50:10.3516638 +0000 UTC m=+246.906706963" lastFinishedPulling="2026-01-22 13:50:13.842572691 +0000 UTC m=+250.397615854" observedRunningTime="2026-01-22 13:50:14.398043471 +0000 UTC m=+250.953086624" watchObservedRunningTime="2026-01-22 13:50:14.400518229 +0000 UTC m=+250.955561392" Jan 22 13:50:17 crc kubenswrapper[4743]: I0122 13:50:17.021513 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vsg7v" Jan 22 13:50:17 crc kubenswrapper[4743]: I0122 13:50:17.022090 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vsg7v" Jan 22 13:50:17 crc kubenswrapper[4743]: I0122 13:50:17.068941 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vsg7v" Jan 22 13:50:17 crc kubenswrapper[4743]: I0122 13:50:17.224129 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-slrnr" Jan 22 13:50:17 crc kubenswrapper[4743]: I0122 13:50:17.226045 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-slrnr" Jan 22 13:50:17 crc kubenswrapper[4743]: I0122 13:50:17.263107 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-slrnr" Jan 22 13:50:17 crc kubenswrapper[4743]: I0122 13:50:17.431712 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vsg7v" Jan 22 13:50:17 crc kubenswrapper[4743]: I0122 13:50:17.432559 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-slrnr" Jan 22 13:50:19 crc kubenswrapper[4743]: I0122 13:50:19.476359 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lzgcl" Jan 22 13:50:19 crc kubenswrapper[4743]: I0122 13:50:19.477663 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lzgcl" Jan 22 13:50:19 crc kubenswrapper[4743]: I0122 13:50:19.518488 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lzgcl" Jan 22 13:50:19 crc kubenswrapper[4743]: I0122 13:50:19.705652 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lmrjc" Jan 22 13:50:19 crc kubenswrapper[4743]: I0122 13:50:19.705973 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lmrjc" Jan 22 13:50:19 crc kubenswrapper[4743]: I0122 13:50:19.743836 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lmrjc" Jan 22 13:50:20 crc kubenswrapper[4743]: I0122 13:50:20.456002 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lmrjc" Jan 22 13:50:20 crc kubenswrapper[4743]: I0122 13:50:20.456388 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lzgcl" Jan 22 13:50:48 crc kubenswrapper[4743]: I0122 13:50:48.530354 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-dfd546d69-mj272"] Jan 22 13:50:48 crc kubenswrapper[4743]: I0122 13:50:48.531082 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-dfd546d69-mj272" podUID="e2055418-5648-45f2-9b7a-90977d621c4f" containerName="controller-manager" containerID="cri-o://0fe89211efa2a4dbd1dae01aa0a51f11b257776e628f56e96d6bc8722aae5b0f" gracePeriod=30 Jan 22 13:50:48 crc kubenswrapper[4743]: I0122 13:50:48.907029 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-dfd546d69-mj272" Jan 22 13:50:49 crc kubenswrapper[4743]: I0122 13:50:49.008439 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2k6hj\" (UniqueName: \"kubernetes.io/projected/e2055418-5648-45f2-9b7a-90977d621c4f-kube-api-access-2k6hj\") pod \"e2055418-5648-45f2-9b7a-90977d621c4f\" (UID: \"e2055418-5648-45f2-9b7a-90977d621c4f\") " Jan 22 13:50:49 crc kubenswrapper[4743]: I0122 13:50:49.008514 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e2055418-5648-45f2-9b7a-90977d621c4f-serving-cert\") pod \"e2055418-5648-45f2-9b7a-90977d621c4f\" (UID: \"e2055418-5648-45f2-9b7a-90977d621c4f\") " Jan 22 13:50:49 crc kubenswrapper[4743]: I0122 13:50:49.008545 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e2055418-5648-45f2-9b7a-90977d621c4f-proxy-ca-bundles\") pod \"e2055418-5648-45f2-9b7a-90977d621c4f\" (UID: \"e2055418-5648-45f2-9b7a-90977d621c4f\") " Jan 22 13:50:49 crc kubenswrapper[4743]: I0122 13:50:49.008580 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e2055418-5648-45f2-9b7a-90977d621c4f-client-ca\") pod \"e2055418-5648-45f2-9b7a-90977d621c4f\" (UID: \"e2055418-5648-45f2-9b7a-90977d621c4f\") " Jan 22 13:50:49 crc kubenswrapper[4743]: I0122 13:50:49.008692 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2055418-5648-45f2-9b7a-90977d621c4f-config\") pod \"e2055418-5648-45f2-9b7a-90977d621c4f\" (UID: \"e2055418-5648-45f2-9b7a-90977d621c4f\") " Jan 22 13:50:49 crc kubenswrapper[4743]: I0122 13:50:49.009841 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2055418-5648-45f2-9b7a-90977d621c4f-client-ca" (OuterVolumeSpecName: "client-ca") pod "e2055418-5648-45f2-9b7a-90977d621c4f" (UID: "e2055418-5648-45f2-9b7a-90977d621c4f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:50:49 crc kubenswrapper[4743]: I0122 13:50:49.009856 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2055418-5648-45f2-9b7a-90977d621c4f-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "e2055418-5648-45f2-9b7a-90977d621c4f" (UID: "e2055418-5648-45f2-9b7a-90977d621c4f"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:50:49 crc kubenswrapper[4743]: I0122 13:50:49.009919 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2055418-5648-45f2-9b7a-90977d621c4f-config" (OuterVolumeSpecName: "config") pod "e2055418-5648-45f2-9b7a-90977d621c4f" (UID: "e2055418-5648-45f2-9b7a-90977d621c4f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:50:49 crc kubenswrapper[4743]: I0122 13:50:49.013977 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2055418-5648-45f2-9b7a-90977d621c4f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e2055418-5648-45f2-9b7a-90977d621c4f" (UID: "e2055418-5648-45f2-9b7a-90977d621c4f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:50:49 crc kubenswrapper[4743]: I0122 13:50:49.014264 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2055418-5648-45f2-9b7a-90977d621c4f-kube-api-access-2k6hj" (OuterVolumeSpecName: "kube-api-access-2k6hj") pod "e2055418-5648-45f2-9b7a-90977d621c4f" (UID: "e2055418-5648-45f2-9b7a-90977d621c4f"). InnerVolumeSpecName "kube-api-access-2k6hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:50:49 crc kubenswrapper[4743]: I0122 13:50:49.110068 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2055418-5648-45f2-9b7a-90977d621c4f-config\") on node \"crc\" DevicePath \"\"" Jan 22 13:50:49 crc kubenswrapper[4743]: I0122 13:50:49.110114 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2k6hj\" (UniqueName: \"kubernetes.io/projected/e2055418-5648-45f2-9b7a-90977d621c4f-kube-api-access-2k6hj\") on node \"crc\" DevicePath \"\"" Jan 22 13:50:49 crc kubenswrapper[4743]: I0122 13:50:49.110127 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e2055418-5648-45f2-9b7a-90977d621c4f-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 13:50:49 crc kubenswrapper[4743]: I0122 13:50:49.110136 4743 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e2055418-5648-45f2-9b7a-90977d621c4f-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 22 13:50:49 crc kubenswrapper[4743]: I0122 13:50:49.110144 4743 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e2055418-5648-45f2-9b7a-90977d621c4f-client-ca\") on node \"crc\" DevicePath \"\"" Jan 22 13:50:49 crc kubenswrapper[4743]: I0122 13:50:49.559756 4743 generic.go:334] "Generic (PLEG): container finished" podID="e2055418-5648-45f2-9b7a-90977d621c4f" containerID="0fe89211efa2a4dbd1dae01aa0a51f11b257776e628f56e96d6bc8722aae5b0f" exitCode=0 Jan 22 13:50:49 crc kubenswrapper[4743]: I0122 13:50:49.559825 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-dfd546d69-mj272" event={"ID":"e2055418-5648-45f2-9b7a-90977d621c4f","Type":"ContainerDied","Data":"0fe89211efa2a4dbd1dae01aa0a51f11b257776e628f56e96d6bc8722aae5b0f"} Jan 22 13:50:49 crc kubenswrapper[4743]: I0122 13:50:49.559855 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-dfd546d69-mj272" event={"ID":"e2055418-5648-45f2-9b7a-90977d621c4f","Type":"ContainerDied","Data":"583478344b0e263ce4b7c3336634c379eee28f62f7a108e3c98509dadea90bf0"} Jan 22 13:50:49 crc kubenswrapper[4743]: I0122 13:50:49.559874 4743 scope.go:117] "RemoveContainer" containerID="0fe89211efa2a4dbd1dae01aa0a51f11b257776e628f56e96d6bc8722aae5b0f" Jan 22 13:50:49 crc kubenswrapper[4743]: I0122 13:50:49.559879 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-dfd546d69-mj272" Jan 22 13:50:49 crc kubenswrapper[4743]: I0122 13:50:49.582965 4743 scope.go:117] "RemoveContainer" containerID="0fe89211efa2a4dbd1dae01aa0a51f11b257776e628f56e96d6bc8722aae5b0f" Jan 22 13:50:49 crc kubenswrapper[4743]: E0122 13:50:49.583906 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0fe89211efa2a4dbd1dae01aa0a51f11b257776e628f56e96d6bc8722aae5b0f\": container with ID starting with 0fe89211efa2a4dbd1dae01aa0a51f11b257776e628f56e96d6bc8722aae5b0f not found: ID does not exist" containerID="0fe89211efa2a4dbd1dae01aa0a51f11b257776e628f56e96d6bc8722aae5b0f" Jan 22 13:50:49 crc kubenswrapper[4743]: I0122 13:50:49.583952 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fe89211efa2a4dbd1dae01aa0a51f11b257776e628f56e96d6bc8722aae5b0f"} err="failed to get container status \"0fe89211efa2a4dbd1dae01aa0a51f11b257776e628f56e96d6bc8722aae5b0f\": rpc error: code = NotFound desc = could not find container \"0fe89211efa2a4dbd1dae01aa0a51f11b257776e628f56e96d6bc8722aae5b0f\": container with ID starting with 0fe89211efa2a4dbd1dae01aa0a51f11b257776e628f56e96d6bc8722aae5b0f not found: ID does not exist" Jan 22 13:50:49 crc kubenswrapper[4743]: I0122 13:50:49.605674 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-dfd546d69-mj272"] Jan 22 13:50:49 crc kubenswrapper[4743]: I0122 13:50:49.611757 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-dfd546d69-mj272"] Jan 22 13:50:49 crc kubenswrapper[4743]: I0122 13:50:49.753663 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2055418-5648-45f2-9b7a-90977d621c4f" path="/var/lib/kubelet/pods/e2055418-5648-45f2-9b7a-90977d621c4f/volumes" Jan 22 13:50:49 crc kubenswrapper[4743]: I0122 13:50:49.821256 4743 patch_prober.go:28] interesting pod/controller-manager-dfd546d69-mj272 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.58:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 22 13:50:49 crc kubenswrapper[4743]: I0122 13:50:49.821362 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-dfd546d69-mj272" podUID="e2055418-5648-45f2-9b7a-90977d621c4f" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.58:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 22 13:50:50 crc kubenswrapper[4743]: I0122 13:50:50.538556 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6955bfdb45-pfbr8"] Jan 22 13:50:50 crc kubenswrapper[4743]: E0122 13:50:50.539180 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2055418-5648-45f2-9b7a-90977d621c4f" containerName="controller-manager" Jan 22 13:50:50 crc kubenswrapper[4743]: I0122 13:50:50.539278 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2055418-5648-45f2-9b7a-90977d621c4f" containerName="controller-manager" Jan 22 13:50:50 crc kubenswrapper[4743]: I0122 13:50:50.539516 4743 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="e2055418-5648-45f2-9b7a-90977d621c4f" containerName="controller-manager" Jan 22 13:50:50 crc kubenswrapper[4743]: I0122 13:50:50.540553 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6955bfdb45-pfbr8" Jan 22 13:50:50 crc kubenswrapper[4743]: I0122 13:50:50.545357 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 22 13:50:50 crc kubenswrapper[4743]: I0122 13:50:50.545529 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 22 13:50:50 crc kubenswrapper[4743]: I0122 13:50:50.545656 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 22 13:50:50 crc kubenswrapper[4743]: I0122 13:50:50.545880 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 22 13:50:50 crc kubenswrapper[4743]: I0122 13:50:50.548467 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 22 13:50:50 crc kubenswrapper[4743]: I0122 13:50:50.549135 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 22 13:50:50 crc kubenswrapper[4743]: I0122 13:50:50.558242 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 22 13:50:50 crc kubenswrapper[4743]: I0122 13:50:50.562881 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6955bfdb45-pfbr8"] Jan 22 13:50:50 crc kubenswrapper[4743]: I0122 13:50:50.625969 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnmc9\" (UniqueName: \"kubernetes.io/projected/fb847aa7-6d1e-4daf-90e1-0b9827cbe099-kube-api-access-lnmc9\") pod \"controller-manager-6955bfdb45-pfbr8\" (UID: \"fb847aa7-6d1e-4daf-90e1-0b9827cbe099\") " pod="openshift-controller-manager/controller-manager-6955bfdb45-pfbr8" Jan 22 13:50:50 crc kubenswrapper[4743]: I0122 13:50:50.626047 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb847aa7-6d1e-4daf-90e1-0b9827cbe099-serving-cert\") pod \"controller-manager-6955bfdb45-pfbr8\" (UID: \"fb847aa7-6d1e-4daf-90e1-0b9827cbe099\") " pod="openshift-controller-manager/controller-manager-6955bfdb45-pfbr8" Jan 22 13:50:50 crc kubenswrapper[4743]: I0122 13:50:50.626161 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fb847aa7-6d1e-4daf-90e1-0b9827cbe099-client-ca\") pod \"controller-manager-6955bfdb45-pfbr8\" (UID: \"fb847aa7-6d1e-4daf-90e1-0b9827cbe099\") " pod="openshift-controller-manager/controller-manager-6955bfdb45-pfbr8" Jan 22 13:50:50 crc kubenswrapper[4743]: I0122 13:50:50.626197 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fb847aa7-6d1e-4daf-90e1-0b9827cbe099-proxy-ca-bundles\") pod \"controller-manager-6955bfdb45-pfbr8\" (UID: \"fb847aa7-6d1e-4daf-90e1-0b9827cbe099\") " pod="openshift-controller-manager/controller-manager-6955bfdb45-pfbr8" Jan 22 
13:50:50 crc kubenswrapper[4743]: I0122 13:50:50.626229 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb847aa7-6d1e-4daf-90e1-0b9827cbe099-config\") pod \"controller-manager-6955bfdb45-pfbr8\" (UID: \"fb847aa7-6d1e-4daf-90e1-0b9827cbe099\") " pod="openshift-controller-manager/controller-manager-6955bfdb45-pfbr8" Jan 22 13:50:50 crc kubenswrapper[4743]: I0122 13:50:50.726716 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb847aa7-6d1e-4daf-90e1-0b9827cbe099-serving-cert\") pod \"controller-manager-6955bfdb45-pfbr8\" (UID: \"fb847aa7-6d1e-4daf-90e1-0b9827cbe099\") " pod="openshift-controller-manager/controller-manager-6955bfdb45-pfbr8" Jan 22 13:50:50 crc kubenswrapper[4743]: I0122 13:50:50.726795 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fb847aa7-6d1e-4daf-90e1-0b9827cbe099-client-ca\") pod \"controller-manager-6955bfdb45-pfbr8\" (UID: \"fb847aa7-6d1e-4daf-90e1-0b9827cbe099\") " pod="openshift-controller-manager/controller-manager-6955bfdb45-pfbr8" Jan 22 13:50:50 crc kubenswrapper[4743]: I0122 13:50:50.726830 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fb847aa7-6d1e-4daf-90e1-0b9827cbe099-proxy-ca-bundles\") pod \"controller-manager-6955bfdb45-pfbr8\" (UID: \"fb847aa7-6d1e-4daf-90e1-0b9827cbe099\") " pod="openshift-controller-manager/controller-manager-6955bfdb45-pfbr8" Jan 22 13:50:50 crc kubenswrapper[4743]: I0122 13:50:50.726857 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb847aa7-6d1e-4daf-90e1-0b9827cbe099-config\") pod \"controller-manager-6955bfdb45-pfbr8\" (UID: \"fb847aa7-6d1e-4daf-90e1-0b9827cbe099\") " pod="openshift-controller-manager/controller-manager-6955bfdb45-pfbr8" Jan 22 13:50:50 crc kubenswrapper[4743]: I0122 13:50:50.726877 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnmc9\" (UniqueName: \"kubernetes.io/projected/fb847aa7-6d1e-4daf-90e1-0b9827cbe099-kube-api-access-lnmc9\") pod \"controller-manager-6955bfdb45-pfbr8\" (UID: \"fb847aa7-6d1e-4daf-90e1-0b9827cbe099\") " pod="openshift-controller-manager/controller-manager-6955bfdb45-pfbr8" Jan 22 13:50:50 crc kubenswrapper[4743]: I0122 13:50:50.728015 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fb847aa7-6d1e-4daf-90e1-0b9827cbe099-client-ca\") pod \"controller-manager-6955bfdb45-pfbr8\" (UID: \"fb847aa7-6d1e-4daf-90e1-0b9827cbe099\") " pod="openshift-controller-manager/controller-manager-6955bfdb45-pfbr8" Jan 22 13:50:50 crc kubenswrapper[4743]: I0122 13:50:50.728923 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb847aa7-6d1e-4daf-90e1-0b9827cbe099-config\") pod \"controller-manager-6955bfdb45-pfbr8\" (UID: \"fb847aa7-6d1e-4daf-90e1-0b9827cbe099\") " pod="openshift-controller-manager/controller-manager-6955bfdb45-pfbr8" Jan 22 13:50:50 crc kubenswrapper[4743]: I0122 13:50:50.729649 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/fb847aa7-6d1e-4daf-90e1-0b9827cbe099-proxy-ca-bundles\") pod \"controller-manager-6955bfdb45-pfbr8\" (UID: \"fb847aa7-6d1e-4daf-90e1-0b9827cbe099\") " pod="openshift-controller-manager/controller-manager-6955bfdb45-pfbr8" Jan 22 13:50:50 crc kubenswrapper[4743]: I0122 13:50:50.733328 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb847aa7-6d1e-4daf-90e1-0b9827cbe099-serving-cert\") pod \"controller-manager-6955bfdb45-pfbr8\" (UID: \"fb847aa7-6d1e-4daf-90e1-0b9827cbe099\") " pod="openshift-controller-manager/controller-manager-6955bfdb45-pfbr8" Jan 22 13:50:50 crc kubenswrapper[4743]: I0122 13:50:50.744679 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnmc9\" (UniqueName: \"kubernetes.io/projected/fb847aa7-6d1e-4daf-90e1-0b9827cbe099-kube-api-access-lnmc9\") pod \"controller-manager-6955bfdb45-pfbr8\" (UID: \"fb847aa7-6d1e-4daf-90e1-0b9827cbe099\") " pod="openshift-controller-manager/controller-manager-6955bfdb45-pfbr8" Jan 22 13:50:50 crc kubenswrapper[4743]: I0122 13:50:50.866847 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6955bfdb45-pfbr8" Jan 22 13:50:51 crc kubenswrapper[4743]: I0122 13:50:51.044073 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6955bfdb45-pfbr8"] Jan 22 13:50:51 crc kubenswrapper[4743]: I0122 13:50:51.571086 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6955bfdb45-pfbr8" event={"ID":"fb847aa7-6d1e-4daf-90e1-0b9827cbe099","Type":"ContainerStarted","Data":"1ab405c236c98dea714f6299587e679a7eda90bcca0773ead9efed3ad7f1ede1"} Jan 22 13:50:51 crc kubenswrapper[4743]: I0122 13:50:51.571129 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6955bfdb45-pfbr8" event={"ID":"fb847aa7-6d1e-4daf-90e1-0b9827cbe099","Type":"ContainerStarted","Data":"6ab5024f632c7b11f312e3bd8de963457e8bc176136811e8844aa950a7de97b6"} Jan 22 13:50:51 crc kubenswrapper[4743]: I0122 13:50:51.571349 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6955bfdb45-pfbr8" Jan 22 13:50:51 crc kubenswrapper[4743]: I0122 13:50:51.577974 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6955bfdb45-pfbr8" Jan 22 13:50:51 crc kubenswrapper[4743]: I0122 13:50:51.611468 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6955bfdb45-pfbr8" podStartSLOduration=3.611443546 podStartE2EDuration="3.611443546s" podCreationTimestamp="2026-01-22 13:50:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:50:51.590263711 +0000 UTC m=+288.145306874" watchObservedRunningTime="2026-01-22 13:50:51.611443546 +0000 UTC m=+288.166486729" Jan 22 13:51:03 crc kubenswrapper[4743]: I0122 13:51:03.599266 4743 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Jan 22 13:51:30 crc kubenswrapper[4743]: I0122 13:51:30.050020 4743 patch_prober.go:28] interesting pod/machine-config-daemon-hqgk7 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 13:51:30 crc kubenswrapper[4743]: I0122 13:51:30.050635 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 13:52:00 crc kubenswrapper[4743]: I0122 13:52:00.049656 4743 patch_prober.go:28] interesting pod/machine-config-daemon-hqgk7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 13:52:00 crc kubenswrapper[4743]: I0122 13:52:00.050237 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 13:52:30 crc kubenswrapper[4743]: I0122 13:52:30.049519 4743 patch_prober.go:28] interesting pod/machine-config-daemon-hqgk7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 13:52:30 crc kubenswrapper[4743]: I0122 13:52:30.050164 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 13:52:30 crc kubenswrapper[4743]: I0122 13:52:30.050257 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" Jan 22 13:52:30 crc kubenswrapper[4743]: I0122 13:52:30.050907 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"234b865e4e7e65822fc01e8c12cd94ff6e833aa4b7477e6eafd95d6e7ee04a12"} pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 22 13:52:30 crc kubenswrapper[4743]: I0122 13:52:30.050965 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerName="machine-config-daemon" containerID="cri-o://234b865e4e7e65822fc01e8c12cd94ff6e833aa4b7477e6eafd95d6e7ee04a12" gracePeriod=600 Jan 22 13:52:31 crc kubenswrapper[4743]: I0122 13:52:31.095096 4743 generic.go:334] "Generic (PLEG): container finished" podID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerID="234b865e4e7e65822fc01e8c12cd94ff6e833aa4b7477e6eafd95d6e7ee04a12" exitCode=0 Jan 22 13:52:31 crc kubenswrapper[4743]: I0122 13:52:31.095216 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" 
event={"ID":"3aeba9ba-3a5a-4885-8540-d295aadb311b","Type":"ContainerDied","Data":"234b865e4e7e65822fc01e8c12cd94ff6e833aa4b7477e6eafd95d6e7ee04a12"} Jan 22 13:52:31 crc kubenswrapper[4743]: I0122 13:52:31.095738 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" event={"ID":"3aeba9ba-3a5a-4885-8540-d295aadb311b","Type":"ContainerStarted","Data":"4be76e895eacb2fd2b5388927ab0ebc428a547f5a9c9290e40e5eb162b110894"} Jan 22 13:52:31 crc kubenswrapper[4743]: I0122 13:52:31.095776 4743 scope.go:117] "RemoveContainer" containerID="0eb4f008bbd0d78e0714bf887f00c966ce6e2b4e9accca387b4a31abb51cd001" Jan 22 13:54:30 crc kubenswrapper[4743]: I0122 13:54:30.049223 4743 patch_prober.go:28] interesting pod/machine-config-daemon-hqgk7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 13:54:30 crc kubenswrapper[4743]: I0122 13:54:30.050269 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 13:55:00 crc kubenswrapper[4743]: I0122 13:55:00.048646 4743 patch_prober.go:28] interesting pod/machine-config-daemon-hqgk7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 13:55:00 crc kubenswrapper[4743]: I0122 13:55:00.049235 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 13:55:16 crc kubenswrapper[4743]: I0122 13:55:16.887478 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-2bfxj"] Jan 22 13:55:16 crc kubenswrapper[4743]: I0122 13:55:16.888840 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-2bfxj" Jan 22 13:55:17 crc kubenswrapper[4743]: I0122 13:55:17.011308 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-2bfxj\" (UID: \"97415e35-c4e3-4a82-a9b9-8b6dff15a1db\") " pod="openshift-image-registry/image-registry-66df7c8f76-2bfxj" Jan 22 13:55:17 crc kubenswrapper[4743]: I0122 13:55:17.011385 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/97415e35-c4e3-4a82-a9b9-8b6dff15a1db-trusted-ca\") pod \"image-registry-66df7c8f76-2bfxj\" (UID: \"97415e35-c4e3-4a82-a9b9-8b6dff15a1db\") " pod="openshift-image-registry/image-registry-66df7c8f76-2bfxj" Jan 22 13:55:17 crc kubenswrapper[4743]: I0122 13:55:17.011414 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/97415e35-c4e3-4a82-a9b9-8b6dff15a1db-bound-sa-token\") pod \"image-registry-66df7c8f76-2bfxj\" (UID: \"97415e35-c4e3-4a82-a9b9-8b6dff15a1db\") " pod="openshift-image-registry/image-registry-66df7c8f76-2bfxj" Jan 22 13:55:17 crc kubenswrapper[4743]: I0122 13:55:17.011462 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/97415e35-c4e3-4a82-a9b9-8b6dff15a1db-installation-pull-secrets\") pod \"image-registry-66df7c8f76-2bfxj\" (UID: \"97415e35-c4e3-4a82-a9b9-8b6dff15a1db\") " pod="openshift-image-registry/image-registry-66df7c8f76-2bfxj" Jan 22 13:55:17 crc kubenswrapper[4743]: I0122 13:55:17.011492 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/97415e35-c4e3-4a82-a9b9-8b6dff15a1db-registry-tls\") pod \"image-registry-66df7c8f76-2bfxj\" (UID: \"97415e35-c4e3-4a82-a9b9-8b6dff15a1db\") " pod="openshift-image-registry/image-registry-66df7c8f76-2bfxj" Jan 22 13:55:17 crc kubenswrapper[4743]: I0122 13:55:17.011512 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/97415e35-c4e3-4a82-a9b9-8b6dff15a1db-registry-certificates\") pod \"image-registry-66df7c8f76-2bfxj\" (UID: \"97415e35-c4e3-4a82-a9b9-8b6dff15a1db\") " pod="openshift-image-registry/image-registry-66df7c8f76-2bfxj" Jan 22 13:55:17 crc kubenswrapper[4743]: I0122 13:55:17.011554 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srxqp\" (UniqueName: \"kubernetes.io/projected/97415e35-c4e3-4a82-a9b9-8b6dff15a1db-kube-api-access-srxqp\") pod \"image-registry-66df7c8f76-2bfxj\" (UID: \"97415e35-c4e3-4a82-a9b9-8b6dff15a1db\") " pod="openshift-image-registry/image-registry-66df7c8f76-2bfxj" Jan 22 13:55:17 crc kubenswrapper[4743]: I0122 13:55:17.011594 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/97415e35-c4e3-4a82-a9b9-8b6dff15a1db-ca-trust-extracted\") pod \"image-registry-66df7c8f76-2bfxj\" (UID: \"97415e35-c4e3-4a82-a9b9-8b6dff15a1db\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-2bfxj" Jan 22 13:55:17 crc kubenswrapper[4743]: I0122 13:55:17.057286 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-2bfxj\" (UID: \"97415e35-c4e3-4a82-a9b9-8b6dff15a1db\") " pod="openshift-image-registry/image-registry-66df7c8f76-2bfxj" Jan 22 13:55:17 crc kubenswrapper[4743]: I0122 13:55:17.115232 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/97415e35-c4e3-4a82-a9b9-8b6dff15a1db-installation-pull-secrets\") pod \"image-registry-66df7c8f76-2bfxj\" (UID: \"97415e35-c4e3-4a82-a9b9-8b6dff15a1db\") " pod="openshift-image-registry/image-registry-66df7c8f76-2bfxj" Jan 22 13:55:17 crc kubenswrapper[4743]: I0122 13:55:17.115299 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/97415e35-c4e3-4a82-a9b9-8b6dff15a1db-registry-tls\") pod \"image-registry-66df7c8f76-2bfxj\" (UID: \"97415e35-c4e3-4a82-a9b9-8b6dff15a1db\") " pod="openshift-image-registry/image-registry-66df7c8f76-2bfxj" Jan 22 13:55:17 crc kubenswrapper[4743]: I0122 13:55:17.115331 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/97415e35-c4e3-4a82-a9b9-8b6dff15a1db-registry-certificates\") pod \"image-registry-66df7c8f76-2bfxj\" (UID: \"97415e35-c4e3-4a82-a9b9-8b6dff15a1db\") " pod="openshift-image-registry/image-registry-66df7c8f76-2bfxj" Jan 22 13:55:17 crc kubenswrapper[4743]: I0122 13:55:17.133297 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/97415e35-c4e3-4a82-a9b9-8b6dff15a1db-registry-certificates\") pod \"image-registry-66df7c8f76-2bfxj\" (UID: \"97415e35-c4e3-4a82-a9b9-8b6dff15a1db\") " pod="openshift-image-registry/image-registry-66df7c8f76-2bfxj" Jan 22 13:55:17 crc kubenswrapper[4743]: I0122 13:55:17.133743 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/97415e35-c4e3-4a82-a9b9-8b6dff15a1db-installation-pull-secrets\") pod \"image-registry-66df7c8f76-2bfxj\" (UID: \"97415e35-c4e3-4a82-a9b9-8b6dff15a1db\") " pod="openshift-image-registry/image-registry-66df7c8f76-2bfxj" Jan 22 13:55:17 crc kubenswrapper[4743]: I0122 13:55:17.133872 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srxqp\" (UniqueName: \"kubernetes.io/projected/97415e35-c4e3-4a82-a9b9-8b6dff15a1db-kube-api-access-srxqp\") pod \"image-registry-66df7c8f76-2bfxj\" (UID: \"97415e35-c4e3-4a82-a9b9-8b6dff15a1db\") " pod="openshift-image-registry/image-registry-66df7c8f76-2bfxj" Jan 22 13:55:17 crc kubenswrapper[4743]: I0122 13:55:17.134073 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/97415e35-c4e3-4a82-a9b9-8b6dff15a1db-registry-tls\") pod \"image-registry-66df7c8f76-2bfxj\" (UID: \"97415e35-c4e3-4a82-a9b9-8b6dff15a1db\") " pod="openshift-image-registry/image-registry-66df7c8f76-2bfxj" Jan 22 13:55:17 crc kubenswrapper[4743]: I0122 13:55:17.134128 4743 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/97415e35-c4e3-4a82-a9b9-8b6dff15a1db-ca-trust-extracted\") pod \"image-registry-66df7c8f76-2bfxj\" (UID: \"97415e35-c4e3-4a82-a9b9-8b6dff15a1db\") " pod="openshift-image-registry/image-registry-66df7c8f76-2bfxj" Jan 22 13:55:17 crc kubenswrapper[4743]: I0122 13:55:17.134292 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/97415e35-c4e3-4a82-a9b9-8b6dff15a1db-trusted-ca\") pod \"image-registry-66df7c8f76-2bfxj\" (UID: \"97415e35-c4e3-4a82-a9b9-8b6dff15a1db\") " pod="openshift-image-registry/image-registry-66df7c8f76-2bfxj" Jan 22 13:55:17 crc kubenswrapper[4743]: I0122 13:55:17.134329 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/97415e35-c4e3-4a82-a9b9-8b6dff15a1db-bound-sa-token\") pod \"image-registry-66df7c8f76-2bfxj\" (UID: \"97415e35-c4e3-4a82-a9b9-8b6dff15a1db\") " pod="openshift-image-registry/image-registry-66df7c8f76-2bfxj" Jan 22 13:55:17 crc kubenswrapper[4743]: I0122 13:55:17.136081 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/97415e35-c4e3-4a82-a9b9-8b6dff15a1db-trusted-ca\") pod \"image-registry-66df7c8f76-2bfxj\" (UID: \"97415e35-c4e3-4a82-a9b9-8b6dff15a1db\") " pod="openshift-image-registry/image-registry-66df7c8f76-2bfxj" Jan 22 13:55:17 crc kubenswrapper[4743]: I0122 13:55:17.263185 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/97415e35-c4e3-4a82-a9b9-8b6dff15a1db-ca-trust-extracted\") pod \"image-registry-66df7c8f76-2bfxj\" (UID: \"97415e35-c4e3-4a82-a9b9-8b6dff15a1db\") " pod="openshift-image-registry/image-registry-66df7c8f76-2bfxj" Jan 22 13:55:17 crc kubenswrapper[4743]: I0122 13:55:17.268182 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srxqp\" (UniqueName: \"kubernetes.io/projected/97415e35-c4e3-4a82-a9b9-8b6dff15a1db-kube-api-access-srxqp\") pod \"image-registry-66df7c8f76-2bfxj\" (UID: \"97415e35-c4e3-4a82-a9b9-8b6dff15a1db\") " pod="openshift-image-registry/image-registry-66df7c8f76-2bfxj" Jan 22 13:55:17 crc kubenswrapper[4743]: I0122 13:55:17.270890 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/97415e35-c4e3-4a82-a9b9-8b6dff15a1db-bound-sa-token\") pod \"image-registry-66df7c8f76-2bfxj\" (UID: \"97415e35-c4e3-4a82-a9b9-8b6dff15a1db\") " pod="openshift-image-registry/image-registry-66df7c8f76-2bfxj" Jan 22 13:55:17 crc kubenswrapper[4743]: I0122 13:55:17.299994 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-2bfxj"] Jan 22 13:55:17 crc kubenswrapper[4743]: I0122 13:55:17.505136 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-2bfxj" Jan 22 13:55:17 crc kubenswrapper[4743]: I0122 13:55:17.707398 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-2bfxj"] Jan 22 13:55:17 crc kubenswrapper[4743]: I0122 13:55:17.976581 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-2bfxj" event={"ID":"97415e35-c4e3-4a82-a9b9-8b6dff15a1db","Type":"ContainerStarted","Data":"0ba3a596578033399dfd3fafff70ec3b0e943c70d8825f84d56a99dd77fdf924"} Jan 22 13:55:17 crc kubenswrapper[4743]: I0122 13:55:17.976664 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-2bfxj" event={"ID":"97415e35-c4e3-4a82-a9b9-8b6dff15a1db","Type":"ContainerStarted","Data":"90bf40b2139e5bfeb63a9a50ab196b97359bc5d2a6fc63ea50077fcd91beefb2"} Jan 22 13:55:17 crc kubenswrapper[4743]: I0122 13:55:17.978119 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-2bfxj" Jan 22 13:55:18 crc kubenswrapper[4743]: I0122 13:55:18.003858 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-2bfxj" podStartSLOduration=2.003834943 podStartE2EDuration="2.003834943s" podCreationTimestamp="2026-01-22 13:55:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:55:17.996934215 +0000 UTC m=+554.551977398" watchObservedRunningTime="2026-01-22 13:55:18.003834943 +0000 UTC m=+554.558878116" Jan 22 13:55:30 crc kubenswrapper[4743]: I0122 13:55:30.049599 4743 patch_prober.go:28] interesting pod/machine-config-daemon-hqgk7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 13:55:30 crc kubenswrapper[4743]: I0122 13:55:30.050255 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 13:55:30 crc kubenswrapper[4743]: I0122 13:55:30.050307 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" Jan 22 13:55:30 crc kubenswrapper[4743]: I0122 13:55:30.050896 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4be76e895eacb2fd2b5388927ab0ebc428a547f5a9c9290e40e5eb162b110894"} pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 22 13:55:30 crc kubenswrapper[4743]: I0122 13:55:30.050956 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerName="machine-config-daemon" containerID="cri-o://4be76e895eacb2fd2b5388927ab0ebc428a547f5a9c9290e40e5eb162b110894" gracePeriod=600 Jan 22 13:55:31 crc kubenswrapper[4743]: I0122 
13:55:31.044628 4743 generic.go:334] "Generic (PLEG): container finished" podID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerID="4be76e895eacb2fd2b5388927ab0ebc428a547f5a9c9290e40e5eb162b110894" exitCode=0 Jan 22 13:55:31 crc kubenswrapper[4743]: I0122 13:55:31.044707 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" event={"ID":"3aeba9ba-3a5a-4885-8540-d295aadb311b","Type":"ContainerDied","Data":"4be76e895eacb2fd2b5388927ab0ebc428a547f5a9c9290e40e5eb162b110894"} Jan 22 13:55:31 crc kubenswrapper[4743]: I0122 13:55:31.045088 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" event={"ID":"3aeba9ba-3a5a-4885-8540-d295aadb311b","Type":"ContainerStarted","Data":"58ca9bbd26d5eab47a0ae4b9a18e996aaf71b3e08e86fac81e851949e21bd947"} Jan 22 13:55:31 crc kubenswrapper[4743]: I0122 13:55:31.045111 4743 scope.go:117] "RemoveContainer" containerID="234b865e4e7e65822fc01e8c12cd94ff6e833aa4b7477e6eafd95d6e7ee04a12" Jan 22 13:55:37 crc kubenswrapper[4743]: I0122 13:55:37.512427 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-2bfxj" Jan 22 13:55:37 crc kubenswrapper[4743]: I0122 13:55:37.557985 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-94xwn"] Jan 22 13:56:02 crc kubenswrapper[4743]: I0122 13:56:02.597286 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" podUID="7b22abd3-ecba-46ba-a310-99000f911356" containerName="registry" containerID="cri-o://dde2c30ede4b010f13053167044a68e55dea8446738f97179775ef14da947294" gracePeriod=30 Jan 22 13:56:03 crc kubenswrapper[4743]: I0122 13:56:03.248291 4743 generic.go:334] "Generic (PLEG): container finished" podID="7b22abd3-ecba-46ba-a310-99000f911356" containerID="dde2c30ede4b010f13053167044a68e55dea8446738f97179775ef14da947294" exitCode=0 Jan 22 13:56:03 crc kubenswrapper[4743]: I0122 13:56:03.248378 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" event={"ID":"7b22abd3-ecba-46ba-a310-99000f911356","Type":"ContainerDied","Data":"dde2c30ede4b010f13053167044a68e55dea8446738f97179775ef14da947294"} Jan 22 13:56:03 crc kubenswrapper[4743]: I0122 13:56:03.457946 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:56:03 crc kubenswrapper[4743]: I0122 13:56:03.623641 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7b22abd3-ecba-46ba-a310-99000f911356-bound-sa-token\") pod \"7b22abd3-ecba-46ba-a310-99000f911356\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " Jan 22 13:56:03 crc kubenswrapper[4743]: I0122 13:56:03.623746 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7b22abd3-ecba-46ba-a310-99000f911356-ca-trust-extracted\") pod \"7b22abd3-ecba-46ba-a310-99000f911356\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " Jan 22 13:56:03 crc kubenswrapper[4743]: I0122 13:56:03.623839 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7b22abd3-ecba-46ba-a310-99000f911356-installation-pull-secrets\") pod \"7b22abd3-ecba-46ba-a310-99000f911356\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " Jan 22 13:56:03 crc kubenswrapper[4743]: I0122 13:56:03.624078 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"7b22abd3-ecba-46ba-a310-99000f911356\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " Jan 22 13:56:03 crc kubenswrapper[4743]: I0122 13:56:03.624135 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7b22abd3-ecba-46ba-a310-99000f911356-registry-tls\") pod \"7b22abd3-ecba-46ba-a310-99000f911356\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " Jan 22 13:56:03 crc kubenswrapper[4743]: I0122 13:56:03.624274 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7b22abd3-ecba-46ba-a310-99000f911356-registry-certificates\") pod \"7b22abd3-ecba-46ba-a310-99000f911356\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " Jan 22 13:56:03 crc kubenswrapper[4743]: I0122 13:56:03.624356 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7b22abd3-ecba-46ba-a310-99000f911356-trusted-ca\") pod \"7b22abd3-ecba-46ba-a310-99000f911356\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " Jan 22 13:56:03 crc kubenswrapper[4743]: I0122 13:56:03.625498 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmnql\" (UniqueName: \"kubernetes.io/projected/7b22abd3-ecba-46ba-a310-99000f911356-kube-api-access-cmnql\") pod \"7b22abd3-ecba-46ba-a310-99000f911356\" (UID: \"7b22abd3-ecba-46ba-a310-99000f911356\") " Jan 22 13:56:03 crc kubenswrapper[4743]: I0122 13:56:03.625720 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b22abd3-ecba-46ba-a310-99000f911356-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "7b22abd3-ecba-46ba-a310-99000f911356" (UID: "7b22abd3-ecba-46ba-a310-99000f911356"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:56:03 crc kubenswrapper[4743]: I0122 13:56:03.626308 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b22abd3-ecba-46ba-a310-99000f911356-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "7b22abd3-ecba-46ba-a310-99000f911356" (UID: "7b22abd3-ecba-46ba-a310-99000f911356"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:56:03 crc kubenswrapper[4743]: I0122 13:56:03.631228 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b22abd3-ecba-46ba-a310-99000f911356-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "7b22abd3-ecba-46ba-a310-99000f911356" (UID: "7b22abd3-ecba-46ba-a310-99000f911356"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:56:03 crc kubenswrapper[4743]: I0122 13:56:03.632312 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b22abd3-ecba-46ba-a310-99000f911356-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "7b22abd3-ecba-46ba-a310-99000f911356" (UID: "7b22abd3-ecba-46ba-a310-99000f911356"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:56:03 crc kubenswrapper[4743]: I0122 13:56:03.632844 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b22abd3-ecba-46ba-a310-99000f911356-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "7b22abd3-ecba-46ba-a310-99000f911356" (UID: "7b22abd3-ecba-46ba-a310-99000f911356"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:56:03 crc kubenswrapper[4743]: I0122 13:56:03.633494 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b22abd3-ecba-46ba-a310-99000f911356-kube-api-access-cmnql" (OuterVolumeSpecName: "kube-api-access-cmnql") pod "7b22abd3-ecba-46ba-a310-99000f911356" (UID: "7b22abd3-ecba-46ba-a310-99000f911356"). InnerVolumeSpecName "kube-api-access-cmnql". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:56:03 crc kubenswrapper[4743]: I0122 13:56:03.643152 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "7b22abd3-ecba-46ba-a310-99000f911356" (UID: "7b22abd3-ecba-46ba-a310-99000f911356"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 22 13:56:03 crc kubenswrapper[4743]: I0122 13:56:03.650676 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b22abd3-ecba-46ba-a310-99000f911356-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "7b22abd3-ecba-46ba-a310-99000f911356" (UID: "7b22abd3-ecba-46ba-a310-99000f911356"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 13:56:03 crc kubenswrapper[4743]: I0122 13:56:03.727602 4743 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7b22abd3-ecba-46ba-a310-99000f911356-registry-certificates\") on node \"crc\" DevicePath \"\"" Jan 22 13:56:03 crc kubenswrapper[4743]: I0122 13:56:03.727657 4743 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7b22abd3-ecba-46ba-a310-99000f911356-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 22 13:56:03 crc kubenswrapper[4743]: I0122 13:56:03.727687 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmnql\" (UniqueName: \"kubernetes.io/projected/7b22abd3-ecba-46ba-a310-99000f911356-kube-api-access-cmnql\") on node \"crc\" DevicePath \"\"" Jan 22 13:56:03 crc kubenswrapper[4743]: I0122 13:56:03.727712 4743 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7b22abd3-ecba-46ba-a310-99000f911356-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 22 13:56:03 crc kubenswrapper[4743]: I0122 13:56:03.727736 4743 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7b22abd3-ecba-46ba-a310-99000f911356-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 22 13:56:03 crc kubenswrapper[4743]: I0122 13:56:03.727761 4743 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7b22abd3-ecba-46ba-a310-99000f911356-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jan 22 13:56:03 crc kubenswrapper[4743]: I0122 13:56:03.727906 4743 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7b22abd3-ecba-46ba-a310-99000f911356-registry-tls\") on node \"crc\" DevicePath \"\"" Jan 22 13:56:04 crc kubenswrapper[4743]: I0122 13:56:04.110245 4743 scope.go:117] "RemoveContainer" containerID="dde2c30ede4b010f13053167044a68e55dea8446738f97179775ef14da947294" Jan 22 13:56:04 crc kubenswrapper[4743]: I0122 13:56:04.254311 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" Jan 22 13:56:04 crc kubenswrapper[4743]: I0122 13:56:04.254302 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-94xwn" event={"ID":"7b22abd3-ecba-46ba-a310-99000f911356","Type":"ContainerDied","Data":"77e7aca735bf6c17be932b405758d626af361bb02029d1c82fd258866e138d69"} Jan 22 13:56:04 crc kubenswrapper[4743]: I0122 13:56:04.298889 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-94xwn"] Jan 22 13:56:04 crc kubenswrapper[4743]: I0122 13:56:04.303570 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-94xwn"] Jan 22 13:56:05 crc kubenswrapper[4743]: I0122 13:56:05.754237 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b22abd3-ecba-46ba-a310-99000f911356" path="/var/lib/kubelet/pods/7b22abd3-ecba-46ba-a310-99000f911356/volumes" Jan 22 13:56:30 crc kubenswrapper[4743]: I0122 13:56:30.496912 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-s2ln7"] Jan 22 13:56:30 crc kubenswrapper[4743]: E0122 13:56:30.497654 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b22abd3-ecba-46ba-a310-99000f911356" containerName="registry" Jan 22 13:56:30 crc kubenswrapper[4743]: I0122 13:56:30.497669 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b22abd3-ecba-46ba-a310-99000f911356" containerName="registry" Jan 22 13:56:30 crc kubenswrapper[4743]: I0122 13:56:30.497811 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b22abd3-ecba-46ba-a310-99000f911356" containerName="registry" Jan 22 13:56:30 crc kubenswrapper[4743]: I0122 13:56:30.498283 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-s2ln7" Jan 22 13:56:30 crc kubenswrapper[4743]: I0122 13:56:30.513938 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-db7qx"] Jan 22 13:56:30 crc kubenswrapper[4743]: I0122 13:56:30.514146 4743 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-xlmbv" Jan 22 13:56:30 crc kubenswrapper[4743]: I0122 13:56:30.515193 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-858654f9db-db7qx" Jan 22 13:56:30 crc kubenswrapper[4743]: I0122 13:56:30.515412 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Jan 22 13:56:30 crc kubenswrapper[4743]: I0122 13:56:30.515687 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Jan 22 13:56:30 crc kubenswrapper[4743]: I0122 13:56:30.519442 4743 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-4zgkh" Jan 22 13:56:30 crc kubenswrapper[4743]: I0122 13:56:30.533262 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-s2ln7"] Jan 22 13:56:30 crc kubenswrapper[4743]: I0122 13:56:30.540182 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-db7qx"] Jan 22 13:56:30 crc kubenswrapper[4743]: I0122 13:56:30.551774 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-zdf5x"] Jan 22 13:56:30 crc kubenswrapper[4743]: I0122 13:56:30.552652 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-zdf5x" Jan 22 13:56:30 crc kubenswrapper[4743]: I0122 13:56:30.557043 4743 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-lt8v9" Jan 22 13:56:30 crc kubenswrapper[4743]: I0122 13:56:30.561986 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-zdf5x"] Jan 22 13:56:30 crc kubenswrapper[4743]: I0122 13:56:30.609398 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ggld\" (UniqueName: \"kubernetes.io/projected/760af996-e4d2-4507-9e19-a50aa50ceb8a-kube-api-access-6ggld\") pod \"cert-manager-cainjector-cf98fcc89-s2ln7\" (UID: \"760af996-e4d2-4507-9e19-a50aa50ceb8a\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-s2ln7" Jan 22 13:56:30 crc kubenswrapper[4743]: I0122 13:56:30.711001 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjbzl\" (UniqueName: \"kubernetes.io/projected/aaccb907-25f5-4992-8a7e-cc8d5bdf3bb1-kube-api-access-fjbzl\") pod \"cert-manager-webhook-687f57d79b-zdf5x\" (UID: \"aaccb907-25f5-4992-8a7e-cc8d5bdf3bb1\") " pod="cert-manager/cert-manager-webhook-687f57d79b-zdf5x" Jan 22 13:56:30 crc kubenswrapper[4743]: I0122 13:56:30.711077 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tn7s\" (UniqueName: \"kubernetes.io/projected/47fd113c-6de2-4ad1-b307-2c9bcbdff0b8-kube-api-access-7tn7s\") pod \"cert-manager-858654f9db-db7qx\" (UID: \"47fd113c-6de2-4ad1-b307-2c9bcbdff0b8\") " pod="cert-manager/cert-manager-858654f9db-db7qx" Jan 22 13:56:30 crc kubenswrapper[4743]: I0122 13:56:30.711375 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ggld\" (UniqueName: \"kubernetes.io/projected/760af996-e4d2-4507-9e19-a50aa50ceb8a-kube-api-access-6ggld\") pod \"cert-manager-cainjector-cf98fcc89-s2ln7\" (UID: \"760af996-e4d2-4507-9e19-a50aa50ceb8a\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-s2ln7" Jan 22 13:56:30 crc kubenswrapper[4743]: I0122 13:56:30.735848 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-6ggld\" (UniqueName: \"kubernetes.io/projected/760af996-e4d2-4507-9e19-a50aa50ceb8a-kube-api-access-6ggld\") pod \"cert-manager-cainjector-cf98fcc89-s2ln7\" (UID: \"760af996-e4d2-4507-9e19-a50aa50ceb8a\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-s2ln7" Jan 22 13:56:30 crc kubenswrapper[4743]: I0122 13:56:30.812222 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjbzl\" (UniqueName: \"kubernetes.io/projected/aaccb907-25f5-4992-8a7e-cc8d5bdf3bb1-kube-api-access-fjbzl\") pod \"cert-manager-webhook-687f57d79b-zdf5x\" (UID: \"aaccb907-25f5-4992-8a7e-cc8d5bdf3bb1\") " pod="cert-manager/cert-manager-webhook-687f57d79b-zdf5x" Jan 22 13:56:30 crc kubenswrapper[4743]: I0122 13:56:30.812287 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tn7s\" (UniqueName: \"kubernetes.io/projected/47fd113c-6de2-4ad1-b307-2c9bcbdff0b8-kube-api-access-7tn7s\") pod \"cert-manager-858654f9db-db7qx\" (UID: \"47fd113c-6de2-4ad1-b307-2c9bcbdff0b8\") " pod="cert-manager/cert-manager-858654f9db-db7qx" Jan 22 13:56:30 crc kubenswrapper[4743]: I0122 13:56:30.830270 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjbzl\" (UniqueName: \"kubernetes.io/projected/aaccb907-25f5-4992-8a7e-cc8d5bdf3bb1-kube-api-access-fjbzl\") pod \"cert-manager-webhook-687f57d79b-zdf5x\" (UID: \"aaccb907-25f5-4992-8a7e-cc8d5bdf3bb1\") " pod="cert-manager/cert-manager-webhook-687f57d79b-zdf5x" Jan 22 13:56:30 crc kubenswrapper[4743]: I0122 13:56:30.833364 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tn7s\" (UniqueName: \"kubernetes.io/projected/47fd113c-6de2-4ad1-b307-2c9bcbdff0b8-kube-api-access-7tn7s\") pod \"cert-manager-858654f9db-db7qx\" (UID: \"47fd113c-6de2-4ad1-b307-2c9bcbdff0b8\") " pod="cert-manager/cert-manager-858654f9db-db7qx" Jan 22 13:56:30 crc kubenswrapper[4743]: I0122 13:56:30.840968 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-s2ln7" Jan 22 13:56:30 crc kubenswrapper[4743]: I0122 13:56:30.849252 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-db7qx" Jan 22 13:56:30 crc kubenswrapper[4743]: I0122 13:56:30.868262 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-zdf5x" Jan 22 13:56:31 crc kubenswrapper[4743]: I0122 13:56:31.084323 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-db7qx"] Jan 22 13:56:31 crc kubenswrapper[4743]: I0122 13:56:31.089840 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 22 13:56:31 crc kubenswrapper[4743]: I0122 13:56:31.277105 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-s2ln7"] Jan 22 13:56:31 crc kubenswrapper[4743]: W0122 13:56:31.277875 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod760af996_e4d2_4507_9e19_a50aa50ceb8a.slice/crio-ea8a9ba1ae64f4de3e66af6ddf0454ad2ef562b86455fb885881d23da462190f WatchSource:0}: Error finding container ea8a9ba1ae64f4de3e66af6ddf0454ad2ef562b86455fb885881d23da462190f: Status 404 returned error can't find the container with id ea8a9ba1ae64f4de3e66af6ddf0454ad2ef562b86455fb885881d23da462190f Jan 22 13:56:31 crc kubenswrapper[4743]: I0122 13:56:31.332859 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-zdf5x"] Jan 22 13:56:31 crc kubenswrapper[4743]: W0122 13:56:31.337071 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaaccb907_25f5_4992_8a7e_cc8d5bdf3bb1.slice/crio-356e59796e88d3b6e8ef9b8bb5cbd46b01beabf62ac55ba89f33e6d070315447 WatchSource:0}: Error finding container 356e59796e88d3b6e8ef9b8bb5cbd46b01beabf62ac55ba89f33e6d070315447: Status 404 returned error can't find the container with id 356e59796e88d3b6e8ef9b8bb5cbd46b01beabf62ac55ba89f33e6d070315447 Jan 22 13:56:31 crc kubenswrapper[4743]: I0122 13:56:31.410832 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-s2ln7" event={"ID":"760af996-e4d2-4507-9e19-a50aa50ceb8a","Type":"ContainerStarted","Data":"ea8a9ba1ae64f4de3e66af6ddf0454ad2ef562b86455fb885881d23da462190f"} Jan 22 13:56:31 crc kubenswrapper[4743]: I0122 13:56:31.412157 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-zdf5x" event={"ID":"aaccb907-25f5-4992-8a7e-cc8d5bdf3bb1","Type":"ContainerStarted","Data":"356e59796e88d3b6e8ef9b8bb5cbd46b01beabf62ac55ba89f33e6d070315447"} Jan 22 13:56:31 crc kubenswrapper[4743]: I0122 13:56:31.413017 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-db7qx" event={"ID":"47fd113c-6de2-4ad1-b307-2c9bcbdff0b8","Type":"ContainerStarted","Data":"a7276b673c1229ce5dc7050e43e099bf8cf635df4371dfb661bcd0e8fb709aef"} Jan 22 13:56:36 crc kubenswrapper[4743]: I0122 13:56:36.441592 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-db7qx" event={"ID":"47fd113c-6de2-4ad1-b307-2c9bcbdff0b8","Type":"ContainerStarted","Data":"78f097901f333916480a0b564c54319548fe5aa5eda7b464b567dff51ef72724"} Jan 22 13:56:36 crc kubenswrapper[4743]: I0122 13:56:36.443410 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-s2ln7" event={"ID":"760af996-e4d2-4507-9e19-a50aa50ceb8a","Type":"ContainerStarted","Data":"7ff85bff347e2efc4d84ca2fc4ed98a381c9a01dd79d90895734be5e45b8ef87"} Jan 22 13:56:36 crc kubenswrapper[4743]: I0122 13:56:36.444750 4743 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-zdf5x" event={"ID":"aaccb907-25f5-4992-8a7e-cc8d5bdf3bb1","Type":"ContainerStarted","Data":"201aea9a5cd4ad0b67a8c3d29f14d5337adcd131c0bb14ec30ad33c844b55bfa"} Jan 22 13:56:36 crc kubenswrapper[4743]: I0122 13:56:36.444907 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-zdf5x" Jan 22 13:56:36 crc kubenswrapper[4743]: I0122 13:56:36.461980 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-db7qx" podStartSLOduration=2.035848559 podStartE2EDuration="6.461953509s" podCreationTimestamp="2026-01-22 13:56:30 +0000 UTC" firstStartedPulling="2026-01-22 13:56:31.089610717 +0000 UTC m=+627.644653870" lastFinishedPulling="2026-01-22 13:56:35.515715657 +0000 UTC m=+632.070758820" observedRunningTime="2026-01-22 13:56:36.456716256 +0000 UTC m=+633.011759419" watchObservedRunningTime="2026-01-22 13:56:36.461953509 +0000 UTC m=+633.016996692" Jan 22 13:56:36 crc kubenswrapper[4743]: I0122 13:56:36.477728 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-zdf5x" podStartSLOduration=2.25435993 podStartE2EDuration="6.477710468s" podCreationTimestamp="2026-01-22 13:56:30 +0000 UTC" firstStartedPulling="2026-01-22 13:56:31.338527995 +0000 UTC m=+627.893571178" lastFinishedPulling="2026-01-22 13:56:35.561878553 +0000 UTC m=+632.116921716" observedRunningTime="2026-01-22 13:56:36.473068111 +0000 UTC m=+633.028111274" watchObservedRunningTime="2026-01-22 13:56:36.477710468 +0000 UTC m=+633.032753641" Jan 22 13:56:36 crc kubenswrapper[4743]: I0122 13:56:36.498951 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-s2ln7" podStartSLOduration=2.277791356 podStartE2EDuration="6.498927775s" podCreationTimestamp="2026-01-22 13:56:30 +0000 UTC" firstStartedPulling="2026-01-22 13:56:31.280447473 +0000 UTC m=+627.835490656" lastFinishedPulling="2026-01-22 13:56:35.501583922 +0000 UTC m=+632.056627075" observedRunningTime="2026-01-22 13:56:36.490887256 +0000 UTC m=+633.045930419" watchObservedRunningTime="2026-01-22 13:56:36.498927775 +0000 UTC m=+633.053970948" Jan 22 13:56:39 crc kubenswrapper[4743]: I0122 13:56:39.728760 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-gcj8q"] Jan 22 13:56:39 crc kubenswrapper[4743]: I0122 13:56:39.730099 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" podUID="1504d62a-81aa-4a1d-8fda-ef01376adcaa" containerName="ovn-controller" containerID="cri-o://e031fff834aec6b8b2c7ed47784111b062e0b9601d637eba12860e79e4c3f6fa" gracePeriod=30 Jan 22 13:56:39 crc kubenswrapper[4743]: I0122 13:56:39.730295 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" podUID="1504d62a-81aa-4a1d-8fda-ef01376adcaa" containerName="northd" containerID="cri-o://c06826420179055b6073b3e2c057cc03f5bb5fe835cf28bdaceff76b2e27dd91" gracePeriod=30 Jan 22 13:56:39 crc kubenswrapper[4743]: I0122 13:56:39.730409 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" podUID="1504d62a-81aa-4a1d-8fda-ef01376adcaa" containerName="kube-rbac-proxy-ovn-metrics" 
containerID="cri-o://33045af36991196fc7ef8653955e8c7bf85c1952f56de2c82539d65f0adba74f" gracePeriod=30 Jan 22 13:56:39 crc kubenswrapper[4743]: I0122 13:56:39.730625 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" podUID="1504d62a-81aa-4a1d-8fda-ef01376adcaa" containerName="sbdb" containerID="cri-o://5f88ce0bb57df24e738ba0d53bf159b512454030e7215342e0e8179b20e25a53" gracePeriod=30 Jan 22 13:56:39 crc kubenswrapper[4743]: I0122 13:56:39.731185 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" podUID="1504d62a-81aa-4a1d-8fda-ef01376adcaa" containerName="ovn-acl-logging" containerID="cri-o://0e41c9ee9b261451ae9312a5ad5a028818acd69b9c85b7eece4f3c06d2e58551" gracePeriod=30 Jan 22 13:56:39 crc kubenswrapper[4743]: I0122 13:56:39.730114 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" podUID="1504d62a-81aa-4a1d-8fda-ef01376adcaa" containerName="nbdb" containerID="cri-o://494c239f2078673913ba0c50c50da35d43c19b0b5ac2dbc1009d58e12517a489" gracePeriod=30 Jan 22 13:56:39 crc kubenswrapper[4743]: I0122 13:56:39.731757 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" podUID="1504d62a-81aa-4a1d-8fda-ef01376adcaa" containerName="kube-rbac-proxy-node" containerID="cri-o://746556e3d4d56ede14ab5d860938e7a65a200f86fc80ff35b0fff11172c8d328" gracePeriod=30 Jan 22 13:56:39 crc kubenswrapper[4743]: I0122 13:56:39.810782 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" podUID="1504d62a-81aa-4a1d-8fda-ef01376adcaa" containerName="ovnkube-controller" containerID="cri-o://6703ca8c5812f8cf4e820c640dc3da6b96d5035162f1bb4c0563efcd122edd2a" gracePeriod=30 Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.066064 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gcj8q_1504d62a-81aa-4a1d-8fda-ef01376adcaa/ovn-acl-logging/0.log" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.066723 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gcj8q_1504d62a-81aa-4a1d-8fda-ef01376adcaa/ovn-controller/0.log" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.067434 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.121728 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-m6hrr"] Jan 22 13:56:40 crc kubenswrapper[4743]: E0122 13:56:40.121950 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1504d62a-81aa-4a1d-8fda-ef01376adcaa" containerName="ovnkube-controller" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.121961 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1504d62a-81aa-4a1d-8fda-ef01376adcaa" containerName="ovnkube-controller" Jan 22 13:56:40 crc kubenswrapper[4743]: E0122 13:56:40.121973 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1504d62a-81aa-4a1d-8fda-ef01376adcaa" containerName="northd" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.121979 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1504d62a-81aa-4a1d-8fda-ef01376adcaa" containerName="northd" Jan 22 13:56:40 crc kubenswrapper[4743]: E0122 13:56:40.121988 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1504d62a-81aa-4a1d-8fda-ef01376adcaa" containerName="nbdb" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.121994 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1504d62a-81aa-4a1d-8fda-ef01376adcaa" containerName="nbdb" Jan 22 13:56:40 crc kubenswrapper[4743]: E0122 13:56:40.122004 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1504d62a-81aa-4a1d-8fda-ef01376adcaa" containerName="kube-rbac-proxy-node" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.122010 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1504d62a-81aa-4a1d-8fda-ef01376adcaa" containerName="kube-rbac-proxy-node" Jan 22 13:56:40 crc kubenswrapper[4743]: E0122 13:56:40.122023 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1504d62a-81aa-4a1d-8fda-ef01376adcaa" containerName="kube-rbac-proxy-ovn-metrics" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.122029 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1504d62a-81aa-4a1d-8fda-ef01376adcaa" containerName="kube-rbac-proxy-ovn-metrics" Jan 22 13:56:40 crc kubenswrapper[4743]: E0122 13:56:40.122035 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1504d62a-81aa-4a1d-8fda-ef01376adcaa" containerName="sbdb" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.122040 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1504d62a-81aa-4a1d-8fda-ef01376adcaa" containerName="sbdb" Jan 22 13:56:40 crc kubenswrapper[4743]: E0122 13:56:40.122048 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1504d62a-81aa-4a1d-8fda-ef01376adcaa" containerName="kubecfg-setup" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.122054 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1504d62a-81aa-4a1d-8fda-ef01376adcaa" containerName="kubecfg-setup" Jan 22 13:56:40 crc kubenswrapper[4743]: E0122 13:56:40.122062 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1504d62a-81aa-4a1d-8fda-ef01376adcaa" containerName="ovn-acl-logging" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.122069 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1504d62a-81aa-4a1d-8fda-ef01376adcaa" containerName="ovn-acl-logging" Jan 22 13:56:40 crc kubenswrapper[4743]: E0122 13:56:40.122077 4743 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1504d62a-81aa-4a1d-8fda-ef01376adcaa" containerName="ovn-controller" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.122083 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1504d62a-81aa-4a1d-8fda-ef01376adcaa" containerName="ovn-controller" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.122195 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="1504d62a-81aa-4a1d-8fda-ef01376adcaa" containerName="kube-rbac-proxy-node" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.122208 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="1504d62a-81aa-4a1d-8fda-ef01376adcaa" containerName="ovnkube-controller" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.122220 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="1504d62a-81aa-4a1d-8fda-ef01376adcaa" containerName="ovn-acl-logging" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.122228 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="1504d62a-81aa-4a1d-8fda-ef01376adcaa" containerName="kube-rbac-proxy-ovn-metrics" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.122236 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="1504d62a-81aa-4a1d-8fda-ef01376adcaa" containerName="sbdb" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.122245 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="1504d62a-81aa-4a1d-8fda-ef01376adcaa" containerName="nbdb" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.122254 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="1504d62a-81aa-4a1d-8fda-ef01376adcaa" containerName="northd" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.122262 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="1504d62a-81aa-4a1d-8fda-ef01376adcaa" containerName="ovn-controller" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.124055 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.248180 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-systemd-units\") pod \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.248650 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-host-run-ovn-kubernetes\") pod \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.248680 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-host-kubelet\") pod \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.248351 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "1504d62a-81aa-4a1d-8fda-ef01376adcaa" (UID: "1504d62a-81aa-4a1d-8fda-ef01376adcaa"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.248718 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-host-var-lib-cni-networks-ovn-kubernetes\") pod \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.248757 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1504d62a-81aa-4a1d-8fda-ef01376adcaa-env-overrides\") pod \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.248764 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "1504d62a-81aa-4a1d-8fda-ef01376adcaa" (UID: "1504d62a-81aa-4a1d-8fda-ef01376adcaa"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.248821 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "1504d62a-81aa-4a1d-8fda-ef01376adcaa" (UID: "1504d62a-81aa-4a1d-8fda-ef01376adcaa"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.248842 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-run-systemd\") pod \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.248855 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "1504d62a-81aa-4a1d-8fda-ef01376adcaa" (UID: "1504d62a-81aa-4a1d-8fda-ef01376adcaa"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.248865 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-host-cni-netd\") pod \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.248890 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-run-ovn\") pod \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.248924 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-run-openvswitch\") pod \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.249000 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44vjn\" (UniqueName: \"kubernetes.io/projected/1504d62a-81aa-4a1d-8fda-ef01376adcaa-kube-api-access-44vjn\") pod \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.249006 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "1504d62a-81aa-4a1d-8fda-ef01376adcaa" (UID: "1504d62a-81aa-4a1d-8fda-ef01376adcaa"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.249019 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "1504d62a-81aa-4a1d-8fda-ef01376adcaa" (UID: "1504d62a-81aa-4a1d-8fda-ef01376adcaa"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.249040 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1504d62a-81aa-4a1d-8fda-ef01376adcaa-ovnkube-script-lib\") pod \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.249044 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "1504d62a-81aa-4a1d-8fda-ef01376adcaa" (UID: "1504d62a-81aa-4a1d-8fda-ef01376adcaa"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.249085 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-etc-openvswitch\") pod \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.249133 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-host-cni-bin\") pod \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.249159 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "1504d62a-81aa-4a1d-8fda-ef01376adcaa" (UID: "1504d62a-81aa-4a1d-8fda-ef01376adcaa"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.249170 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-host-slash\") pod \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.249208 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-host-slash" (OuterVolumeSpecName: "host-slash") pod "1504d62a-81aa-4a1d-8fda-ef01376adcaa" (UID: "1504d62a-81aa-4a1d-8fda-ef01376adcaa"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.249208 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "1504d62a-81aa-4a1d-8fda-ef01376adcaa" (UID: "1504d62a-81aa-4a1d-8fda-ef01376adcaa"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.249245 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-log-socket\") pod \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.249303 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-node-log\") pod \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.249330 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-log-socket" (OuterVolumeSpecName: "log-socket") pod "1504d62a-81aa-4a1d-8fda-ef01376adcaa" (UID: "1504d62a-81aa-4a1d-8fda-ef01376adcaa"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.249368 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1504d62a-81aa-4a1d-8fda-ef01376adcaa-ovnkube-config\") pod \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.249392 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-host-run-netns\") pod \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.249414 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-var-lib-openvswitch\") pod \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.249437 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1504d62a-81aa-4a1d-8fda-ef01376adcaa-ovn-node-metrics-cert\") pod \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\" (UID: \"1504d62a-81aa-4a1d-8fda-ef01376adcaa\") " Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.249438 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-node-log" (OuterVolumeSpecName: "node-log") pod "1504d62a-81aa-4a1d-8fda-ef01376adcaa" (UID: "1504d62a-81aa-4a1d-8fda-ef01376adcaa"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.249470 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1504d62a-81aa-4a1d-8fda-ef01376adcaa-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "1504d62a-81aa-4a1d-8fda-ef01376adcaa" (UID: "1504d62a-81aa-4a1d-8fda-ef01376adcaa"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.249487 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "1504d62a-81aa-4a1d-8fda-ef01376adcaa" (UID: "1504d62a-81aa-4a1d-8fda-ef01376adcaa"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.249518 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "1504d62a-81aa-4a1d-8fda-ef01376adcaa" (UID: "1504d62a-81aa-4a1d-8fda-ef01376adcaa"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.249610 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5xbd\" (UniqueName: \"kubernetes.io/projected/b82022d0-4955-418d-9bc0-cdfcdaf32b1f-kube-api-access-q5xbd\") pod \"ovnkube-node-m6hrr\" (UID: \"b82022d0-4955-418d-9bc0-cdfcdaf32b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.249646 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b82022d0-4955-418d-9bc0-cdfcdaf32b1f-var-lib-openvswitch\") pod \"ovnkube-node-m6hrr\" (UID: \"b82022d0-4955-418d-9bc0-cdfcdaf32b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.249676 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b82022d0-4955-418d-9bc0-cdfcdaf32b1f-host-cni-bin\") pod \"ovnkube-node-m6hrr\" (UID: \"b82022d0-4955-418d-9bc0-cdfcdaf32b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.249706 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b82022d0-4955-418d-9bc0-cdfcdaf32b1f-node-log\") pod \"ovnkube-node-m6hrr\" (UID: \"b82022d0-4955-418d-9bc0-cdfcdaf32b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.249689 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1504d62a-81aa-4a1d-8fda-ef01376adcaa-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "1504d62a-81aa-4a1d-8fda-ef01376adcaa" (UID: "1504d62a-81aa-4a1d-8fda-ef01376adcaa"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.249779 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b82022d0-4955-418d-9bc0-cdfcdaf32b1f-host-slash\") pod \"ovnkube-node-m6hrr\" (UID: \"b82022d0-4955-418d-9bc0-cdfcdaf32b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.249851 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1504d62a-81aa-4a1d-8fda-ef01376adcaa-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "1504d62a-81aa-4a1d-8fda-ef01376adcaa" (UID: "1504d62a-81aa-4a1d-8fda-ef01376adcaa"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.249922 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b82022d0-4955-418d-9bc0-cdfcdaf32b1f-log-socket\") pod \"ovnkube-node-m6hrr\" (UID: \"b82022d0-4955-418d-9bc0-cdfcdaf32b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.249952 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b82022d0-4955-418d-9bc0-cdfcdaf32b1f-env-overrides\") pod \"ovnkube-node-m6hrr\" (UID: \"b82022d0-4955-418d-9bc0-cdfcdaf32b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.250033 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b82022d0-4955-418d-9bc0-cdfcdaf32b1f-ovnkube-script-lib\") pod \"ovnkube-node-m6hrr\" (UID: \"b82022d0-4955-418d-9bc0-cdfcdaf32b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.250080 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b82022d0-4955-418d-9bc0-cdfcdaf32b1f-ovnkube-config\") pod \"ovnkube-node-m6hrr\" (UID: \"b82022d0-4955-418d-9bc0-cdfcdaf32b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.250120 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b82022d0-4955-418d-9bc0-cdfcdaf32b1f-systemd-units\") pod \"ovnkube-node-m6hrr\" (UID: \"b82022d0-4955-418d-9bc0-cdfcdaf32b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.250143 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b82022d0-4955-418d-9bc0-cdfcdaf32b1f-host-cni-netd\") pod \"ovnkube-node-m6hrr\" (UID: \"b82022d0-4955-418d-9bc0-cdfcdaf32b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.250179 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b82022d0-4955-418d-9bc0-cdfcdaf32b1f-host-kubelet\") pod \"ovnkube-node-m6hrr\" (UID: \"b82022d0-4955-418d-9bc0-cdfcdaf32b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.250257 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b82022d0-4955-418d-9bc0-cdfcdaf32b1f-run-systemd\") pod \"ovnkube-node-m6hrr\" (UID: \"b82022d0-4955-418d-9bc0-cdfcdaf32b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.250281 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/b82022d0-4955-418d-9bc0-cdfcdaf32b1f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-m6hrr\" (UID: \"b82022d0-4955-418d-9bc0-cdfcdaf32b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.250335 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b82022d0-4955-418d-9bc0-cdfcdaf32b1f-etc-openvswitch\") pod \"ovnkube-node-m6hrr\" (UID: \"b82022d0-4955-418d-9bc0-cdfcdaf32b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.250368 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b82022d0-4955-418d-9bc0-cdfcdaf32b1f-ovn-node-metrics-cert\") pod \"ovnkube-node-m6hrr\" (UID: \"b82022d0-4955-418d-9bc0-cdfcdaf32b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.250404 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b82022d0-4955-418d-9bc0-cdfcdaf32b1f-run-ovn\") pod \"ovnkube-node-m6hrr\" (UID: \"b82022d0-4955-418d-9bc0-cdfcdaf32b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.250439 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b82022d0-4955-418d-9bc0-cdfcdaf32b1f-host-run-ovn-kubernetes\") pod \"ovnkube-node-m6hrr\" (UID: \"b82022d0-4955-418d-9bc0-cdfcdaf32b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.250476 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b82022d0-4955-418d-9bc0-cdfcdaf32b1f-run-openvswitch\") pod \"ovnkube-node-m6hrr\" (UID: \"b82022d0-4955-418d-9bc0-cdfcdaf32b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.250517 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b82022d0-4955-418d-9bc0-cdfcdaf32b1f-host-run-netns\") pod \"ovnkube-node-m6hrr\" (UID: \"b82022d0-4955-418d-9bc0-cdfcdaf32b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.250594 4743 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-run-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.250618 4743 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1504d62a-81aa-4a1d-8fda-ef01376adcaa-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.250632 4743 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.250645 4743 
reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-host-cni-bin\") on node \"crc\" DevicePath \"\"" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.250715 4743 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-host-slash\") on node \"crc\" DevicePath \"\"" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.250738 4743 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-log-socket\") on node \"crc\" DevicePath \"\"" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.250752 4743 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-node-log\") on node \"crc\" DevicePath \"\"" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.250764 4743 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1504d62a-81aa-4a1d-8fda-ef01376adcaa-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.250887 4743 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-host-run-netns\") on node \"crc\" DevicePath \"\"" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.250934 4743 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.250957 4743 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-systemd-units\") on node \"crc\" DevicePath \"\"" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.250976 4743 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.250996 4743 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-host-kubelet\") on node \"crc\" DevicePath \"\"" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.251016 4743 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.251035 4743 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1504d62a-81aa-4a1d-8fda-ef01376adcaa-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.251054 4743 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-host-cni-netd\") on node \"crc\" DevicePath \"\"" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.251073 4743 reconciler_common.go:293] 
"Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.256055 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1504d62a-81aa-4a1d-8fda-ef01376adcaa-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "1504d62a-81aa-4a1d-8fda-ef01376adcaa" (UID: "1504d62a-81aa-4a1d-8fda-ef01376adcaa"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.256231 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1504d62a-81aa-4a1d-8fda-ef01376adcaa-kube-api-access-44vjn" (OuterVolumeSpecName: "kube-api-access-44vjn") pod "1504d62a-81aa-4a1d-8fda-ef01376adcaa" (UID: "1504d62a-81aa-4a1d-8fda-ef01376adcaa"). InnerVolumeSpecName "kube-api-access-44vjn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.267315 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "1504d62a-81aa-4a1d-8fda-ef01376adcaa" (UID: "1504d62a-81aa-4a1d-8fda-ef01376adcaa"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.351721 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b82022d0-4955-418d-9bc0-cdfcdaf32b1f-host-run-netns\") pod \"ovnkube-node-m6hrr\" (UID: \"b82022d0-4955-418d-9bc0-cdfcdaf32b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.351824 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5xbd\" (UniqueName: \"kubernetes.io/projected/b82022d0-4955-418d-9bc0-cdfcdaf32b1f-kube-api-access-q5xbd\") pod \"ovnkube-node-m6hrr\" (UID: \"b82022d0-4955-418d-9bc0-cdfcdaf32b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.351872 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b82022d0-4955-418d-9bc0-cdfcdaf32b1f-var-lib-openvswitch\") pod \"ovnkube-node-m6hrr\" (UID: \"b82022d0-4955-418d-9bc0-cdfcdaf32b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.351895 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b82022d0-4955-418d-9bc0-cdfcdaf32b1f-host-run-netns\") pod \"ovnkube-node-m6hrr\" (UID: \"b82022d0-4955-418d-9bc0-cdfcdaf32b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.351987 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b82022d0-4955-418d-9bc0-cdfcdaf32b1f-host-cni-bin\") pod \"ovnkube-node-m6hrr\" (UID: \"b82022d0-4955-418d-9bc0-cdfcdaf32b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.351989 4743 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b82022d0-4955-418d-9bc0-cdfcdaf32b1f-var-lib-openvswitch\") pod \"ovnkube-node-m6hrr\" (UID: \"b82022d0-4955-418d-9bc0-cdfcdaf32b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.351919 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b82022d0-4955-418d-9bc0-cdfcdaf32b1f-host-cni-bin\") pod \"ovnkube-node-m6hrr\" (UID: \"b82022d0-4955-418d-9bc0-cdfcdaf32b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.352080 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b82022d0-4955-418d-9bc0-cdfcdaf32b1f-node-log\") pod \"ovnkube-node-m6hrr\" (UID: \"b82022d0-4955-418d-9bc0-cdfcdaf32b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.352110 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b82022d0-4955-418d-9bc0-cdfcdaf32b1f-host-slash\") pod \"ovnkube-node-m6hrr\" (UID: \"b82022d0-4955-418d-9bc0-cdfcdaf32b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.352155 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b82022d0-4955-418d-9bc0-cdfcdaf32b1f-env-overrides\") pod \"ovnkube-node-m6hrr\" (UID: \"b82022d0-4955-418d-9bc0-cdfcdaf32b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.352178 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b82022d0-4955-418d-9bc0-cdfcdaf32b1f-log-socket\") pod \"ovnkube-node-m6hrr\" (UID: \"b82022d0-4955-418d-9bc0-cdfcdaf32b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.352220 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b82022d0-4955-418d-9bc0-cdfcdaf32b1f-ovnkube-script-lib\") pod \"ovnkube-node-m6hrr\" (UID: \"b82022d0-4955-418d-9bc0-cdfcdaf32b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.352247 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b82022d0-4955-418d-9bc0-cdfcdaf32b1f-ovnkube-config\") pod \"ovnkube-node-m6hrr\" (UID: \"b82022d0-4955-418d-9bc0-cdfcdaf32b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.352288 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b82022d0-4955-418d-9bc0-cdfcdaf32b1f-systemd-units\") pod \"ovnkube-node-m6hrr\" (UID: \"b82022d0-4955-418d-9bc0-cdfcdaf32b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.352316 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/b82022d0-4955-418d-9bc0-cdfcdaf32b1f-host-cni-netd\") pod \"ovnkube-node-m6hrr\" (UID: \"b82022d0-4955-418d-9bc0-cdfcdaf32b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.352324 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b82022d0-4955-418d-9bc0-cdfcdaf32b1f-host-slash\") pod \"ovnkube-node-m6hrr\" (UID: \"b82022d0-4955-418d-9bc0-cdfcdaf32b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.352376 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b82022d0-4955-418d-9bc0-cdfcdaf32b1f-host-kubelet\") pod \"ovnkube-node-m6hrr\" (UID: \"b82022d0-4955-418d-9bc0-cdfcdaf32b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.352348 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b82022d0-4955-418d-9bc0-cdfcdaf32b1f-host-kubelet\") pod \"ovnkube-node-m6hrr\" (UID: \"b82022d0-4955-418d-9bc0-cdfcdaf32b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.352431 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b82022d0-4955-418d-9bc0-cdfcdaf32b1f-systemd-units\") pod \"ovnkube-node-m6hrr\" (UID: \"b82022d0-4955-418d-9bc0-cdfcdaf32b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.352217 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b82022d0-4955-418d-9bc0-cdfcdaf32b1f-node-log\") pod \"ovnkube-node-m6hrr\" (UID: \"b82022d0-4955-418d-9bc0-cdfcdaf32b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.352420 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b82022d0-4955-418d-9bc0-cdfcdaf32b1f-log-socket\") pod \"ovnkube-node-m6hrr\" (UID: \"b82022d0-4955-418d-9bc0-cdfcdaf32b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.352422 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b82022d0-4955-418d-9bc0-cdfcdaf32b1f-host-cni-netd\") pod \"ovnkube-node-m6hrr\" (UID: \"b82022d0-4955-418d-9bc0-cdfcdaf32b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.352616 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b82022d0-4955-418d-9bc0-cdfcdaf32b1f-run-systemd\") pod \"ovnkube-node-m6hrr\" (UID: \"b82022d0-4955-418d-9bc0-cdfcdaf32b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.352672 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b82022d0-4955-418d-9bc0-cdfcdaf32b1f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-m6hrr\" (UID: 
\"b82022d0-4955-418d-9bc0-cdfcdaf32b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.352738 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b82022d0-4955-418d-9bc0-cdfcdaf32b1f-etc-openvswitch\") pod \"ovnkube-node-m6hrr\" (UID: \"b82022d0-4955-418d-9bc0-cdfcdaf32b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.352849 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b82022d0-4955-418d-9bc0-cdfcdaf32b1f-ovn-node-metrics-cert\") pod \"ovnkube-node-m6hrr\" (UID: \"b82022d0-4955-418d-9bc0-cdfcdaf32b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.352898 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b82022d0-4955-418d-9bc0-cdfcdaf32b1f-run-ovn\") pod \"ovnkube-node-m6hrr\" (UID: \"b82022d0-4955-418d-9bc0-cdfcdaf32b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.352927 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b82022d0-4955-418d-9bc0-cdfcdaf32b1f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-m6hrr\" (UID: \"b82022d0-4955-418d-9bc0-cdfcdaf32b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.352849 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b82022d0-4955-418d-9bc0-cdfcdaf32b1f-run-systemd\") pod \"ovnkube-node-m6hrr\" (UID: \"b82022d0-4955-418d-9bc0-cdfcdaf32b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.352955 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b82022d0-4955-418d-9bc0-cdfcdaf32b1f-host-run-ovn-kubernetes\") pod \"ovnkube-node-m6hrr\" (UID: \"b82022d0-4955-418d-9bc0-cdfcdaf32b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.352963 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b82022d0-4955-418d-9bc0-cdfcdaf32b1f-etc-openvswitch\") pod \"ovnkube-node-m6hrr\" (UID: \"b82022d0-4955-418d-9bc0-cdfcdaf32b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.352993 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b82022d0-4955-418d-9bc0-cdfcdaf32b1f-host-run-ovn-kubernetes\") pod \"ovnkube-node-m6hrr\" (UID: \"b82022d0-4955-418d-9bc0-cdfcdaf32b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.353079 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b82022d0-4955-418d-9bc0-cdfcdaf32b1f-env-overrides\") pod \"ovnkube-node-m6hrr\" (UID: \"b82022d0-4955-418d-9bc0-cdfcdaf32b1f\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.353094 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b82022d0-4955-418d-9bc0-cdfcdaf32b1f-run-openvswitch\") pod \"ovnkube-node-m6hrr\" (UID: \"b82022d0-4955-418d-9bc0-cdfcdaf32b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.353077 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b82022d0-4955-418d-9bc0-cdfcdaf32b1f-run-ovn\") pod \"ovnkube-node-m6hrr\" (UID: \"b82022d0-4955-418d-9bc0-cdfcdaf32b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.353144 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b82022d0-4955-418d-9bc0-cdfcdaf32b1f-run-openvswitch\") pod \"ovnkube-node-m6hrr\" (UID: \"b82022d0-4955-418d-9bc0-cdfcdaf32b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.353233 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44vjn\" (UniqueName: \"kubernetes.io/projected/1504d62a-81aa-4a1d-8fda-ef01376adcaa-kube-api-access-44vjn\") on node \"crc\" DevicePath \"\"" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.353270 4743 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1504d62a-81aa-4a1d-8fda-ef01376adcaa-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.353289 4743 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1504d62a-81aa-4a1d-8fda-ef01376adcaa-run-systemd\") on node \"crc\" DevicePath \"\"" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.353323 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b82022d0-4955-418d-9bc0-cdfcdaf32b1f-ovnkube-script-lib\") pod \"ovnkube-node-m6hrr\" (UID: \"b82022d0-4955-418d-9bc0-cdfcdaf32b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.353452 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b82022d0-4955-418d-9bc0-cdfcdaf32b1f-ovnkube-config\") pod \"ovnkube-node-m6hrr\" (UID: \"b82022d0-4955-418d-9bc0-cdfcdaf32b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.356722 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b82022d0-4955-418d-9bc0-cdfcdaf32b1f-ovn-node-metrics-cert\") pod \"ovnkube-node-m6hrr\" (UID: \"b82022d0-4955-418d-9bc0-cdfcdaf32b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.379745 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5xbd\" (UniqueName: \"kubernetes.io/projected/b82022d0-4955-418d-9bc0-cdfcdaf32b1f-kube-api-access-q5xbd\") pod \"ovnkube-node-m6hrr\" (UID: \"b82022d0-4955-418d-9bc0-cdfcdaf32b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" Jan 22 13:56:40 
crc kubenswrapper[4743]: I0122 13:56:40.440993 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.489060 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" event={"ID":"b82022d0-4955-418d-9bc0-cdfcdaf32b1f","Type":"ContainerStarted","Data":"5e1b9445e920dfd463e97ae82dad78fd14e20156c393837eee7479361882a2d5"} Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.491285 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4zbb8_1a63ac85-9a00-4381-aa80-3da86d5483aa/kube-multus/0.log" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.491338 4743 generic.go:334] "Generic (PLEG): container finished" podID="1a63ac85-9a00-4381-aa80-3da86d5483aa" containerID="7dd46882286eccccfdb5bf23792e79b22eeb5cdfb9ff66abb5c4990b365a1822" exitCode=2 Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.491378 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4zbb8" event={"ID":"1a63ac85-9a00-4381-aa80-3da86d5483aa","Type":"ContainerDied","Data":"7dd46882286eccccfdb5bf23792e79b22eeb5cdfb9ff66abb5c4990b365a1822"} Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.492201 4743 scope.go:117] "RemoveContainer" containerID="7dd46882286eccccfdb5bf23792e79b22eeb5cdfb9ff66abb5c4990b365a1822" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.498908 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gcj8q_1504d62a-81aa-4a1d-8fda-ef01376adcaa/ovn-acl-logging/0.log" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.499869 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gcj8q_1504d62a-81aa-4a1d-8fda-ef01376adcaa/ovn-controller/0.log" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.500357 4743 generic.go:334] "Generic (PLEG): container finished" podID="1504d62a-81aa-4a1d-8fda-ef01376adcaa" containerID="6703ca8c5812f8cf4e820c640dc3da6b96d5035162f1bb4c0563efcd122edd2a" exitCode=0 Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.500556 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.500460 4743 generic.go:334] "Generic (PLEG): container finished" podID="1504d62a-81aa-4a1d-8fda-ef01376adcaa" containerID="5f88ce0bb57df24e738ba0d53bf159b512454030e7215342e0e8179b20e25a53" exitCode=0 Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.500623 4743 generic.go:334] "Generic (PLEG): container finished" podID="1504d62a-81aa-4a1d-8fda-ef01376adcaa" containerID="494c239f2078673913ba0c50c50da35d43c19b0b5ac2dbc1009d58e12517a489" exitCode=0 Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.500644 4743 generic.go:334] "Generic (PLEG): container finished" podID="1504d62a-81aa-4a1d-8fda-ef01376adcaa" containerID="c06826420179055b6073b3e2c057cc03f5bb5fe835cf28bdaceff76b2e27dd91" exitCode=0 Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.500658 4743 generic.go:334] "Generic (PLEG): container finished" podID="1504d62a-81aa-4a1d-8fda-ef01376adcaa" containerID="33045af36991196fc7ef8653955e8c7bf85c1952f56de2c82539d65f0adba74f" exitCode=0 Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.500670 4743 generic.go:334] "Generic (PLEG): container finished" podID="1504d62a-81aa-4a1d-8fda-ef01376adcaa" containerID="746556e3d4d56ede14ab5d860938e7a65a200f86fc80ff35b0fff11172c8d328" exitCode=0 Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.500682 4743 generic.go:334] "Generic (PLEG): container finished" podID="1504d62a-81aa-4a1d-8fda-ef01376adcaa" containerID="0e41c9ee9b261451ae9312a5ad5a028818acd69b9c85b7eece4f3c06d2e58551" exitCode=143 Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.500695 4743 generic.go:334] "Generic (PLEG): container finished" podID="1504d62a-81aa-4a1d-8fda-ef01376adcaa" containerID="e031fff834aec6b8b2c7ed47784111b062e0b9601d637eba12860e79e4c3f6fa" exitCode=143 Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.500412 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" event={"ID":"1504d62a-81aa-4a1d-8fda-ef01376adcaa","Type":"ContainerDied","Data":"6703ca8c5812f8cf4e820c640dc3da6b96d5035162f1bb4c0563efcd122edd2a"} Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.500743 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" event={"ID":"1504d62a-81aa-4a1d-8fda-ef01376adcaa","Type":"ContainerDied","Data":"5f88ce0bb57df24e738ba0d53bf159b512454030e7215342e0e8179b20e25a53"} Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.500765 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" event={"ID":"1504d62a-81aa-4a1d-8fda-ef01376adcaa","Type":"ContainerDied","Data":"494c239f2078673913ba0c50c50da35d43c19b0b5ac2dbc1009d58e12517a489"} Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.500808 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" event={"ID":"1504d62a-81aa-4a1d-8fda-ef01376adcaa","Type":"ContainerDied","Data":"c06826420179055b6073b3e2c057cc03f5bb5fe835cf28bdaceff76b2e27dd91"} Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.500831 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" event={"ID":"1504d62a-81aa-4a1d-8fda-ef01376adcaa","Type":"ContainerDied","Data":"33045af36991196fc7ef8653955e8c7bf85c1952f56de2c82539d65f0adba74f"} Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.500876 4743 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" event={"ID":"1504d62a-81aa-4a1d-8fda-ef01376adcaa","Type":"ContainerDied","Data":"746556e3d4d56ede14ab5d860938e7a65a200f86fc80ff35b0fff11172c8d328"} Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.500897 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0e41c9ee9b261451ae9312a5ad5a028818acd69b9c85b7eece4f3c06d2e58551"} Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.500912 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e031fff834aec6b8b2c7ed47784111b062e0b9601d637eba12860e79e4c3f6fa"} Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.500923 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a53ea049b61685f1ee1b8026b5f81690e8b464b57330474a3c2fc12b808c2f99"} Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.500937 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" event={"ID":"1504d62a-81aa-4a1d-8fda-ef01376adcaa","Type":"ContainerDied","Data":"0e41c9ee9b261451ae9312a5ad5a028818acd69b9c85b7eece4f3c06d2e58551"} Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.500951 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6703ca8c5812f8cf4e820c640dc3da6b96d5035162f1bb4c0563efcd122edd2a"} Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.500964 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5f88ce0bb57df24e738ba0d53bf159b512454030e7215342e0e8179b20e25a53"} Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.500974 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"494c239f2078673913ba0c50c50da35d43c19b0b5ac2dbc1009d58e12517a489"} Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.500983 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c06826420179055b6073b3e2c057cc03f5bb5fe835cf28bdaceff76b2e27dd91"} Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.500991 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"33045af36991196fc7ef8653955e8c7bf85c1952f56de2c82539d65f0adba74f"} Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.501000 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"746556e3d4d56ede14ab5d860938e7a65a200f86fc80ff35b0fff11172c8d328"} Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.501008 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0e41c9ee9b261451ae9312a5ad5a028818acd69b9c85b7eece4f3c06d2e58551"} Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.501017 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e031fff834aec6b8b2c7ed47784111b062e0b9601d637eba12860e79e4c3f6fa"} Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.501026 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a53ea049b61685f1ee1b8026b5f81690e8b464b57330474a3c2fc12b808c2f99"} Jan 22 13:56:40 crc 
kubenswrapper[4743]: I0122 13:56:40.501039 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" event={"ID":"1504d62a-81aa-4a1d-8fda-ef01376adcaa","Type":"ContainerDied","Data":"e031fff834aec6b8b2c7ed47784111b062e0b9601d637eba12860e79e4c3f6fa"} Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.501053 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6703ca8c5812f8cf4e820c640dc3da6b96d5035162f1bb4c0563efcd122edd2a"} Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.501063 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5f88ce0bb57df24e738ba0d53bf159b512454030e7215342e0e8179b20e25a53"} Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.501072 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"494c239f2078673913ba0c50c50da35d43c19b0b5ac2dbc1009d58e12517a489"} Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.501081 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c06826420179055b6073b3e2c057cc03f5bb5fe835cf28bdaceff76b2e27dd91"} Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.501090 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"33045af36991196fc7ef8653955e8c7bf85c1952f56de2c82539d65f0adba74f"} Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.501099 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"746556e3d4d56ede14ab5d860938e7a65a200f86fc80ff35b0fff11172c8d328"} Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.501108 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0e41c9ee9b261451ae9312a5ad5a028818acd69b9c85b7eece4f3c06d2e58551"} Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.501116 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e031fff834aec6b8b2c7ed47784111b062e0b9601d637eba12860e79e4c3f6fa"} Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.501126 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a53ea049b61685f1ee1b8026b5f81690e8b464b57330474a3c2fc12b808c2f99"} Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.501140 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gcj8q" event={"ID":"1504d62a-81aa-4a1d-8fda-ef01376adcaa","Type":"ContainerDied","Data":"ad0595f9ab7c6ea02a7417522599a4911280efff04f57fbd2a351a5626af8010"} Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.501154 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6703ca8c5812f8cf4e820c640dc3da6b96d5035162f1bb4c0563efcd122edd2a"} Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.501165 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5f88ce0bb57df24e738ba0d53bf159b512454030e7215342e0e8179b20e25a53"} Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.501176 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"494c239f2078673913ba0c50c50da35d43c19b0b5ac2dbc1009d58e12517a489"} Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.501185 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c06826420179055b6073b3e2c057cc03f5bb5fe835cf28bdaceff76b2e27dd91"} Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.501197 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"33045af36991196fc7ef8653955e8c7bf85c1952f56de2c82539d65f0adba74f"} Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.501206 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"746556e3d4d56ede14ab5d860938e7a65a200f86fc80ff35b0fff11172c8d328"} Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.501217 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0e41c9ee9b261451ae9312a5ad5a028818acd69b9c85b7eece4f3c06d2e58551"} Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.501226 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e031fff834aec6b8b2c7ed47784111b062e0b9601d637eba12860e79e4c3f6fa"} Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.501236 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a53ea049b61685f1ee1b8026b5f81690e8b464b57330474a3c2fc12b808c2f99"} Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.500812 4743 scope.go:117] "RemoveContainer" containerID="6703ca8c5812f8cf4e820c640dc3da6b96d5035162f1bb4c0563efcd122edd2a" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.531240 4743 scope.go:117] "RemoveContainer" containerID="5f88ce0bb57df24e738ba0d53bf159b512454030e7215342e0e8179b20e25a53" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.545900 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-gcj8q"] Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.549382 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-gcj8q"] Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.564421 4743 scope.go:117] "RemoveContainer" containerID="494c239f2078673913ba0c50c50da35d43c19b0b5ac2dbc1009d58e12517a489" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.576496 4743 scope.go:117] "RemoveContainer" containerID="c06826420179055b6073b3e2c057cc03f5bb5fe835cf28bdaceff76b2e27dd91" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.588781 4743 scope.go:117] "RemoveContainer" containerID="33045af36991196fc7ef8653955e8c7bf85c1952f56de2c82539d65f0adba74f" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.602801 4743 scope.go:117] "RemoveContainer" containerID="746556e3d4d56ede14ab5d860938e7a65a200f86fc80ff35b0fff11172c8d328" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.618084 4743 scope.go:117] "RemoveContainer" containerID="0e41c9ee9b261451ae9312a5ad5a028818acd69b9c85b7eece4f3c06d2e58551" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.630874 4743 scope.go:117] "RemoveContainer" containerID="e031fff834aec6b8b2c7ed47784111b062e0b9601d637eba12860e79e4c3f6fa" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.644116 4743 scope.go:117] "RemoveContainer" containerID="a53ea049b61685f1ee1b8026b5f81690e8b464b57330474a3c2fc12b808c2f99" Jan 22 
13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.657810 4743 scope.go:117] "RemoveContainer" containerID="6703ca8c5812f8cf4e820c640dc3da6b96d5035162f1bb4c0563efcd122edd2a" Jan 22 13:56:40 crc kubenswrapper[4743]: E0122 13:56:40.658191 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6703ca8c5812f8cf4e820c640dc3da6b96d5035162f1bb4c0563efcd122edd2a\": container with ID starting with 6703ca8c5812f8cf4e820c640dc3da6b96d5035162f1bb4c0563efcd122edd2a not found: ID does not exist" containerID="6703ca8c5812f8cf4e820c640dc3da6b96d5035162f1bb4c0563efcd122edd2a" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.658231 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6703ca8c5812f8cf4e820c640dc3da6b96d5035162f1bb4c0563efcd122edd2a"} err="failed to get container status \"6703ca8c5812f8cf4e820c640dc3da6b96d5035162f1bb4c0563efcd122edd2a\": rpc error: code = NotFound desc = could not find container \"6703ca8c5812f8cf4e820c640dc3da6b96d5035162f1bb4c0563efcd122edd2a\": container with ID starting with 6703ca8c5812f8cf4e820c640dc3da6b96d5035162f1bb4c0563efcd122edd2a not found: ID does not exist" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.658259 4743 scope.go:117] "RemoveContainer" containerID="5f88ce0bb57df24e738ba0d53bf159b512454030e7215342e0e8179b20e25a53" Jan 22 13:56:40 crc kubenswrapper[4743]: E0122 13:56:40.658610 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f88ce0bb57df24e738ba0d53bf159b512454030e7215342e0e8179b20e25a53\": container with ID starting with 5f88ce0bb57df24e738ba0d53bf159b512454030e7215342e0e8179b20e25a53 not found: ID does not exist" containerID="5f88ce0bb57df24e738ba0d53bf159b512454030e7215342e0e8179b20e25a53" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.658631 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f88ce0bb57df24e738ba0d53bf159b512454030e7215342e0e8179b20e25a53"} err="failed to get container status \"5f88ce0bb57df24e738ba0d53bf159b512454030e7215342e0e8179b20e25a53\": rpc error: code = NotFound desc = could not find container \"5f88ce0bb57df24e738ba0d53bf159b512454030e7215342e0e8179b20e25a53\": container with ID starting with 5f88ce0bb57df24e738ba0d53bf159b512454030e7215342e0e8179b20e25a53 not found: ID does not exist" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.658646 4743 scope.go:117] "RemoveContainer" containerID="494c239f2078673913ba0c50c50da35d43c19b0b5ac2dbc1009d58e12517a489" Jan 22 13:56:40 crc kubenswrapper[4743]: E0122 13:56:40.658917 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"494c239f2078673913ba0c50c50da35d43c19b0b5ac2dbc1009d58e12517a489\": container with ID starting with 494c239f2078673913ba0c50c50da35d43c19b0b5ac2dbc1009d58e12517a489 not found: ID does not exist" containerID="494c239f2078673913ba0c50c50da35d43c19b0b5ac2dbc1009d58e12517a489" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.658953 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"494c239f2078673913ba0c50c50da35d43c19b0b5ac2dbc1009d58e12517a489"} err="failed to get container status \"494c239f2078673913ba0c50c50da35d43c19b0b5ac2dbc1009d58e12517a489\": rpc error: code = NotFound desc = could not find container 
\"494c239f2078673913ba0c50c50da35d43c19b0b5ac2dbc1009d58e12517a489\": container with ID starting with 494c239f2078673913ba0c50c50da35d43c19b0b5ac2dbc1009d58e12517a489 not found: ID does not exist" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.658979 4743 scope.go:117] "RemoveContainer" containerID="c06826420179055b6073b3e2c057cc03f5bb5fe835cf28bdaceff76b2e27dd91" Jan 22 13:56:40 crc kubenswrapper[4743]: E0122 13:56:40.659280 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c06826420179055b6073b3e2c057cc03f5bb5fe835cf28bdaceff76b2e27dd91\": container with ID starting with c06826420179055b6073b3e2c057cc03f5bb5fe835cf28bdaceff76b2e27dd91 not found: ID does not exist" containerID="c06826420179055b6073b3e2c057cc03f5bb5fe835cf28bdaceff76b2e27dd91" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.659307 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c06826420179055b6073b3e2c057cc03f5bb5fe835cf28bdaceff76b2e27dd91"} err="failed to get container status \"c06826420179055b6073b3e2c057cc03f5bb5fe835cf28bdaceff76b2e27dd91\": rpc error: code = NotFound desc = could not find container \"c06826420179055b6073b3e2c057cc03f5bb5fe835cf28bdaceff76b2e27dd91\": container with ID starting with c06826420179055b6073b3e2c057cc03f5bb5fe835cf28bdaceff76b2e27dd91 not found: ID does not exist" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.659321 4743 scope.go:117] "RemoveContainer" containerID="33045af36991196fc7ef8653955e8c7bf85c1952f56de2c82539d65f0adba74f" Jan 22 13:56:40 crc kubenswrapper[4743]: E0122 13:56:40.659656 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33045af36991196fc7ef8653955e8c7bf85c1952f56de2c82539d65f0adba74f\": container with ID starting with 33045af36991196fc7ef8653955e8c7bf85c1952f56de2c82539d65f0adba74f not found: ID does not exist" containerID="33045af36991196fc7ef8653955e8c7bf85c1952f56de2c82539d65f0adba74f" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.659688 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33045af36991196fc7ef8653955e8c7bf85c1952f56de2c82539d65f0adba74f"} err="failed to get container status \"33045af36991196fc7ef8653955e8c7bf85c1952f56de2c82539d65f0adba74f\": rpc error: code = NotFound desc = could not find container \"33045af36991196fc7ef8653955e8c7bf85c1952f56de2c82539d65f0adba74f\": container with ID starting with 33045af36991196fc7ef8653955e8c7bf85c1952f56de2c82539d65f0adba74f not found: ID does not exist" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.659702 4743 scope.go:117] "RemoveContainer" containerID="746556e3d4d56ede14ab5d860938e7a65a200f86fc80ff35b0fff11172c8d328" Jan 22 13:56:40 crc kubenswrapper[4743]: E0122 13:56:40.659927 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"746556e3d4d56ede14ab5d860938e7a65a200f86fc80ff35b0fff11172c8d328\": container with ID starting with 746556e3d4d56ede14ab5d860938e7a65a200f86fc80ff35b0fff11172c8d328 not found: ID does not exist" containerID="746556e3d4d56ede14ab5d860938e7a65a200f86fc80ff35b0fff11172c8d328" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.659959 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"746556e3d4d56ede14ab5d860938e7a65a200f86fc80ff35b0fff11172c8d328"} 
err="failed to get container status \"746556e3d4d56ede14ab5d860938e7a65a200f86fc80ff35b0fff11172c8d328\": rpc error: code = NotFound desc = could not find container \"746556e3d4d56ede14ab5d860938e7a65a200f86fc80ff35b0fff11172c8d328\": container with ID starting with 746556e3d4d56ede14ab5d860938e7a65a200f86fc80ff35b0fff11172c8d328 not found: ID does not exist" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.659984 4743 scope.go:117] "RemoveContainer" containerID="0e41c9ee9b261451ae9312a5ad5a028818acd69b9c85b7eece4f3c06d2e58551" Jan 22 13:56:40 crc kubenswrapper[4743]: E0122 13:56:40.660284 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e41c9ee9b261451ae9312a5ad5a028818acd69b9c85b7eece4f3c06d2e58551\": container with ID starting with 0e41c9ee9b261451ae9312a5ad5a028818acd69b9c85b7eece4f3c06d2e58551 not found: ID does not exist" containerID="0e41c9ee9b261451ae9312a5ad5a028818acd69b9c85b7eece4f3c06d2e58551" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.660311 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e41c9ee9b261451ae9312a5ad5a028818acd69b9c85b7eece4f3c06d2e58551"} err="failed to get container status \"0e41c9ee9b261451ae9312a5ad5a028818acd69b9c85b7eece4f3c06d2e58551\": rpc error: code = NotFound desc = could not find container \"0e41c9ee9b261451ae9312a5ad5a028818acd69b9c85b7eece4f3c06d2e58551\": container with ID starting with 0e41c9ee9b261451ae9312a5ad5a028818acd69b9c85b7eece4f3c06d2e58551 not found: ID does not exist" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.660331 4743 scope.go:117] "RemoveContainer" containerID="e031fff834aec6b8b2c7ed47784111b062e0b9601d637eba12860e79e4c3f6fa" Jan 22 13:56:40 crc kubenswrapper[4743]: E0122 13:56:40.660589 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e031fff834aec6b8b2c7ed47784111b062e0b9601d637eba12860e79e4c3f6fa\": container with ID starting with e031fff834aec6b8b2c7ed47784111b062e0b9601d637eba12860e79e4c3f6fa not found: ID does not exist" containerID="e031fff834aec6b8b2c7ed47784111b062e0b9601d637eba12860e79e4c3f6fa" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.660616 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e031fff834aec6b8b2c7ed47784111b062e0b9601d637eba12860e79e4c3f6fa"} err="failed to get container status \"e031fff834aec6b8b2c7ed47784111b062e0b9601d637eba12860e79e4c3f6fa\": rpc error: code = NotFound desc = could not find container \"e031fff834aec6b8b2c7ed47784111b062e0b9601d637eba12860e79e4c3f6fa\": container with ID starting with e031fff834aec6b8b2c7ed47784111b062e0b9601d637eba12860e79e4c3f6fa not found: ID does not exist" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.660632 4743 scope.go:117] "RemoveContainer" containerID="a53ea049b61685f1ee1b8026b5f81690e8b464b57330474a3c2fc12b808c2f99" Jan 22 13:56:40 crc kubenswrapper[4743]: E0122 13:56:40.660929 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a53ea049b61685f1ee1b8026b5f81690e8b464b57330474a3c2fc12b808c2f99\": container with ID starting with a53ea049b61685f1ee1b8026b5f81690e8b464b57330474a3c2fc12b808c2f99 not found: ID does not exist" containerID="a53ea049b61685f1ee1b8026b5f81690e8b464b57330474a3c2fc12b808c2f99" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.660956 4743 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a53ea049b61685f1ee1b8026b5f81690e8b464b57330474a3c2fc12b808c2f99"} err="failed to get container status \"a53ea049b61685f1ee1b8026b5f81690e8b464b57330474a3c2fc12b808c2f99\": rpc error: code = NotFound desc = could not find container \"a53ea049b61685f1ee1b8026b5f81690e8b464b57330474a3c2fc12b808c2f99\": container with ID starting with a53ea049b61685f1ee1b8026b5f81690e8b464b57330474a3c2fc12b808c2f99 not found: ID does not exist" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.660970 4743 scope.go:117] "RemoveContainer" containerID="6703ca8c5812f8cf4e820c640dc3da6b96d5035162f1bb4c0563efcd122edd2a" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.661151 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6703ca8c5812f8cf4e820c640dc3da6b96d5035162f1bb4c0563efcd122edd2a"} err="failed to get container status \"6703ca8c5812f8cf4e820c640dc3da6b96d5035162f1bb4c0563efcd122edd2a\": rpc error: code = NotFound desc = could not find container \"6703ca8c5812f8cf4e820c640dc3da6b96d5035162f1bb4c0563efcd122edd2a\": container with ID starting with 6703ca8c5812f8cf4e820c640dc3da6b96d5035162f1bb4c0563efcd122edd2a not found: ID does not exist" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.661178 4743 scope.go:117] "RemoveContainer" containerID="5f88ce0bb57df24e738ba0d53bf159b512454030e7215342e0e8179b20e25a53" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.661403 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f88ce0bb57df24e738ba0d53bf159b512454030e7215342e0e8179b20e25a53"} err="failed to get container status \"5f88ce0bb57df24e738ba0d53bf159b512454030e7215342e0e8179b20e25a53\": rpc error: code = NotFound desc = could not find container \"5f88ce0bb57df24e738ba0d53bf159b512454030e7215342e0e8179b20e25a53\": container with ID starting with 5f88ce0bb57df24e738ba0d53bf159b512454030e7215342e0e8179b20e25a53 not found: ID does not exist" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.661434 4743 scope.go:117] "RemoveContainer" containerID="494c239f2078673913ba0c50c50da35d43c19b0b5ac2dbc1009d58e12517a489" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.661737 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"494c239f2078673913ba0c50c50da35d43c19b0b5ac2dbc1009d58e12517a489"} err="failed to get container status \"494c239f2078673913ba0c50c50da35d43c19b0b5ac2dbc1009d58e12517a489\": rpc error: code = NotFound desc = could not find container \"494c239f2078673913ba0c50c50da35d43c19b0b5ac2dbc1009d58e12517a489\": container with ID starting with 494c239f2078673913ba0c50c50da35d43c19b0b5ac2dbc1009d58e12517a489 not found: ID does not exist" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.661768 4743 scope.go:117] "RemoveContainer" containerID="c06826420179055b6073b3e2c057cc03f5bb5fe835cf28bdaceff76b2e27dd91" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.662084 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c06826420179055b6073b3e2c057cc03f5bb5fe835cf28bdaceff76b2e27dd91"} err="failed to get container status \"c06826420179055b6073b3e2c057cc03f5bb5fe835cf28bdaceff76b2e27dd91\": rpc error: code = NotFound desc = could not find container \"c06826420179055b6073b3e2c057cc03f5bb5fe835cf28bdaceff76b2e27dd91\": container with ID starting with 
c06826420179055b6073b3e2c057cc03f5bb5fe835cf28bdaceff76b2e27dd91 not found: ID does not exist" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.662108 4743 scope.go:117] "RemoveContainer" containerID="33045af36991196fc7ef8653955e8c7bf85c1952f56de2c82539d65f0adba74f" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.662335 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33045af36991196fc7ef8653955e8c7bf85c1952f56de2c82539d65f0adba74f"} err="failed to get container status \"33045af36991196fc7ef8653955e8c7bf85c1952f56de2c82539d65f0adba74f\": rpc error: code = NotFound desc = could not find container \"33045af36991196fc7ef8653955e8c7bf85c1952f56de2c82539d65f0adba74f\": container with ID starting with 33045af36991196fc7ef8653955e8c7bf85c1952f56de2c82539d65f0adba74f not found: ID does not exist" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.662363 4743 scope.go:117] "RemoveContainer" containerID="746556e3d4d56ede14ab5d860938e7a65a200f86fc80ff35b0fff11172c8d328" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.662559 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"746556e3d4d56ede14ab5d860938e7a65a200f86fc80ff35b0fff11172c8d328"} err="failed to get container status \"746556e3d4d56ede14ab5d860938e7a65a200f86fc80ff35b0fff11172c8d328\": rpc error: code = NotFound desc = could not find container \"746556e3d4d56ede14ab5d860938e7a65a200f86fc80ff35b0fff11172c8d328\": container with ID starting with 746556e3d4d56ede14ab5d860938e7a65a200f86fc80ff35b0fff11172c8d328 not found: ID does not exist" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.662581 4743 scope.go:117] "RemoveContainer" containerID="0e41c9ee9b261451ae9312a5ad5a028818acd69b9c85b7eece4f3c06d2e58551" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.662779 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e41c9ee9b261451ae9312a5ad5a028818acd69b9c85b7eece4f3c06d2e58551"} err="failed to get container status \"0e41c9ee9b261451ae9312a5ad5a028818acd69b9c85b7eece4f3c06d2e58551\": rpc error: code = NotFound desc = could not find container \"0e41c9ee9b261451ae9312a5ad5a028818acd69b9c85b7eece4f3c06d2e58551\": container with ID starting with 0e41c9ee9b261451ae9312a5ad5a028818acd69b9c85b7eece4f3c06d2e58551 not found: ID does not exist" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.662869 4743 scope.go:117] "RemoveContainer" containerID="e031fff834aec6b8b2c7ed47784111b062e0b9601d637eba12860e79e4c3f6fa" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.663127 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e031fff834aec6b8b2c7ed47784111b062e0b9601d637eba12860e79e4c3f6fa"} err="failed to get container status \"e031fff834aec6b8b2c7ed47784111b062e0b9601d637eba12860e79e4c3f6fa\": rpc error: code = NotFound desc = could not find container \"e031fff834aec6b8b2c7ed47784111b062e0b9601d637eba12860e79e4c3f6fa\": container with ID starting with e031fff834aec6b8b2c7ed47784111b062e0b9601d637eba12860e79e4c3f6fa not found: ID does not exist" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.663156 4743 scope.go:117] "RemoveContainer" containerID="a53ea049b61685f1ee1b8026b5f81690e8b464b57330474a3c2fc12b808c2f99" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.663513 4743 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a53ea049b61685f1ee1b8026b5f81690e8b464b57330474a3c2fc12b808c2f99"} err="failed to get container status \"a53ea049b61685f1ee1b8026b5f81690e8b464b57330474a3c2fc12b808c2f99\": rpc error: code = NotFound desc = could not find container \"a53ea049b61685f1ee1b8026b5f81690e8b464b57330474a3c2fc12b808c2f99\": container with ID starting with a53ea049b61685f1ee1b8026b5f81690e8b464b57330474a3c2fc12b808c2f99 not found: ID does not exist" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.663543 4743 scope.go:117] "RemoveContainer" containerID="6703ca8c5812f8cf4e820c640dc3da6b96d5035162f1bb4c0563efcd122edd2a" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.663753 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6703ca8c5812f8cf4e820c640dc3da6b96d5035162f1bb4c0563efcd122edd2a"} err="failed to get container status \"6703ca8c5812f8cf4e820c640dc3da6b96d5035162f1bb4c0563efcd122edd2a\": rpc error: code = NotFound desc = could not find container \"6703ca8c5812f8cf4e820c640dc3da6b96d5035162f1bb4c0563efcd122edd2a\": container with ID starting with 6703ca8c5812f8cf4e820c640dc3da6b96d5035162f1bb4c0563efcd122edd2a not found: ID does not exist" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.663782 4743 scope.go:117] "RemoveContainer" containerID="5f88ce0bb57df24e738ba0d53bf159b512454030e7215342e0e8179b20e25a53" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.664087 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f88ce0bb57df24e738ba0d53bf159b512454030e7215342e0e8179b20e25a53"} err="failed to get container status \"5f88ce0bb57df24e738ba0d53bf159b512454030e7215342e0e8179b20e25a53\": rpc error: code = NotFound desc = could not find container \"5f88ce0bb57df24e738ba0d53bf159b512454030e7215342e0e8179b20e25a53\": container with ID starting with 5f88ce0bb57df24e738ba0d53bf159b512454030e7215342e0e8179b20e25a53 not found: ID does not exist" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.664111 4743 scope.go:117] "RemoveContainer" containerID="494c239f2078673913ba0c50c50da35d43c19b0b5ac2dbc1009d58e12517a489" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.664413 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"494c239f2078673913ba0c50c50da35d43c19b0b5ac2dbc1009d58e12517a489"} err="failed to get container status \"494c239f2078673913ba0c50c50da35d43c19b0b5ac2dbc1009d58e12517a489\": rpc error: code = NotFound desc = could not find container \"494c239f2078673913ba0c50c50da35d43c19b0b5ac2dbc1009d58e12517a489\": container with ID starting with 494c239f2078673913ba0c50c50da35d43c19b0b5ac2dbc1009d58e12517a489 not found: ID does not exist" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.664449 4743 scope.go:117] "RemoveContainer" containerID="c06826420179055b6073b3e2c057cc03f5bb5fe835cf28bdaceff76b2e27dd91" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.664751 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c06826420179055b6073b3e2c057cc03f5bb5fe835cf28bdaceff76b2e27dd91"} err="failed to get container status \"c06826420179055b6073b3e2c057cc03f5bb5fe835cf28bdaceff76b2e27dd91\": rpc error: code = NotFound desc = could not find container \"c06826420179055b6073b3e2c057cc03f5bb5fe835cf28bdaceff76b2e27dd91\": container with ID starting with c06826420179055b6073b3e2c057cc03f5bb5fe835cf28bdaceff76b2e27dd91 not found: ID does not exist" Jan 
22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.664783 4743 scope.go:117] "RemoveContainer" containerID="33045af36991196fc7ef8653955e8c7bf85c1952f56de2c82539d65f0adba74f" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.665080 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33045af36991196fc7ef8653955e8c7bf85c1952f56de2c82539d65f0adba74f"} err="failed to get container status \"33045af36991196fc7ef8653955e8c7bf85c1952f56de2c82539d65f0adba74f\": rpc error: code = NotFound desc = could not find container \"33045af36991196fc7ef8653955e8c7bf85c1952f56de2c82539d65f0adba74f\": container with ID starting with 33045af36991196fc7ef8653955e8c7bf85c1952f56de2c82539d65f0adba74f not found: ID does not exist" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.665105 4743 scope.go:117] "RemoveContainer" containerID="746556e3d4d56ede14ab5d860938e7a65a200f86fc80ff35b0fff11172c8d328" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.665379 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"746556e3d4d56ede14ab5d860938e7a65a200f86fc80ff35b0fff11172c8d328"} err="failed to get container status \"746556e3d4d56ede14ab5d860938e7a65a200f86fc80ff35b0fff11172c8d328\": rpc error: code = NotFound desc = could not find container \"746556e3d4d56ede14ab5d860938e7a65a200f86fc80ff35b0fff11172c8d328\": container with ID starting with 746556e3d4d56ede14ab5d860938e7a65a200f86fc80ff35b0fff11172c8d328 not found: ID does not exist" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.665400 4743 scope.go:117] "RemoveContainer" containerID="0e41c9ee9b261451ae9312a5ad5a028818acd69b9c85b7eece4f3c06d2e58551" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.665661 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e41c9ee9b261451ae9312a5ad5a028818acd69b9c85b7eece4f3c06d2e58551"} err="failed to get container status \"0e41c9ee9b261451ae9312a5ad5a028818acd69b9c85b7eece4f3c06d2e58551\": rpc error: code = NotFound desc = could not find container \"0e41c9ee9b261451ae9312a5ad5a028818acd69b9c85b7eece4f3c06d2e58551\": container with ID starting with 0e41c9ee9b261451ae9312a5ad5a028818acd69b9c85b7eece4f3c06d2e58551 not found: ID does not exist" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.665692 4743 scope.go:117] "RemoveContainer" containerID="e031fff834aec6b8b2c7ed47784111b062e0b9601d637eba12860e79e4c3f6fa" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.665940 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e031fff834aec6b8b2c7ed47784111b062e0b9601d637eba12860e79e4c3f6fa"} err="failed to get container status \"e031fff834aec6b8b2c7ed47784111b062e0b9601d637eba12860e79e4c3f6fa\": rpc error: code = NotFound desc = could not find container \"e031fff834aec6b8b2c7ed47784111b062e0b9601d637eba12860e79e4c3f6fa\": container with ID starting with e031fff834aec6b8b2c7ed47784111b062e0b9601d637eba12860e79e4c3f6fa not found: ID does not exist" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.665967 4743 scope.go:117] "RemoveContainer" containerID="a53ea049b61685f1ee1b8026b5f81690e8b464b57330474a3c2fc12b808c2f99" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.666183 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a53ea049b61685f1ee1b8026b5f81690e8b464b57330474a3c2fc12b808c2f99"} err="failed to get container status 
\"a53ea049b61685f1ee1b8026b5f81690e8b464b57330474a3c2fc12b808c2f99\": rpc error: code = NotFound desc = could not find container \"a53ea049b61685f1ee1b8026b5f81690e8b464b57330474a3c2fc12b808c2f99\": container with ID starting with a53ea049b61685f1ee1b8026b5f81690e8b464b57330474a3c2fc12b808c2f99 not found: ID does not exist" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.666293 4743 scope.go:117] "RemoveContainer" containerID="6703ca8c5812f8cf4e820c640dc3da6b96d5035162f1bb4c0563efcd122edd2a" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.666646 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6703ca8c5812f8cf4e820c640dc3da6b96d5035162f1bb4c0563efcd122edd2a"} err="failed to get container status \"6703ca8c5812f8cf4e820c640dc3da6b96d5035162f1bb4c0563efcd122edd2a\": rpc error: code = NotFound desc = could not find container \"6703ca8c5812f8cf4e820c640dc3da6b96d5035162f1bb4c0563efcd122edd2a\": container with ID starting with 6703ca8c5812f8cf4e820c640dc3da6b96d5035162f1bb4c0563efcd122edd2a not found: ID does not exist" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.666689 4743 scope.go:117] "RemoveContainer" containerID="5f88ce0bb57df24e738ba0d53bf159b512454030e7215342e0e8179b20e25a53" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.667057 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f88ce0bb57df24e738ba0d53bf159b512454030e7215342e0e8179b20e25a53"} err="failed to get container status \"5f88ce0bb57df24e738ba0d53bf159b512454030e7215342e0e8179b20e25a53\": rpc error: code = NotFound desc = could not find container \"5f88ce0bb57df24e738ba0d53bf159b512454030e7215342e0e8179b20e25a53\": container with ID starting with 5f88ce0bb57df24e738ba0d53bf159b512454030e7215342e0e8179b20e25a53 not found: ID does not exist" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.667078 4743 scope.go:117] "RemoveContainer" containerID="494c239f2078673913ba0c50c50da35d43c19b0b5ac2dbc1009d58e12517a489" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.667277 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"494c239f2078673913ba0c50c50da35d43c19b0b5ac2dbc1009d58e12517a489"} err="failed to get container status \"494c239f2078673913ba0c50c50da35d43c19b0b5ac2dbc1009d58e12517a489\": rpc error: code = NotFound desc = could not find container \"494c239f2078673913ba0c50c50da35d43c19b0b5ac2dbc1009d58e12517a489\": container with ID starting with 494c239f2078673913ba0c50c50da35d43c19b0b5ac2dbc1009d58e12517a489 not found: ID does not exist" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.667301 4743 scope.go:117] "RemoveContainer" containerID="c06826420179055b6073b3e2c057cc03f5bb5fe835cf28bdaceff76b2e27dd91" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.667550 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c06826420179055b6073b3e2c057cc03f5bb5fe835cf28bdaceff76b2e27dd91"} err="failed to get container status \"c06826420179055b6073b3e2c057cc03f5bb5fe835cf28bdaceff76b2e27dd91\": rpc error: code = NotFound desc = could not find container \"c06826420179055b6073b3e2c057cc03f5bb5fe835cf28bdaceff76b2e27dd91\": container with ID starting with c06826420179055b6073b3e2c057cc03f5bb5fe835cf28bdaceff76b2e27dd91 not found: ID does not exist" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.667579 4743 scope.go:117] "RemoveContainer" 
containerID="33045af36991196fc7ef8653955e8c7bf85c1952f56de2c82539d65f0adba74f" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.667895 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33045af36991196fc7ef8653955e8c7bf85c1952f56de2c82539d65f0adba74f"} err="failed to get container status \"33045af36991196fc7ef8653955e8c7bf85c1952f56de2c82539d65f0adba74f\": rpc error: code = NotFound desc = could not find container \"33045af36991196fc7ef8653955e8c7bf85c1952f56de2c82539d65f0adba74f\": container with ID starting with 33045af36991196fc7ef8653955e8c7bf85c1952f56de2c82539d65f0adba74f not found: ID does not exist" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.667916 4743 scope.go:117] "RemoveContainer" containerID="746556e3d4d56ede14ab5d860938e7a65a200f86fc80ff35b0fff11172c8d328" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.668252 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"746556e3d4d56ede14ab5d860938e7a65a200f86fc80ff35b0fff11172c8d328"} err="failed to get container status \"746556e3d4d56ede14ab5d860938e7a65a200f86fc80ff35b0fff11172c8d328\": rpc error: code = NotFound desc = could not find container \"746556e3d4d56ede14ab5d860938e7a65a200f86fc80ff35b0fff11172c8d328\": container with ID starting with 746556e3d4d56ede14ab5d860938e7a65a200f86fc80ff35b0fff11172c8d328 not found: ID does not exist" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.668289 4743 scope.go:117] "RemoveContainer" containerID="0e41c9ee9b261451ae9312a5ad5a028818acd69b9c85b7eece4f3c06d2e58551" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.668626 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e41c9ee9b261451ae9312a5ad5a028818acd69b9c85b7eece4f3c06d2e58551"} err="failed to get container status \"0e41c9ee9b261451ae9312a5ad5a028818acd69b9c85b7eece4f3c06d2e58551\": rpc error: code = NotFound desc = could not find container \"0e41c9ee9b261451ae9312a5ad5a028818acd69b9c85b7eece4f3c06d2e58551\": container with ID starting with 0e41c9ee9b261451ae9312a5ad5a028818acd69b9c85b7eece4f3c06d2e58551 not found: ID does not exist" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.668697 4743 scope.go:117] "RemoveContainer" containerID="e031fff834aec6b8b2c7ed47784111b062e0b9601d637eba12860e79e4c3f6fa" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.668984 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e031fff834aec6b8b2c7ed47784111b062e0b9601d637eba12860e79e4c3f6fa"} err="failed to get container status \"e031fff834aec6b8b2c7ed47784111b062e0b9601d637eba12860e79e4c3f6fa\": rpc error: code = NotFound desc = could not find container \"e031fff834aec6b8b2c7ed47784111b062e0b9601d637eba12860e79e4c3f6fa\": container with ID starting with e031fff834aec6b8b2c7ed47784111b062e0b9601d637eba12860e79e4c3f6fa not found: ID does not exist" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.669009 4743 scope.go:117] "RemoveContainer" containerID="a53ea049b61685f1ee1b8026b5f81690e8b464b57330474a3c2fc12b808c2f99" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.669220 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a53ea049b61685f1ee1b8026b5f81690e8b464b57330474a3c2fc12b808c2f99"} err="failed to get container status \"a53ea049b61685f1ee1b8026b5f81690e8b464b57330474a3c2fc12b808c2f99\": rpc error: code = NotFound desc = could not find 
container \"a53ea049b61685f1ee1b8026b5f81690e8b464b57330474a3c2fc12b808c2f99\": container with ID starting with a53ea049b61685f1ee1b8026b5f81690e8b464b57330474a3c2fc12b808c2f99 not found: ID does not exist" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.669243 4743 scope.go:117] "RemoveContainer" containerID="6703ca8c5812f8cf4e820c640dc3da6b96d5035162f1bb4c0563efcd122edd2a" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.669565 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6703ca8c5812f8cf4e820c640dc3da6b96d5035162f1bb4c0563efcd122edd2a"} err="failed to get container status \"6703ca8c5812f8cf4e820c640dc3da6b96d5035162f1bb4c0563efcd122edd2a\": rpc error: code = NotFound desc = could not find container \"6703ca8c5812f8cf4e820c640dc3da6b96d5035162f1bb4c0563efcd122edd2a\": container with ID starting with 6703ca8c5812f8cf4e820c640dc3da6b96d5035162f1bb4c0563efcd122edd2a not found: ID does not exist" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.669584 4743 scope.go:117] "RemoveContainer" containerID="5f88ce0bb57df24e738ba0d53bf159b512454030e7215342e0e8179b20e25a53" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.669900 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f88ce0bb57df24e738ba0d53bf159b512454030e7215342e0e8179b20e25a53"} err="failed to get container status \"5f88ce0bb57df24e738ba0d53bf159b512454030e7215342e0e8179b20e25a53\": rpc error: code = NotFound desc = could not find container \"5f88ce0bb57df24e738ba0d53bf159b512454030e7215342e0e8179b20e25a53\": container with ID starting with 5f88ce0bb57df24e738ba0d53bf159b512454030e7215342e0e8179b20e25a53 not found: ID does not exist" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.669923 4743 scope.go:117] "RemoveContainer" containerID="494c239f2078673913ba0c50c50da35d43c19b0b5ac2dbc1009d58e12517a489" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.670176 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"494c239f2078673913ba0c50c50da35d43c19b0b5ac2dbc1009d58e12517a489"} err="failed to get container status \"494c239f2078673913ba0c50c50da35d43c19b0b5ac2dbc1009d58e12517a489\": rpc error: code = NotFound desc = could not find container \"494c239f2078673913ba0c50c50da35d43c19b0b5ac2dbc1009d58e12517a489\": container with ID starting with 494c239f2078673913ba0c50c50da35d43c19b0b5ac2dbc1009d58e12517a489 not found: ID does not exist" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.670198 4743 scope.go:117] "RemoveContainer" containerID="c06826420179055b6073b3e2c057cc03f5bb5fe835cf28bdaceff76b2e27dd91" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.670445 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c06826420179055b6073b3e2c057cc03f5bb5fe835cf28bdaceff76b2e27dd91"} err="failed to get container status \"c06826420179055b6073b3e2c057cc03f5bb5fe835cf28bdaceff76b2e27dd91\": rpc error: code = NotFound desc = could not find container \"c06826420179055b6073b3e2c057cc03f5bb5fe835cf28bdaceff76b2e27dd91\": container with ID starting with c06826420179055b6073b3e2c057cc03f5bb5fe835cf28bdaceff76b2e27dd91 not found: ID does not exist" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.670464 4743 scope.go:117] "RemoveContainer" containerID="33045af36991196fc7ef8653955e8c7bf85c1952f56de2c82539d65f0adba74f" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.670741 4743 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33045af36991196fc7ef8653955e8c7bf85c1952f56de2c82539d65f0adba74f"} err="failed to get container status \"33045af36991196fc7ef8653955e8c7bf85c1952f56de2c82539d65f0adba74f\": rpc error: code = NotFound desc = could not find container \"33045af36991196fc7ef8653955e8c7bf85c1952f56de2c82539d65f0adba74f\": container with ID starting with 33045af36991196fc7ef8653955e8c7bf85c1952f56de2c82539d65f0adba74f not found: ID does not exist" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.670762 4743 scope.go:117] "RemoveContainer" containerID="746556e3d4d56ede14ab5d860938e7a65a200f86fc80ff35b0fff11172c8d328" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.671013 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"746556e3d4d56ede14ab5d860938e7a65a200f86fc80ff35b0fff11172c8d328"} err="failed to get container status \"746556e3d4d56ede14ab5d860938e7a65a200f86fc80ff35b0fff11172c8d328\": rpc error: code = NotFound desc = could not find container \"746556e3d4d56ede14ab5d860938e7a65a200f86fc80ff35b0fff11172c8d328\": container with ID starting with 746556e3d4d56ede14ab5d860938e7a65a200f86fc80ff35b0fff11172c8d328 not found: ID does not exist" Jan 22 13:56:40 crc kubenswrapper[4743]: I0122 13:56:40.873376 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-zdf5x" Jan 22 13:56:41 crc kubenswrapper[4743]: I0122 13:56:41.509493 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4zbb8_1a63ac85-9a00-4381-aa80-3da86d5483aa/kube-multus/0.log" Jan 22 13:56:41 crc kubenswrapper[4743]: I0122 13:56:41.509957 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4zbb8" event={"ID":"1a63ac85-9a00-4381-aa80-3da86d5483aa","Type":"ContainerStarted","Data":"af9d10815caa916aeccd44285d9a2530d75de78cf1244b2d3b33affa4952c5a1"} Jan 22 13:56:41 crc kubenswrapper[4743]: I0122 13:56:41.512435 4743 generic.go:334] "Generic (PLEG): container finished" podID="b82022d0-4955-418d-9bc0-cdfcdaf32b1f" containerID="1540c903c9614fe76a4f5d86a64340fdb8af5de09ffdb2e69b84049bef7dbb9c" exitCode=0 Jan 22 13:56:41 crc kubenswrapper[4743]: I0122 13:56:41.512468 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" event={"ID":"b82022d0-4955-418d-9bc0-cdfcdaf32b1f","Type":"ContainerDied","Data":"1540c903c9614fe76a4f5d86a64340fdb8af5de09ffdb2e69b84049bef7dbb9c"} Jan 22 13:56:41 crc kubenswrapper[4743]: I0122 13:56:41.754413 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1504d62a-81aa-4a1d-8fda-ef01376adcaa" path="/var/lib/kubelet/pods/1504d62a-81aa-4a1d-8fda-ef01376adcaa/volumes" Jan 22 13:56:42 crc kubenswrapper[4743]: I0122 13:56:42.526408 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" event={"ID":"b82022d0-4955-418d-9bc0-cdfcdaf32b1f","Type":"ContainerStarted","Data":"2d612f7e658f232a9f122131bb5026746f2e4da2b34de7eb9ef881d7f89c1e3e"} Jan 22 13:56:42 crc kubenswrapper[4743]: I0122 13:56:42.527099 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" event={"ID":"b82022d0-4955-418d-9bc0-cdfcdaf32b1f","Type":"ContainerStarted","Data":"7f6bc136fe70f138815ef1e8a9fd1ce84d5d378cea4ce94c7da8c44c33b68e1b"} Jan 22 13:56:42 crc kubenswrapper[4743]: I0122 13:56:42.527121 4743 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" event={"ID":"b82022d0-4955-418d-9bc0-cdfcdaf32b1f","Type":"ContainerStarted","Data":"2b36d525b4f78d4f5a0202a6f6af03df0afcebff2f4a4612109057ca46d5be7b"} Jan 22 13:56:42 crc kubenswrapper[4743]: I0122 13:56:42.527137 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" event={"ID":"b82022d0-4955-418d-9bc0-cdfcdaf32b1f","Type":"ContainerStarted","Data":"b48f80e3767ad9df21b0c13c54816a0e8bcaeb041780738cf9163c53a709b7ef"} Jan 22 13:56:43 crc kubenswrapper[4743]: I0122 13:56:43.537240 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" event={"ID":"b82022d0-4955-418d-9bc0-cdfcdaf32b1f","Type":"ContainerStarted","Data":"c42e88bbfa2255edf8b23f7cb38cd1283ed658679f0a555291b97e6b24d3d3ce"} Jan 22 13:56:43 crc kubenswrapper[4743]: I0122 13:56:43.537304 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" event={"ID":"b82022d0-4955-418d-9bc0-cdfcdaf32b1f","Type":"ContainerStarted","Data":"ce6bc08d9ffe355c69e732ce94354a51c1cc48046d36ece3b13970b0a2e0458d"} Jan 22 13:56:45 crc kubenswrapper[4743]: I0122 13:56:45.552517 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" event={"ID":"b82022d0-4955-418d-9bc0-cdfcdaf32b1f","Type":"ContainerStarted","Data":"89e8aaf94c3332db63c27adfbe15b34b1712184acba806ab70148557739d31d3"} Jan 22 13:56:47 crc kubenswrapper[4743]: I0122 13:56:47.569897 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" event={"ID":"b82022d0-4955-418d-9bc0-cdfcdaf32b1f","Type":"ContainerStarted","Data":"986bf6a1891444024082e13e81dea3175376b0fb5c72bd6b2595932c0888d51f"} Jan 22 13:56:47 crc kubenswrapper[4743]: I0122 13:56:47.571610 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" Jan 22 13:56:47 crc kubenswrapper[4743]: I0122 13:56:47.571656 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" Jan 22 13:56:47 crc kubenswrapper[4743]: I0122 13:56:47.571730 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" Jan 22 13:56:47 crc kubenswrapper[4743]: I0122 13:56:47.598126 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" Jan 22 13:56:47 crc kubenswrapper[4743]: I0122 13:56:47.610070 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" podStartSLOduration=7.610052984 podStartE2EDuration="7.610052984s" podCreationTimestamp="2026-01-22 13:56:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:56:47.604273807 +0000 UTC m=+644.159316970" watchObservedRunningTime="2026-01-22 13:56:47.610052984 +0000 UTC m=+644.165096157" Jan 22 13:56:47 crc kubenswrapper[4743]: I0122 13:56:47.617897 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" Jan 22 13:56:55 crc kubenswrapper[4743]: I0122 13:56:55.470248 4743 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 22 13:57:10 
crc kubenswrapper[4743]: I0122 13:57:10.462120 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-m6hrr" Jan 22 13:57:25 crc kubenswrapper[4743]: I0122 13:57:25.060392 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713k4rbh"] Jan 22 13:57:25 crc kubenswrapper[4743]: I0122 13:57:25.062038 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713k4rbh" Jan 22 13:57:25 crc kubenswrapper[4743]: I0122 13:57:25.068220 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713k4rbh"] Jan 22 13:57:25 crc kubenswrapper[4743]: I0122 13:57:25.068701 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 22 13:57:25 crc kubenswrapper[4743]: I0122 13:57:25.177418 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f39512c9-c53b-472c-a7ed-0760797a8601-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713k4rbh\" (UID: \"f39512c9-c53b-472c-a7ed-0760797a8601\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713k4rbh" Jan 22 13:57:25 crc kubenswrapper[4743]: I0122 13:57:25.177511 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwxtn\" (UniqueName: \"kubernetes.io/projected/f39512c9-c53b-472c-a7ed-0760797a8601-kube-api-access-zwxtn\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713k4rbh\" (UID: \"f39512c9-c53b-472c-a7ed-0760797a8601\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713k4rbh" Jan 22 13:57:25 crc kubenswrapper[4743]: I0122 13:57:25.177609 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f39512c9-c53b-472c-a7ed-0760797a8601-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713k4rbh\" (UID: \"f39512c9-c53b-472c-a7ed-0760797a8601\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713k4rbh" Jan 22 13:57:25 crc kubenswrapper[4743]: I0122 13:57:25.279535 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f39512c9-c53b-472c-a7ed-0760797a8601-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713k4rbh\" (UID: \"f39512c9-c53b-472c-a7ed-0760797a8601\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713k4rbh" Jan 22 13:57:25 crc kubenswrapper[4743]: I0122 13:57:25.279749 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f39512c9-c53b-472c-a7ed-0760797a8601-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713k4rbh\" (UID: \"f39512c9-c53b-472c-a7ed-0760797a8601\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713k4rbh" Jan 22 13:57:25 crc kubenswrapper[4743]: I0122 13:57:25.279850 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwxtn\" (UniqueName: 
\"kubernetes.io/projected/f39512c9-c53b-472c-a7ed-0760797a8601-kube-api-access-zwxtn\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713k4rbh\" (UID: \"f39512c9-c53b-472c-a7ed-0760797a8601\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713k4rbh" Jan 22 13:57:25 crc kubenswrapper[4743]: I0122 13:57:25.280193 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f39512c9-c53b-472c-a7ed-0760797a8601-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713k4rbh\" (UID: \"f39512c9-c53b-472c-a7ed-0760797a8601\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713k4rbh" Jan 22 13:57:25 crc kubenswrapper[4743]: I0122 13:57:25.280558 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f39512c9-c53b-472c-a7ed-0760797a8601-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713k4rbh\" (UID: \"f39512c9-c53b-472c-a7ed-0760797a8601\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713k4rbh" Jan 22 13:57:25 crc kubenswrapper[4743]: I0122 13:57:25.313465 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwxtn\" (UniqueName: \"kubernetes.io/projected/f39512c9-c53b-472c-a7ed-0760797a8601-kube-api-access-zwxtn\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713k4rbh\" (UID: \"f39512c9-c53b-472c-a7ed-0760797a8601\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713k4rbh" Jan 22 13:57:25 crc kubenswrapper[4743]: I0122 13:57:25.413145 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713k4rbh" Jan 22 13:57:25 crc kubenswrapper[4743]: I0122 13:57:25.666000 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713k4rbh"] Jan 22 13:57:25 crc kubenswrapper[4743]: I0122 13:57:25.799280 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713k4rbh" event={"ID":"f39512c9-c53b-472c-a7ed-0760797a8601","Type":"ContainerStarted","Data":"85f68dbb66a768ec2804ac0cc4e335b879e3b7c29d8d17f2e269e0501c45afd8"} Jan 22 13:57:26 crc kubenswrapper[4743]: I0122 13:57:26.813194 4743 generic.go:334] "Generic (PLEG): container finished" podID="f39512c9-c53b-472c-a7ed-0760797a8601" containerID="d78ef2f64c37ce1ecec7333b22e9497321a178ebc049fa835dae260dfe0f70d4" exitCode=0 Jan 22 13:57:26 crc kubenswrapper[4743]: I0122 13:57:26.813328 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713k4rbh" event={"ID":"f39512c9-c53b-472c-a7ed-0760797a8601","Type":"ContainerDied","Data":"d78ef2f64c37ce1ecec7333b22e9497321a178ebc049fa835dae260dfe0f70d4"} Jan 22 13:57:27 crc kubenswrapper[4743]: I0122 13:57:27.307778 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jlvj8"] Jan 22 13:57:27 crc kubenswrapper[4743]: I0122 13:57:27.309325 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jlvj8" Jan 22 13:57:27 crc kubenswrapper[4743]: I0122 13:57:27.320472 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jlvj8"] Jan 22 13:57:27 crc kubenswrapper[4743]: I0122 13:57:27.412453 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qg2rh\" (UniqueName: \"kubernetes.io/projected/af31fd32-8b52-4c33-a037-6cb21ffc0a89-kube-api-access-qg2rh\") pod \"redhat-operators-jlvj8\" (UID: \"af31fd32-8b52-4c33-a037-6cb21ffc0a89\") " pod="openshift-marketplace/redhat-operators-jlvj8" Jan 22 13:57:27 crc kubenswrapper[4743]: I0122 13:57:27.412746 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af31fd32-8b52-4c33-a037-6cb21ffc0a89-catalog-content\") pod \"redhat-operators-jlvj8\" (UID: \"af31fd32-8b52-4c33-a037-6cb21ffc0a89\") " pod="openshift-marketplace/redhat-operators-jlvj8" Jan 22 13:57:27 crc kubenswrapper[4743]: I0122 13:57:27.412886 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af31fd32-8b52-4c33-a037-6cb21ffc0a89-utilities\") pod \"redhat-operators-jlvj8\" (UID: \"af31fd32-8b52-4c33-a037-6cb21ffc0a89\") " pod="openshift-marketplace/redhat-operators-jlvj8" Jan 22 13:57:27 crc kubenswrapper[4743]: I0122 13:57:27.513777 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qg2rh\" (UniqueName: \"kubernetes.io/projected/af31fd32-8b52-4c33-a037-6cb21ffc0a89-kube-api-access-qg2rh\") pod \"redhat-operators-jlvj8\" (UID: \"af31fd32-8b52-4c33-a037-6cb21ffc0a89\") " pod="openshift-marketplace/redhat-operators-jlvj8" Jan 22 13:57:27 crc kubenswrapper[4743]: I0122 13:57:27.513856 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af31fd32-8b52-4c33-a037-6cb21ffc0a89-catalog-content\") pod \"redhat-operators-jlvj8\" (UID: \"af31fd32-8b52-4c33-a037-6cb21ffc0a89\") " pod="openshift-marketplace/redhat-operators-jlvj8" Jan 22 13:57:27 crc kubenswrapper[4743]: I0122 13:57:27.513887 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af31fd32-8b52-4c33-a037-6cb21ffc0a89-utilities\") pod \"redhat-operators-jlvj8\" (UID: \"af31fd32-8b52-4c33-a037-6cb21ffc0a89\") " pod="openshift-marketplace/redhat-operators-jlvj8" Jan 22 13:57:27 crc kubenswrapper[4743]: I0122 13:57:27.514308 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af31fd32-8b52-4c33-a037-6cb21ffc0a89-utilities\") pod \"redhat-operators-jlvj8\" (UID: \"af31fd32-8b52-4c33-a037-6cb21ffc0a89\") " pod="openshift-marketplace/redhat-operators-jlvj8" Jan 22 13:57:27 crc kubenswrapper[4743]: I0122 13:57:27.514414 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af31fd32-8b52-4c33-a037-6cb21ffc0a89-catalog-content\") pod \"redhat-operators-jlvj8\" (UID: \"af31fd32-8b52-4c33-a037-6cb21ffc0a89\") " pod="openshift-marketplace/redhat-operators-jlvj8" Jan 22 13:57:27 crc kubenswrapper[4743]: I0122 13:57:27.534577 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-qg2rh\" (UniqueName: \"kubernetes.io/projected/af31fd32-8b52-4c33-a037-6cb21ffc0a89-kube-api-access-qg2rh\") pod \"redhat-operators-jlvj8\" (UID: \"af31fd32-8b52-4c33-a037-6cb21ffc0a89\") " pod="openshift-marketplace/redhat-operators-jlvj8" Jan 22 13:57:27 crc kubenswrapper[4743]: I0122 13:57:27.675087 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jlvj8" Jan 22 13:57:27 crc kubenswrapper[4743]: I0122 13:57:27.921191 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jlvj8"] Jan 22 13:57:28 crc kubenswrapper[4743]: I0122 13:57:28.831192 4743 generic.go:334] "Generic (PLEG): container finished" podID="af31fd32-8b52-4c33-a037-6cb21ffc0a89" containerID="97827d1453c178d02c240bfa62ca2d3a8a9be2cd5ebba478b6c30f896c5d1323" exitCode=0 Jan 22 13:57:28 crc kubenswrapper[4743]: I0122 13:57:28.831235 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jlvj8" event={"ID":"af31fd32-8b52-4c33-a037-6cb21ffc0a89","Type":"ContainerDied","Data":"97827d1453c178d02c240bfa62ca2d3a8a9be2cd5ebba478b6c30f896c5d1323"} Jan 22 13:57:28 crc kubenswrapper[4743]: I0122 13:57:28.831695 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jlvj8" event={"ID":"af31fd32-8b52-4c33-a037-6cb21ffc0a89","Type":"ContainerStarted","Data":"9117c7d847bb8b42a0971d0d0dd8abe8317e3060c77e64827c6278025fe40271"} Jan 22 13:57:28 crc kubenswrapper[4743]: I0122 13:57:28.835126 4743 generic.go:334] "Generic (PLEG): container finished" podID="f39512c9-c53b-472c-a7ed-0760797a8601" containerID="0cee3e16c377969278f1f9dcd43b18cd2611918452f6032c9fe84afbf5897346" exitCode=0 Jan 22 13:57:28 crc kubenswrapper[4743]: I0122 13:57:28.835180 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713k4rbh" event={"ID":"f39512c9-c53b-472c-a7ed-0760797a8601","Type":"ContainerDied","Data":"0cee3e16c377969278f1f9dcd43b18cd2611918452f6032c9fe84afbf5897346"} Jan 22 13:57:29 crc kubenswrapper[4743]: I0122 13:57:29.842655 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jlvj8" event={"ID":"af31fd32-8b52-4c33-a037-6cb21ffc0a89","Type":"ContainerStarted","Data":"45d4d3eddd7786c469b082137add5fea269c3d7cd334eaa2def5713253336db3"} Jan 22 13:57:29 crc kubenswrapper[4743]: I0122 13:57:29.847221 4743 generic.go:334] "Generic (PLEG): container finished" podID="f39512c9-c53b-472c-a7ed-0760797a8601" containerID="cb24392898698767db63461d60e916c500c7a0429ac8df66d99938c30e68e071" exitCode=0 Jan 22 13:57:29 crc kubenswrapper[4743]: I0122 13:57:29.847262 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713k4rbh" event={"ID":"f39512c9-c53b-472c-a7ed-0760797a8601","Type":"ContainerDied","Data":"cb24392898698767db63461d60e916c500c7a0429ac8df66d99938c30e68e071"} Jan 22 13:57:30 crc kubenswrapper[4743]: I0122 13:57:30.049615 4743 patch_prober.go:28] interesting pod/machine-config-daemon-hqgk7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 13:57:30 crc kubenswrapper[4743]: I0122 13:57:30.049989 4743 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 13:57:30 crc kubenswrapper[4743]: I0122 13:57:30.853804 4743 generic.go:334] "Generic (PLEG): container finished" podID="af31fd32-8b52-4c33-a037-6cb21ffc0a89" containerID="45d4d3eddd7786c469b082137add5fea269c3d7cd334eaa2def5713253336db3" exitCode=0 Jan 22 13:57:30 crc kubenswrapper[4743]: I0122 13:57:30.853852 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jlvj8" event={"ID":"af31fd32-8b52-4c33-a037-6cb21ffc0a89","Type":"ContainerDied","Data":"45d4d3eddd7786c469b082137add5fea269c3d7cd334eaa2def5713253336db3"} Jan 22 13:57:31 crc kubenswrapper[4743]: I0122 13:57:31.099706 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713k4rbh" Jan 22 13:57:31 crc kubenswrapper[4743]: I0122 13:57:31.259562 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwxtn\" (UniqueName: \"kubernetes.io/projected/f39512c9-c53b-472c-a7ed-0760797a8601-kube-api-access-zwxtn\") pod \"f39512c9-c53b-472c-a7ed-0760797a8601\" (UID: \"f39512c9-c53b-472c-a7ed-0760797a8601\") " Jan 22 13:57:31 crc kubenswrapper[4743]: I0122 13:57:31.260147 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f39512c9-c53b-472c-a7ed-0760797a8601-util\") pod \"f39512c9-c53b-472c-a7ed-0760797a8601\" (UID: \"f39512c9-c53b-472c-a7ed-0760797a8601\") " Jan 22 13:57:31 crc kubenswrapper[4743]: I0122 13:57:31.260361 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f39512c9-c53b-472c-a7ed-0760797a8601-bundle\") pod \"f39512c9-c53b-472c-a7ed-0760797a8601\" (UID: \"f39512c9-c53b-472c-a7ed-0760797a8601\") " Jan 22 13:57:31 crc kubenswrapper[4743]: I0122 13:57:31.261108 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f39512c9-c53b-472c-a7ed-0760797a8601-bundle" (OuterVolumeSpecName: "bundle") pod "f39512c9-c53b-472c-a7ed-0760797a8601" (UID: "f39512c9-c53b-472c-a7ed-0760797a8601"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 13:57:31 crc kubenswrapper[4743]: I0122 13:57:31.261473 4743 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f39512c9-c53b-472c-a7ed-0760797a8601-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 13:57:31 crc kubenswrapper[4743]: I0122 13:57:31.265617 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f39512c9-c53b-472c-a7ed-0760797a8601-kube-api-access-zwxtn" (OuterVolumeSpecName: "kube-api-access-zwxtn") pod "f39512c9-c53b-472c-a7ed-0760797a8601" (UID: "f39512c9-c53b-472c-a7ed-0760797a8601"). InnerVolumeSpecName "kube-api-access-zwxtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:57:31 crc kubenswrapper[4743]: I0122 13:57:31.291609 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f39512c9-c53b-472c-a7ed-0760797a8601-util" (OuterVolumeSpecName: "util") pod "f39512c9-c53b-472c-a7ed-0760797a8601" (UID: "f39512c9-c53b-472c-a7ed-0760797a8601"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 13:57:31 crc kubenswrapper[4743]: I0122 13:57:31.363475 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwxtn\" (UniqueName: \"kubernetes.io/projected/f39512c9-c53b-472c-a7ed-0760797a8601-kube-api-access-zwxtn\") on node \"crc\" DevicePath \"\"" Jan 22 13:57:31 crc kubenswrapper[4743]: I0122 13:57:31.363537 4743 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f39512c9-c53b-472c-a7ed-0760797a8601-util\") on node \"crc\" DevicePath \"\"" Jan 22 13:57:31 crc kubenswrapper[4743]: I0122 13:57:31.861570 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jlvj8" event={"ID":"af31fd32-8b52-4c33-a037-6cb21ffc0a89","Type":"ContainerStarted","Data":"5eab84b467ec9b0af6b06d848574fcada38ebed3d935174d04cf3a8ce1565e98"} Jan 22 13:57:31 crc kubenswrapper[4743]: I0122 13:57:31.864046 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713k4rbh" event={"ID":"f39512c9-c53b-472c-a7ed-0760797a8601","Type":"ContainerDied","Data":"85f68dbb66a768ec2804ac0cc4e335b879e3b7c29d8d17f2e269e0501c45afd8"} Jan 22 13:57:31 crc kubenswrapper[4743]: I0122 13:57:31.864081 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85f68dbb66a768ec2804ac0cc4e335b879e3b7c29d8d17f2e269e0501c45afd8" Jan 22 13:57:31 crc kubenswrapper[4743]: I0122 13:57:31.864106 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713k4rbh" Jan 22 13:57:31 crc kubenswrapper[4743]: I0122 13:57:31.883826 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jlvj8" podStartSLOduration=2.047710546 podStartE2EDuration="4.883810401s" podCreationTimestamp="2026-01-22 13:57:27 +0000 UTC" firstStartedPulling="2026-01-22 13:57:28.834724981 +0000 UTC m=+685.389768144" lastFinishedPulling="2026-01-22 13:57:31.670824836 +0000 UTC m=+688.225867999" observedRunningTime="2026-01-22 13:57:31.883159123 +0000 UTC m=+688.438202296" watchObservedRunningTime="2026-01-22 13:57:31.883810401 +0000 UTC m=+688.438853564" Jan 22 13:57:36 crc kubenswrapper[4743]: I0122 13:57:36.411308 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-f88rf"] Jan 22 13:57:36 crc kubenswrapper[4743]: E0122 13:57:36.411874 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f39512c9-c53b-472c-a7ed-0760797a8601" containerName="extract" Jan 22 13:57:36 crc kubenswrapper[4743]: I0122 13:57:36.411891 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f39512c9-c53b-472c-a7ed-0760797a8601" containerName="extract" Jan 22 13:57:36 crc kubenswrapper[4743]: E0122 13:57:36.411914 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f39512c9-c53b-472c-a7ed-0760797a8601" containerName="util" Jan 22 13:57:36 crc kubenswrapper[4743]: I0122 13:57:36.411923 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f39512c9-c53b-472c-a7ed-0760797a8601" containerName="util" Jan 22 13:57:36 crc kubenswrapper[4743]: E0122 13:57:36.411944 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f39512c9-c53b-472c-a7ed-0760797a8601" containerName="pull" Jan 22 13:57:36 crc kubenswrapper[4743]: I0122 13:57:36.411952 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f39512c9-c53b-472c-a7ed-0760797a8601" containerName="pull" Jan 22 13:57:36 crc kubenswrapper[4743]: I0122 13:57:36.412083 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f39512c9-c53b-472c-a7ed-0760797a8601" containerName="extract" Jan 22 13:57:36 crc kubenswrapper[4743]: I0122 13:57:36.412516 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-646758c888-f88rf" Jan 22 13:57:36 crc kubenswrapper[4743]: I0122 13:57:36.415630 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-tnpvn" Jan 22 13:57:36 crc kubenswrapper[4743]: I0122 13:57:36.415719 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Jan 22 13:57:36 crc kubenswrapper[4743]: I0122 13:57:36.416175 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Jan 22 13:57:36 crc kubenswrapper[4743]: I0122 13:57:36.438872 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-f88rf"] Jan 22 13:57:36 crc kubenswrapper[4743]: I0122 13:57:36.525519 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kpxm\" (UniqueName: \"kubernetes.io/projected/60d8bf9b-3641-4cd7-a809-0f77d3fae035-kube-api-access-2kpxm\") pod \"nmstate-operator-646758c888-f88rf\" (UID: \"60d8bf9b-3641-4cd7-a809-0f77d3fae035\") " pod="openshift-nmstate/nmstate-operator-646758c888-f88rf" Jan 22 13:57:36 crc kubenswrapper[4743]: I0122 13:57:36.627281 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kpxm\" (UniqueName: \"kubernetes.io/projected/60d8bf9b-3641-4cd7-a809-0f77d3fae035-kube-api-access-2kpxm\") pod \"nmstate-operator-646758c888-f88rf\" (UID: \"60d8bf9b-3641-4cd7-a809-0f77d3fae035\") " pod="openshift-nmstate/nmstate-operator-646758c888-f88rf" Jan 22 13:57:36 crc kubenswrapper[4743]: I0122 13:57:36.659104 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kpxm\" (UniqueName: \"kubernetes.io/projected/60d8bf9b-3641-4cd7-a809-0f77d3fae035-kube-api-access-2kpxm\") pod \"nmstate-operator-646758c888-f88rf\" (UID: \"60d8bf9b-3641-4cd7-a809-0f77d3fae035\") " pod="openshift-nmstate/nmstate-operator-646758c888-f88rf" Jan 22 13:57:36 crc kubenswrapper[4743]: I0122 13:57:36.728246 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-646758c888-f88rf" Jan 22 13:57:36 crc kubenswrapper[4743]: I0122 13:57:36.940500 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-f88rf"] Jan 22 13:57:37 crc kubenswrapper[4743]: I0122 13:57:37.675649 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jlvj8" Jan 22 13:57:37 crc kubenswrapper[4743]: I0122 13:57:37.677424 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jlvj8" Jan 22 13:57:37 crc kubenswrapper[4743]: I0122 13:57:37.714476 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jlvj8" Jan 22 13:57:37 crc kubenswrapper[4743]: I0122 13:57:37.898197 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-646758c888-f88rf" event={"ID":"60d8bf9b-3641-4cd7-a809-0f77d3fae035","Type":"ContainerStarted","Data":"ed348c13e24b11d229151c7577e18ad2a366e05325ab0b1401f682d7d207f052"} Jan 22 13:57:37 crc kubenswrapper[4743]: I0122 13:57:37.971106 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jlvj8" Jan 22 13:57:39 crc kubenswrapper[4743]: I0122 13:57:39.910943 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-646758c888-f88rf" event={"ID":"60d8bf9b-3641-4cd7-a809-0f77d3fae035","Type":"ContainerStarted","Data":"ae11442957830a3e908555c7ec480ee17f536478c50d96d294bc8378bea8066a"} Jan 22 13:57:39 crc kubenswrapper[4743]: I0122 13:57:39.930111 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-646758c888-f88rf" podStartSLOduration=1.969479358 podStartE2EDuration="3.93008572s" podCreationTimestamp="2026-01-22 13:57:36 +0000 UTC" firstStartedPulling="2026-01-22 13:57:36.940739153 +0000 UTC m=+693.495782336" lastFinishedPulling="2026-01-22 13:57:38.901345515 +0000 UTC m=+695.456388698" observedRunningTime="2026-01-22 13:57:39.927962314 +0000 UTC m=+696.483005517" watchObservedRunningTime="2026-01-22 13:57:39.93008572 +0000 UTC m=+696.485128923" Jan 22 13:57:40 crc kubenswrapper[4743]: I0122 13:57:40.499371 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jlvj8"] Jan 22 13:57:40 crc kubenswrapper[4743]: I0122 13:57:40.915844 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jlvj8" podUID="af31fd32-8b52-4c33-a037-6cb21ffc0a89" containerName="registry-server" containerID="cri-o://5eab84b467ec9b0af6b06d848574fcada38ebed3d935174d04cf3a8ce1565e98" gracePeriod=2 Jan 22 13:57:43 crc kubenswrapper[4743]: I0122 13:57:43.939781 4743 generic.go:334] "Generic (PLEG): container finished" podID="af31fd32-8b52-4c33-a037-6cb21ffc0a89" containerID="5eab84b467ec9b0af6b06d848574fcada38ebed3d935174d04cf3a8ce1565e98" exitCode=0 Jan 22 13:57:43 crc kubenswrapper[4743]: I0122 13:57:43.939870 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jlvj8" event={"ID":"af31fd32-8b52-4c33-a037-6cb21ffc0a89","Type":"ContainerDied","Data":"5eab84b467ec9b0af6b06d848574fcada38ebed3d935174d04cf3a8ce1565e98"} Jan 22 13:57:44 crc kubenswrapper[4743]: I0122 13:57:44.061953 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jlvj8" Jan 22 13:57:44 crc kubenswrapper[4743]: I0122 13:57:44.231216 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg2rh\" (UniqueName: \"kubernetes.io/projected/af31fd32-8b52-4c33-a037-6cb21ffc0a89-kube-api-access-qg2rh\") pod \"af31fd32-8b52-4c33-a037-6cb21ffc0a89\" (UID: \"af31fd32-8b52-4c33-a037-6cb21ffc0a89\") " Jan 22 13:57:44 crc kubenswrapper[4743]: I0122 13:57:44.231356 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af31fd32-8b52-4c33-a037-6cb21ffc0a89-catalog-content\") pod \"af31fd32-8b52-4c33-a037-6cb21ffc0a89\" (UID: \"af31fd32-8b52-4c33-a037-6cb21ffc0a89\") " Jan 22 13:57:44 crc kubenswrapper[4743]: I0122 13:57:44.231452 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af31fd32-8b52-4c33-a037-6cb21ffc0a89-utilities\") pod \"af31fd32-8b52-4c33-a037-6cb21ffc0a89\" (UID: \"af31fd32-8b52-4c33-a037-6cb21ffc0a89\") " Jan 22 13:57:44 crc kubenswrapper[4743]: I0122 13:57:44.233204 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af31fd32-8b52-4c33-a037-6cb21ffc0a89-utilities" (OuterVolumeSpecName: "utilities") pod "af31fd32-8b52-4c33-a037-6cb21ffc0a89" (UID: "af31fd32-8b52-4c33-a037-6cb21ffc0a89"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 13:57:44 crc kubenswrapper[4743]: I0122 13:57:44.246112 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af31fd32-8b52-4c33-a037-6cb21ffc0a89-kube-api-access-qg2rh" (OuterVolumeSpecName: "kube-api-access-qg2rh") pod "af31fd32-8b52-4c33-a037-6cb21ffc0a89" (UID: "af31fd32-8b52-4c33-a037-6cb21ffc0a89"). InnerVolumeSpecName "kube-api-access-qg2rh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:57:44 crc kubenswrapper[4743]: I0122 13:57:44.334618 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg2rh\" (UniqueName: \"kubernetes.io/projected/af31fd32-8b52-4c33-a037-6cb21ffc0a89-kube-api-access-qg2rh\") on node \"crc\" DevicePath \"\"" Jan 22 13:57:44 crc kubenswrapper[4743]: I0122 13:57:44.334678 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af31fd32-8b52-4c33-a037-6cb21ffc0a89-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 13:57:44 crc kubenswrapper[4743]: I0122 13:57:44.402629 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af31fd32-8b52-4c33-a037-6cb21ffc0a89-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "af31fd32-8b52-4c33-a037-6cb21ffc0a89" (UID: "af31fd32-8b52-4c33-a037-6cb21ffc0a89"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 13:57:44 crc kubenswrapper[4743]: I0122 13:57:44.436152 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af31fd32-8b52-4c33-a037-6cb21ffc0a89-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 13:57:44 crc kubenswrapper[4743]: I0122 13:57:44.955635 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jlvj8" event={"ID":"af31fd32-8b52-4c33-a037-6cb21ffc0a89","Type":"ContainerDied","Data":"9117c7d847bb8b42a0971d0d0dd8abe8317e3060c77e64827c6278025fe40271"} Jan 22 13:57:44 crc kubenswrapper[4743]: I0122 13:57:44.956061 4743 scope.go:117] "RemoveContainer" containerID="5eab84b467ec9b0af6b06d848574fcada38ebed3d935174d04cf3a8ce1565e98" Jan 22 13:57:44 crc kubenswrapper[4743]: I0122 13:57:44.955770 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jlvj8" Jan 22 13:57:44 crc kubenswrapper[4743]: I0122 13:57:44.974965 4743 scope.go:117] "RemoveContainer" containerID="45d4d3eddd7786c469b082137add5fea269c3d7cd334eaa2def5713253336db3" Jan 22 13:57:44 crc kubenswrapper[4743]: I0122 13:57:44.999061 4743 scope.go:117] "RemoveContainer" containerID="97827d1453c178d02c240bfa62ca2d3a8a9be2cd5ebba478b6c30f896c5d1323" Jan 22 13:57:45 crc kubenswrapper[4743]: I0122 13:57:45.030652 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jlvj8"] Jan 22 13:57:45 crc kubenswrapper[4743]: I0122 13:57:45.037975 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jlvj8"] Jan 22 13:57:45 crc kubenswrapper[4743]: I0122 13:57:45.753659 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af31fd32-8b52-4c33-a037-6cb21ffc0a89" path="/var/lib/kubelet/pods/af31fd32-8b52-4c33-a037-6cb21ffc0a89/volumes" Jan 22 13:57:45 crc kubenswrapper[4743]: I0122 13:57:45.794252 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-5kgz5"] Jan 22 13:57:45 crc kubenswrapper[4743]: E0122 13:57:45.794685 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af31fd32-8b52-4c33-a037-6cb21ffc0a89" containerName="extract-utilities" Jan 22 13:57:45 crc kubenswrapper[4743]: I0122 13:57:45.794773 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="af31fd32-8b52-4c33-a037-6cb21ffc0a89" containerName="extract-utilities" Jan 22 13:57:45 crc kubenswrapper[4743]: E0122 13:57:45.794894 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af31fd32-8b52-4c33-a037-6cb21ffc0a89" containerName="extract-content" Jan 22 13:57:45 crc kubenswrapper[4743]: I0122 13:57:45.794974 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="af31fd32-8b52-4c33-a037-6cb21ffc0a89" containerName="extract-content" Jan 22 13:57:45 crc kubenswrapper[4743]: E0122 13:57:45.795064 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af31fd32-8b52-4c33-a037-6cb21ffc0a89" containerName="registry-server" Jan 22 13:57:45 crc kubenswrapper[4743]: I0122 13:57:45.795140 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="af31fd32-8b52-4c33-a037-6cb21ffc0a89" containerName="registry-server" Jan 22 13:57:45 crc kubenswrapper[4743]: I0122 13:57:45.795371 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="af31fd32-8b52-4c33-a037-6cb21ffc0a89" containerName="registry-server" Jan 22 13:57:45 
crc kubenswrapper[4743]: I0122 13:57:45.796214 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-54757c584b-5kgz5" Jan 22 13:57:45 crc kubenswrapper[4743]: I0122 13:57:45.798181 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-qz7x7" Jan 22 13:57:45 crc kubenswrapper[4743]: I0122 13:57:45.807020 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-mtdch"] Jan 22 13:57:45 crc kubenswrapper[4743]: I0122 13:57:45.807807 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-mtdch" Jan 22 13:57:45 crc kubenswrapper[4743]: I0122 13:57:45.811685 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Jan 22 13:57:45 crc kubenswrapper[4743]: I0122 13:57:45.822866 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-5kgz5"] Jan 22 13:57:45 crc kubenswrapper[4743]: I0122 13:57:45.826453 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-mtdch"] Jan 22 13:57:45 crc kubenswrapper[4743]: I0122 13:57:45.847172 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-wpxjd"] Jan 22 13:57:45 crc kubenswrapper[4743]: I0122 13:57:45.849018 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-wpxjd" Jan 22 13:57:45 crc kubenswrapper[4743]: I0122 13:57:45.922844 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-pvbjf"] Jan 22 13:57:45 crc kubenswrapper[4743]: I0122 13:57:45.923552 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-pvbjf" Jan 22 13:57:45 crc kubenswrapper[4743]: I0122 13:57:45.925060 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-7hjss" Jan 22 13:57:45 crc kubenswrapper[4743]: I0122 13:57:45.925580 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Jan 22 13:57:45 crc kubenswrapper[4743]: I0122 13:57:45.925968 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Jan 22 13:57:45 crc kubenswrapper[4743]: I0122 13:57:45.955644 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9szb7\" (UniqueName: \"kubernetes.io/projected/33f98d3b-f0ea-45dd-8fca-d942067e31ad-kube-api-access-9szb7\") pod \"nmstate-webhook-8474b5b9d8-mtdch\" (UID: \"33f98d3b-f0ea-45dd-8fca-d942067e31ad\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-mtdch" Jan 22 13:57:45 crc kubenswrapper[4743]: I0122 13:57:45.955709 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/07e19a00-064c-401a-9c0c-4acd067e4e9e-nmstate-lock\") pod \"nmstate-handler-wpxjd\" (UID: \"07e19a00-064c-401a-9c0c-4acd067e4e9e\") " pod="openshift-nmstate/nmstate-handler-wpxjd" Jan 22 13:57:45 crc kubenswrapper[4743]: I0122 13:57:45.955748 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvf4q\" (UniqueName: \"kubernetes.io/projected/07e19a00-064c-401a-9c0c-4acd067e4e9e-kube-api-access-pvf4q\") pod \"nmstate-handler-wpxjd\" (UID: \"07e19a00-064c-401a-9c0c-4acd067e4e9e\") " pod="openshift-nmstate/nmstate-handler-wpxjd" Jan 22 13:57:45 crc kubenswrapper[4743]: I0122 13:57:45.955830 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/07e19a00-064c-401a-9c0c-4acd067e4e9e-dbus-socket\") pod \"nmstate-handler-wpxjd\" (UID: \"07e19a00-064c-401a-9c0c-4acd067e4e9e\") " pod="openshift-nmstate/nmstate-handler-wpxjd" Jan 22 13:57:45 crc kubenswrapper[4743]: I0122 13:57:45.955865 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/33f98d3b-f0ea-45dd-8fca-d942067e31ad-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-mtdch\" (UID: \"33f98d3b-f0ea-45dd-8fca-d942067e31ad\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-mtdch" Jan 22 13:57:45 crc kubenswrapper[4743]: I0122 13:57:45.955939 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxbvc\" (UniqueName: \"kubernetes.io/projected/f50a3b80-43ad-46a0-b124-0249185f922b-kube-api-access-dxbvc\") pod \"nmstate-metrics-54757c584b-5kgz5\" (UID: \"f50a3b80-43ad-46a0-b124-0249185f922b\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-5kgz5" Jan 22 13:57:45 crc kubenswrapper[4743]: I0122 13:57:45.955987 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/07e19a00-064c-401a-9c0c-4acd067e4e9e-ovs-socket\") pod \"nmstate-handler-wpxjd\" (UID: \"07e19a00-064c-401a-9c0c-4acd067e4e9e\") " pod="openshift-nmstate/nmstate-handler-wpxjd" Jan 22 13:57:45 crc kubenswrapper[4743]: I0122 
13:57:45.991407 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-pvbjf"] Jan 22 13:57:46 crc kubenswrapper[4743]: I0122 13:57:46.057725 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzkjn\" (UniqueName: \"kubernetes.io/projected/27e7180c-e024-4412-9840-ddeb074d70c8-kube-api-access-gzkjn\") pod \"nmstate-console-plugin-7754f76f8b-pvbjf\" (UID: \"27e7180c-e024-4412-9840-ddeb074d70c8\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-pvbjf" Jan 22 13:57:46 crc kubenswrapper[4743]: I0122 13:57:46.057808 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvf4q\" (UniqueName: \"kubernetes.io/projected/07e19a00-064c-401a-9c0c-4acd067e4e9e-kube-api-access-pvf4q\") pod \"nmstate-handler-wpxjd\" (UID: \"07e19a00-064c-401a-9c0c-4acd067e4e9e\") " pod="openshift-nmstate/nmstate-handler-wpxjd" Jan 22 13:57:46 crc kubenswrapper[4743]: I0122 13:57:46.057835 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/07e19a00-064c-401a-9c0c-4acd067e4e9e-dbus-socket\") pod \"nmstate-handler-wpxjd\" (UID: \"07e19a00-064c-401a-9c0c-4acd067e4e9e\") " pod="openshift-nmstate/nmstate-handler-wpxjd" Jan 22 13:57:46 crc kubenswrapper[4743]: I0122 13:57:46.057861 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/33f98d3b-f0ea-45dd-8fca-d942067e31ad-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-mtdch\" (UID: \"33f98d3b-f0ea-45dd-8fca-d942067e31ad\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-mtdch" Jan 22 13:57:46 crc kubenswrapper[4743]: I0122 13:57:46.057926 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxbvc\" (UniqueName: \"kubernetes.io/projected/f50a3b80-43ad-46a0-b124-0249185f922b-kube-api-access-dxbvc\") pod \"nmstate-metrics-54757c584b-5kgz5\" (UID: \"f50a3b80-43ad-46a0-b124-0249185f922b\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-5kgz5" Jan 22 13:57:46 crc kubenswrapper[4743]: I0122 13:57:46.057951 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/07e19a00-064c-401a-9c0c-4acd067e4e9e-ovs-socket\") pod \"nmstate-handler-wpxjd\" (UID: \"07e19a00-064c-401a-9c0c-4acd067e4e9e\") " pod="openshift-nmstate/nmstate-handler-wpxjd" Jan 22 13:57:46 crc kubenswrapper[4743]: I0122 13:57:46.057986 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/27e7180c-e024-4412-9840-ddeb074d70c8-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-pvbjf\" (UID: \"27e7180c-e024-4412-9840-ddeb074d70c8\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-pvbjf" Jan 22 13:57:46 crc kubenswrapper[4743]: I0122 13:57:46.058016 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9szb7\" (UniqueName: \"kubernetes.io/projected/33f98d3b-f0ea-45dd-8fca-d942067e31ad-kube-api-access-9szb7\") pod \"nmstate-webhook-8474b5b9d8-mtdch\" (UID: \"33f98d3b-f0ea-45dd-8fca-d942067e31ad\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-mtdch" Jan 22 13:57:46 crc kubenswrapper[4743]: I0122 13:57:46.058066 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/07e19a00-064c-401a-9c0c-4acd067e4e9e-nmstate-lock\") pod \"nmstate-handler-wpxjd\" (UID: \"07e19a00-064c-401a-9c0c-4acd067e4e9e\") " pod="openshift-nmstate/nmstate-handler-wpxjd" Jan 22 13:57:46 crc kubenswrapper[4743]: I0122 13:57:46.058096 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/27e7180c-e024-4412-9840-ddeb074d70c8-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-pvbjf\" (UID: \"27e7180c-e024-4412-9840-ddeb074d70c8\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-pvbjf" Jan 22 13:57:46 crc kubenswrapper[4743]: I0122 13:57:46.058192 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/07e19a00-064c-401a-9c0c-4acd067e4e9e-ovs-socket\") pod \"nmstate-handler-wpxjd\" (UID: \"07e19a00-064c-401a-9c0c-4acd067e4e9e\") " pod="openshift-nmstate/nmstate-handler-wpxjd" Jan 22 13:57:46 crc kubenswrapper[4743]: I0122 13:57:46.058258 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/07e19a00-064c-401a-9c0c-4acd067e4e9e-dbus-socket\") pod \"nmstate-handler-wpxjd\" (UID: \"07e19a00-064c-401a-9c0c-4acd067e4e9e\") " pod="openshift-nmstate/nmstate-handler-wpxjd" Jan 22 13:57:46 crc kubenswrapper[4743]: I0122 13:57:46.058319 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/07e19a00-064c-401a-9c0c-4acd067e4e9e-nmstate-lock\") pod \"nmstate-handler-wpxjd\" (UID: \"07e19a00-064c-401a-9c0c-4acd067e4e9e\") " pod="openshift-nmstate/nmstate-handler-wpxjd" Jan 22 13:57:46 crc kubenswrapper[4743]: I0122 13:57:46.063479 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/33f98d3b-f0ea-45dd-8fca-d942067e31ad-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-mtdch\" (UID: \"33f98d3b-f0ea-45dd-8fca-d942067e31ad\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-mtdch" Jan 22 13:57:46 crc kubenswrapper[4743]: I0122 13:57:46.080280 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxbvc\" (UniqueName: \"kubernetes.io/projected/f50a3b80-43ad-46a0-b124-0249185f922b-kube-api-access-dxbvc\") pod \"nmstate-metrics-54757c584b-5kgz5\" (UID: \"f50a3b80-43ad-46a0-b124-0249185f922b\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-5kgz5" Jan 22 13:57:46 crc kubenswrapper[4743]: I0122 13:57:46.082194 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvf4q\" (UniqueName: \"kubernetes.io/projected/07e19a00-064c-401a-9c0c-4acd067e4e9e-kube-api-access-pvf4q\") pod \"nmstate-handler-wpxjd\" (UID: \"07e19a00-064c-401a-9c0c-4acd067e4e9e\") " pod="openshift-nmstate/nmstate-handler-wpxjd" Jan 22 13:57:46 crc kubenswrapper[4743]: I0122 13:57:46.086370 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9szb7\" (UniqueName: \"kubernetes.io/projected/33f98d3b-f0ea-45dd-8fca-d942067e31ad-kube-api-access-9szb7\") pod \"nmstate-webhook-8474b5b9d8-mtdch\" (UID: \"33f98d3b-f0ea-45dd-8fca-d942067e31ad\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-mtdch" Jan 22 13:57:46 crc kubenswrapper[4743]: I0122 13:57:46.117194 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-77b59f9678-wpc59"] 
Jan 22 13:57:46 crc kubenswrapper[4743]: I0122 13:57:46.117910 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-77b59f9678-wpc59" Jan 22 13:57:46 crc kubenswrapper[4743]: I0122 13:57:46.118286 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-54757c584b-5kgz5" Jan 22 13:57:46 crc kubenswrapper[4743]: I0122 13:57:46.127632 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-mtdch" Jan 22 13:57:46 crc kubenswrapper[4743]: I0122 13:57:46.135677 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-77b59f9678-wpc59"] Jan 22 13:57:46 crc kubenswrapper[4743]: I0122 13:57:46.160780 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/27e7180c-e024-4412-9840-ddeb074d70c8-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-pvbjf\" (UID: \"27e7180c-e024-4412-9840-ddeb074d70c8\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-pvbjf" Jan 22 13:57:46 crc kubenswrapper[4743]: I0122 13:57:46.160871 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzkjn\" (UniqueName: \"kubernetes.io/projected/27e7180c-e024-4412-9840-ddeb074d70c8-kube-api-access-gzkjn\") pod \"nmstate-console-plugin-7754f76f8b-pvbjf\" (UID: \"27e7180c-e024-4412-9840-ddeb074d70c8\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-pvbjf" Jan 22 13:57:46 crc kubenswrapper[4743]: I0122 13:57:46.160921 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/27e7180c-e024-4412-9840-ddeb074d70c8-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-pvbjf\" (UID: \"27e7180c-e024-4412-9840-ddeb074d70c8\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-pvbjf" Jan 22 13:57:46 crc kubenswrapper[4743]: I0122 13:57:46.161935 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/27e7180c-e024-4412-9840-ddeb074d70c8-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-pvbjf\" (UID: \"27e7180c-e024-4412-9840-ddeb074d70c8\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-pvbjf" Jan 22 13:57:46 crc kubenswrapper[4743]: I0122 13:57:46.169431 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-wpxjd" Jan 22 13:57:46 crc kubenswrapper[4743]: I0122 13:57:46.171487 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/27e7180c-e024-4412-9840-ddeb074d70c8-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-pvbjf\" (UID: \"27e7180c-e024-4412-9840-ddeb074d70c8\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-pvbjf" Jan 22 13:57:46 crc kubenswrapper[4743]: I0122 13:57:46.182779 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzkjn\" (UniqueName: \"kubernetes.io/projected/27e7180c-e024-4412-9840-ddeb074d70c8-kube-api-access-gzkjn\") pod \"nmstate-console-plugin-7754f76f8b-pvbjf\" (UID: \"27e7180c-e024-4412-9840-ddeb074d70c8\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-pvbjf" Jan 22 13:57:46 crc kubenswrapper[4743]: I0122 13:57:46.261959 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nkm5\" (UniqueName: \"kubernetes.io/projected/088c22e6-e022-46e2-96f4-1df266647ca5-kube-api-access-7nkm5\") pod \"console-77b59f9678-wpc59\" (UID: \"088c22e6-e022-46e2-96f4-1df266647ca5\") " pod="openshift-console/console-77b59f9678-wpc59" Jan 22 13:57:46 crc kubenswrapper[4743]: I0122 13:57:46.262293 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/088c22e6-e022-46e2-96f4-1df266647ca5-service-ca\") pod \"console-77b59f9678-wpc59\" (UID: \"088c22e6-e022-46e2-96f4-1df266647ca5\") " pod="openshift-console/console-77b59f9678-wpc59" Jan 22 13:57:46 crc kubenswrapper[4743]: I0122 13:57:46.262329 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/088c22e6-e022-46e2-96f4-1df266647ca5-console-serving-cert\") pod \"console-77b59f9678-wpc59\" (UID: \"088c22e6-e022-46e2-96f4-1df266647ca5\") " pod="openshift-console/console-77b59f9678-wpc59" Jan 22 13:57:46 crc kubenswrapper[4743]: I0122 13:57:46.262351 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/088c22e6-e022-46e2-96f4-1df266647ca5-console-config\") pod \"console-77b59f9678-wpc59\" (UID: \"088c22e6-e022-46e2-96f4-1df266647ca5\") " pod="openshift-console/console-77b59f9678-wpc59" Jan 22 13:57:46 crc kubenswrapper[4743]: I0122 13:57:46.262381 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/088c22e6-e022-46e2-96f4-1df266647ca5-console-oauth-config\") pod \"console-77b59f9678-wpc59\" (UID: \"088c22e6-e022-46e2-96f4-1df266647ca5\") " pod="openshift-console/console-77b59f9678-wpc59" Jan 22 13:57:46 crc kubenswrapper[4743]: I0122 13:57:46.262411 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/088c22e6-e022-46e2-96f4-1df266647ca5-trusted-ca-bundle\") pod \"console-77b59f9678-wpc59\" (UID: \"088c22e6-e022-46e2-96f4-1df266647ca5\") " pod="openshift-console/console-77b59f9678-wpc59" Jan 22 13:57:46 crc kubenswrapper[4743]: I0122 13:57:46.262430 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/088c22e6-e022-46e2-96f4-1df266647ca5-oauth-serving-cert\") pod \"console-77b59f9678-wpc59\" (UID: \"088c22e6-e022-46e2-96f4-1df266647ca5\") " pod="openshift-console/console-77b59f9678-wpc59" Jan 22 13:57:46 crc kubenswrapper[4743]: I0122 13:57:46.272720 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-pvbjf" Jan 22 13:57:46 crc kubenswrapper[4743]: I0122 13:57:46.327813 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-5kgz5"] Jan 22 13:57:46 crc kubenswrapper[4743]: W0122 13:57:46.336396 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf50a3b80_43ad_46a0_b124_0249185f922b.slice/crio-9cc6f92638830db198f37dd2cb6393607bf80559a6a0e8e7b8c7cf2aa33fe5da WatchSource:0}: Error finding container 9cc6f92638830db198f37dd2cb6393607bf80559a6a0e8e7b8c7cf2aa33fe5da: Status 404 returned error can't find the container with id 9cc6f92638830db198f37dd2cb6393607bf80559a6a0e8e7b8c7cf2aa33fe5da Jan 22 13:57:46 crc kubenswrapper[4743]: I0122 13:57:46.366197 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/088c22e6-e022-46e2-96f4-1df266647ca5-trusted-ca-bundle\") pod \"console-77b59f9678-wpc59\" (UID: \"088c22e6-e022-46e2-96f4-1df266647ca5\") " pod="openshift-console/console-77b59f9678-wpc59" Jan 22 13:57:46 crc kubenswrapper[4743]: I0122 13:57:46.366244 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/088c22e6-e022-46e2-96f4-1df266647ca5-oauth-serving-cert\") pod \"console-77b59f9678-wpc59\" (UID: \"088c22e6-e022-46e2-96f4-1df266647ca5\") " pod="openshift-console/console-77b59f9678-wpc59" Jan 22 13:57:46 crc kubenswrapper[4743]: I0122 13:57:46.366274 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nkm5\" (UniqueName: \"kubernetes.io/projected/088c22e6-e022-46e2-96f4-1df266647ca5-kube-api-access-7nkm5\") pod \"console-77b59f9678-wpc59\" (UID: \"088c22e6-e022-46e2-96f4-1df266647ca5\") " pod="openshift-console/console-77b59f9678-wpc59" Jan 22 13:57:46 crc kubenswrapper[4743]: I0122 13:57:46.366305 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/088c22e6-e022-46e2-96f4-1df266647ca5-service-ca\") pod \"console-77b59f9678-wpc59\" (UID: \"088c22e6-e022-46e2-96f4-1df266647ca5\") " pod="openshift-console/console-77b59f9678-wpc59" Jan 22 13:57:46 crc kubenswrapper[4743]: I0122 13:57:46.366366 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/088c22e6-e022-46e2-96f4-1df266647ca5-console-serving-cert\") pod \"console-77b59f9678-wpc59\" (UID: \"088c22e6-e022-46e2-96f4-1df266647ca5\") " pod="openshift-console/console-77b59f9678-wpc59" Jan 22 13:57:46 crc kubenswrapper[4743]: I0122 13:57:46.366398 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/088c22e6-e022-46e2-96f4-1df266647ca5-console-config\") pod \"console-77b59f9678-wpc59\" (UID: \"088c22e6-e022-46e2-96f4-1df266647ca5\") " pod="openshift-console/console-77b59f9678-wpc59" Jan 22 13:57:46 crc 
kubenswrapper[4743]: I0122 13:57:46.366457 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/088c22e6-e022-46e2-96f4-1df266647ca5-console-oauth-config\") pod \"console-77b59f9678-wpc59\" (UID: \"088c22e6-e022-46e2-96f4-1df266647ca5\") " pod="openshift-console/console-77b59f9678-wpc59" Jan 22 13:57:46 crc kubenswrapper[4743]: I0122 13:57:46.369443 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/088c22e6-e022-46e2-96f4-1df266647ca5-trusted-ca-bundle\") pod \"console-77b59f9678-wpc59\" (UID: \"088c22e6-e022-46e2-96f4-1df266647ca5\") " pod="openshift-console/console-77b59f9678-wpc59" Jan 22 13:57:46 crc kubenswrapper[4743]: I0122 13:57:46.370680 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/088c22e6-e022-46e2-96f4-1df266647ca5-console-oauth-config\") pod \"console-77b59f9678-wpc59\" (UID: \"088c22e6-e022-46e2-96f4-1df266647ca5\") " pod="openshift-console/console-77b59f9678-wpc59" Jan 22 13:57:46 crc kubenswrapper[4743]: I0122 13:57:46.370824 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/088c22e6-e022-46e2-96f4-1df266647ca5-oauth-serving-cert\") pod \"console-77b59f9678-wpc59\" (UID: \"088c22e6-e022-46e2-96f4-1df266647ca5\") " pod="openshift-console/console-77b59f9678-wpc59" Jan 22 13:57:46 crc kubenswrapper[4743]: I0122 13:57:46.371250 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/088c22e6-e022-46e2-96f4-1df266647ca5-service-ca\") pod \"console-77b59f9678-wpc59\" (UID: \"088c22e6-e022-46e2-96f4-1df266647ca5\") " pod="openshift-console/console-77b59f9678-wpc59" Jan 22 13:57:46 crc kubenswrapper[4743]: I0122 13:57:46.371759 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/088c22e6-e022-46e2-96f4-1df266647ca5-console-config\") pod \"console-77b59f9678-wpc59\" (UID: \"088c22e6-e022-46e2-96f4-1df266647ca5\") " pod="openshift-console/console-77b59f9678-wpc59" Jan 22 13:57:46 crc kubenswrapper[4743]: I0122 13:57:46.378039 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/088c22e6-e022-46e2-96f4-1df266647ca5-console-serving-cert\") pod \"console-77b59f9678-wpc59\" (UID: \"088c22e6-e022-46e2-96f4-1df266647ca5\") " pod="openshift-console/console-77b59f9678-wpc59" Jan 22 13:57:46 crc kubenswrapper[4743]: I0122 13:57:46.381138 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-mtdch"] Jan 22 13:57:46 crc kubenswrapper[4743]: I0122 13:57:46.385950 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nkm5\" (UniqueName: \"kubernetes.io/projected/088c22e6-e022-46e2-96f4-1df266647ca5-kube-api-access-7nkm5\") pod \"console-77b59f9678-wpc59\" (UID: \"088c22e6-e022-46e2-96f4-1df266647ca5\") " pod="openshift-console/console-77b59f9678-wpc59" Jan 22 13:57:46 crc kubenswrapper[4743]: I0122 13:57:46.456062 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-pvbjf"] Jan 22 13:57:46 crc kubenswrapper[4743]: W0122 13:57:46.460390 4743 manager.go:1169] Failed to process watch 
event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27e7180c_e024_4412_9840_ddeb074d70c8.slice/crio-b94379870025b5c05c4c1b3e7d7dd4de61379ea5314899500f54ce8136d55851 WatchSource:0}: Error finding container b94379870025b5c05c4c1b3e7d7dd4de61379ea5314899500f54ce8136d55851: Status 404 returned error can't find the container with id b94379870025b5c05c4c1b3e7d7dd4de61379ea5314899500f54ce8136d55851 Jan 22 13:57:46 crc kubenswrapper[4743]: I0122 13:57:46.486384 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-77b59f9678-wpc59" Jan 22 13:57:46 crc kubenswrapper[4743]: I0122 13:57:46.907617 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-77b59f9678-wpc59"] Jan 22 13:57:46 crc kubenswrapper[4743]: I0122 13:57:46.970943 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-wpxjd" event={"ID":"07e19a00-064c-401a-9c0c-4acd067e4e9e","Type":"ContainerStarted","Data":"ae563bbce150f24256084f7dd0dd4a2f482b62ab072bc6e4aa4b8ea8ac7f774d"} Jan 22 13:57:46 crc kubenswrapper[4743]: I0122 13:57:46.972687 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-mtdch" event={"ID":"33f98d3b-f0ea-45dd-8fca-d942067e31ad","Type":"ContainerStarted","Data":"f2f2c6a131ca3bcf931c41c9b4746754d0d29471ddf5bdcf1ed002b17f9b439a"} Jan 22 13:57:46 crc kubenswrapper[4743]: I0122 13:57:46.973783 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-pvbjf" event={"ID":"27e7180c-e024-4412-9840-ddeb074d70c8","Type":"ContainerStarted","Data":"b94379870025b5c05c4c1b3e7d7dd4de61379ea5314899500f54ce8136d55851"} Jan 22 13:57:46 crc kubenswrapper[4743]: I0122 13:57:46.976393 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-5kgz5" event={"ID":"f50a3b80-43ad-46a0-b124-0249185f922b","Type":"ContainerStarted","Data":"9cc6f92638830db198f37dd2cb6393607bf80559a6a0e8e7b8c7cf2aa33fe5da"} Jan 22 13:57:46 crc kubenswrapper[4743]: I0122 13:57:46.977496 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77b59f9678-wpc59" event={"ID":"088c22e6-e022-46e2-96f4-1df266647ca5","Type":"ContainerStarted","Data":"0ffde96015e1a61820617b59998c0bbee92467e8ed492d044596a92335850ec8"} Jan 22 13:57:47 crc kubenswrapper[4743]: I0122 13:57:47.985077 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77b59f9678-wpc59" event={"ID":"088c22e6-e022-46e2-96f4-1df266647ca5","Type":"ContainerStarted","Data":"0f9812d3619122281e56fa8d2eee0f5f301dee74ddfbc185eb1f3b22eb204bd7"} Jan 22 13:57:50 crc kubenswrapper[4743]: I0122 13:57:50.007129 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-5kgz5" event={"ID":"f50a3b80-43ad-46a0-b124-0249185f922b","Type":"ContainerStarted","Data":"0cd953f67d141ba7d26c622dfd54ada2e3412f66dee192712321eba867529245"} Jan 22 13:57:50 crc kubenswrapper[4743]: I0122 13:57:50.009781 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-wpxjd" event={"ID":"07e19a00-064c-401a-9c0c-4acd067e4e9e","Type":"ContainerStarted","Data":"76570e4c217ff59a4de96f4a0176a5eafa5e0ff88d2baa07153c0adb2dee7da0"} Jan 22 13:57:50 crc kubenswrapper[4743]: I0122 13:57:50.009997 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-wpxjd" Jan 
22 13:57:50 crc kubenswrapper[4743]: I0122 13:57:50.012460 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-mtdch" event={"ID":"33f98d3b-f0ea-45dd-8fca-d942067e31ad","Type":"ContainerStarted","Data":"3b93734e2f3610a6d20d0c233f1a7dc94a98efb21f61fb54b8549b3befc1ecf0"} Jan 22 13:57:50 crc kubenswrapper[4743]: I0122 13:57:50.012522 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-mtdch" Jan 22 13:57:50 crc kubenswrapper[4743]: I0122 13:57:50.013552 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-pvbjf" event={"ID":"27e7180c-e024-4412-9840-ddeb074d70c8","Type":"ContainerStarted","Data":"500984fe8883485edada06f2afc5fbec3743183f26867aea9a0c3cb2e2ca6080"} Jan 22 13:57:50 crc kubenswrapper[4743]: I0122 13:57:50.029844 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-77b59f9678-wpc59" podStartSLOduration=4.029827912 podStartE2EDuration="4.029827912s" podCreationTimestamp="2026-01-22 13:57:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:57:48.004437153 +0000 UTC m=+704.559480316" watchObservedRunningTime="2026-01-22 13:57:50.029827912 +0000 UTC m=+706.584871075" Jan 22 13:57:50 crc kubenswrapper[4743]: I0122 13:57:50.030505 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-wpxjd" podStartSLOduration=2.023393142 podStartE2EDuration="5.03049993s" podCreationTimestamp="2026-01-22 13:57:45 +0000 UTC" firstStartedPulling="2026-01-22 13:57:46.189231844 +0000 UTC m=+702.744275007" lastFinishedPulling="2026-01-22 13:57:49.196338612 +0000 UTC m=+705.751381795" observedRunningTime="2026-01-22 13:57:50.027145591 +0000 UTC m=+706.582188794" watchObservedRunningTime="2026-01-22 13:57:50.03049993 +0000 UTC m=+706.585543093" Jan 22 13:57:50 crc kubenswrapper[4743]: I0122 13:57:50.049997 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-mtdch" podStartSLOduration=2.240841096 podStartE2EDuration="5.049983057s" podCreationTimestamp="2026-01-22 13:57:45 +0000 UTC" firstStartedPulling="2026-01-22 13:57:46.392335677 +0000 UTC m=+702.947378840" lastFinishedPulling="2026-01-22 13:57:49.201477628 +0000 UTC m=+705.756520801" observedRunningTime="2026-01-22 13:57:50.046974477 +0000 UTC m=+706.602017640" watchObservedRunningTime="2026-01-22 13:57:50.049983057 +0000 UTC m=+706.605026220" Jan 22 13:57:50 crc kubenswrapper[4743]: I0122 13:57:50.063057 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-pvbjf" podStartSLOduration=2.334851477 podStartE2EDuration="5.063038773s" podCreationTimestamp="2026-01-22 13:57:45 +0000 UTC" firstStartedPulling="2026-01-22 13:57:46.462579749 +0000 UTC m=+703.017622912" lastFinishedPulling="2026-01-22 13:57:49.190767024 +0000 UTC m=+705.745810208" observedRunningTime="2026-01-22 13:57:50.060870615 +0000 UTC m=+706.615913788" watchObservedRunningTime="2026-01-22 13:57:50.063038773 +0000 UTC m=+706.618081936" Jan 22 13:57:52 crc kubenswrapper[4743]: I0122 13:57:52.034754 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-5kgz5" 
event={"ID":"f50a3b80-43ad-46a0-b124-0249185f922b","Type":"ContainerStarted","Data":"044739338e7645b9351728f806763ee47fa761add7cb220e3f4b4bb39099e2b8"} Jan 22 13:57:52 crc kubenswrapper[4743]: I0122 13:57:52.062069 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-54757c584b-5kgz5" podStartSLOduration=1.789424292 podStartE2EDuration="7.062052662s" podCreationTimestamp="2026-01-22 13:57:45 +0000 UTC" firstStartedPulling="2026-01-22 13:57:46.338660435 +0000 UTC m=+702.893703598" lastFinishedPulling="2026-01-22 13:57:51.611288795 +0000 UTC m=+708.166331968" observedRunningTime="2026-01-22 13:57:52.059326169 +0000 UTC m=+708.614369332" watchObservedRunningTime="2026-01-22 13:57:52.062052662 +0000 UTC m=+708.617095825" Jan 22 13:57:56 crc kubenswrapper[4743]: I0122 13:57:56.198650 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-wpxjd" Jan 22 13:57:56 crc kubenswrapper[4743]: I0122 13:57:56.487098 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-77b59f9678-wpc59" Jan 22 13:57:56 crc kubenswrapper[4743]: I0122 13:57:56.487163 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-77b59f9678-wpc59" Jan 22 13:57:56 crc kubenswrapper[4743]: I0122 13:57:56.492897 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-77b59f9678-wpc59" Jan 22 13:57:57 crc kubenswrapper[4743]: I0122 13:57:57.096554 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-77b59f9678-wpc59" Jan 22 13:57:57 crc kubenswrapper[4743]: I0122 13:57:57.163495 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-ln28w"] Jan 22 13:58:00 crc kubenswrapper[4743]: I0122 13:58:00.048830 4743 patch_prober.go:28] interesting pod/machine-config-daemon-hqgk7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 13:58:00 crc kubenswrapper[4743]: I0122 13:58:00.049213 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 13:58:06 crc kubenswrapper[4743]: I0122 13:58:06.136994 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-mtdch" Jan 22 13:58:18 crc kubenswrapper[4743]: I0122 13:58:18.540808 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchtrhf"] Jan 22 13:58:18 crc kubenswrapper[4743]: I0122 13:58:18.542806 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchtrhf" Jan 22 13:58:18 crc kubenswrapper[4743]: I0122 13:58:18.544457 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 22 13:58:18 crc kubenswrapper[4743]: I0122 13:58:18.552102 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchtrhf"] Jan 22 13:58:18 crc kubenswrapper[4743]: I0122 13:58:18.653876 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f8539c26-c29d-4ff4-91f8-c77fb1aa6a2a-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchtrhf\" (UID: \"f8539c26-c29d-4ff4-91f8-c77fb1aa6a2a\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchtrhf" Jan 22 13:58:18 crc kubenswrapper[4743]: I0122 13:58:18.653978 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f8539c26-c29d-4ff4-91f8-c77fb1aa6a2a-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchtrhf\" (UID: \"f8539c26-c29d-4ff4-91f8-c77fb1aa6a2a\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchtrhf" Jan 22 13:58:18 crc kubenswrapper[4743]: I0122 13:58:18.654012 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l58qh\" (UniqueName: \"kubernetes.io/projected/f8539c26-c29d-4ff4-91f8-c77fb1aa6a2a-kube-api-access-l58qh\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchtrhf\" (UID: \"f8539c26-c29d-4ff4-91f8-c77fb1aa6a2a\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchtrhf" Jan 22 13:58:18 crc kubenswrapper[4743]: I0122 13:58:18.754622 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f8539c26-c29d-4ff4-91f8-c77fb1aa6a2a-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchtrhf\" (UID: \"f8539c26-c29d-4ff4-91f8-c77fb1aa6a2a\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchtrhf" Jan 22 13:58:18 crc kubenswrapper[4743]: I0122 13:58:18.754669 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l58qh\" (UniqueName: \"kubernetes.io/projected/f8539c26-c29d-4ff4-91f8-c77fb1aa6a2a-kube-api-access-l58qh\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchtrhf\" (UID: \"f8539c26-c29d-4ff4-91f8-c77fb1aa6a2a\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchtrhf" Jan 22 13:58:18 crc kubenswrapper[4743]: I0122 13:58:18.754717 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f8539c26-c29d-4ff4-91f8-c77fb1aa6a2a-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchtrhf\" (UID: \"f8539c26-c29d-4ff4-91f8-c77fb1aa6a2a\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchtrhf" Jan 22 13:58:18 crc kubenswrapper[4743]: I0122 13:58:18.755417 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/f8539c26-c29d-4ff4-91f8-c77fb1aa6a2a-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchtrhf\" (UID: \"f8539c26-c29d-4ff4-91f8-c77fb1aa6a2a\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchtrhf" Jan 22 13:58:18 crc kubenswrapper[4743]: I0122 13:58:18.755468 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f8539c26-c29d-4ff4-91f8-c77fb1aa6a2a-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchtrhf\" (UID: \"f8539c26-c29d-4ff4-91f8-c77fb1aa6a2a\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchtrhf" Jan 22 13:58:18 crc kubenswrapper[4743]: I0122 13:58:18.779102 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l58qh\" (UniqueName: \"kubernetes.io/projected/f8539c26-c29d-4ff4-91f8-c77fb1aa6a2a-kube-api-access-l58qh\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchtrhf\" (UID: \"f8539c26-c29d-4ff4-91f8-c77fb1aa6a2a\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchtrhf" Jan 22 13:58:18 crc kubenswrapper[4743]: I0122 13:58:18.871463 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchtrhf" Jan 22 13:58:19 crc kubenswrapper[4743]: I0122 13:58:19.065333 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchtrhf"] Jan 22 13:58:19 crc kubenswrapper[4743]: I0122 13:58:19.219191 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchtrhf" event={"ID":"f8539c26-c29d-4ff4-91f8-c77fb1aa6a2a","Type":"ContainerStarted","Data":"089a545e3433690cda3cfa2185f2322b3db7ec1d8c1b9cd72b1778dc20636ad1"} Jan 22 13:58:20 crc kubenswrapper[4743]: I0122 13:58:20.229344 4743 generic.go:334] "Generic (PLEG): container finished" podID="f8539c26-c29d-4ff4-91f8-c77fb1aa6a2a" containerID="eb8645053725e0d99bcc9d1ce3f6f3730eecc5f6fce74cbcfce958834706c813" exitCode=0 Jan 22 13:58:20 crc kubenswrapper[4743]: I0122 13:58:20.229898 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchtrhf" event={"ID":"f8539c26-c29d-4ff4-91f8-c77fb1aa6a2a","Type":"ContainerDied","Data":"eb8645053725e0d99bcc9d1ce3f6f3730eecc5f6fce74cbcfce958834706c813"} Jan 22 13:58:22 crc kubenswrapper[4743]: I0122 13:58:22.214138 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-ln28w" podUID="a11f3169-f731-464a-a7d4-9dea61d28398" containerName="console" containerID="cri-o://b1cdde30398ef03672761f0be92fdf53bf7593a92edd373a4cbea7acd4df7d1a" gracePeriod=15 Jan 22 13:58:22 crc kubenswrapper[4743]: I0122 13:58:22.245290 4743 generic.go:334] "Generic (PLEG): container finished" podID="f8539c26-c29d-4ff4-91f8-c77fb1aa6a2a" containerID="e0e25d1cfcccd16107c450b807bce387101723badc21c6b97bd05a31fd631fee" exitCode=0 Jan 22 13:58:22 crc kubenswrapper[4743]: I0122 13:58:22.245343 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchtrhf" 
event={"ID":"f8539c26-c29d-4ff4-91f8-c77fb1aa6a2a","Type":"ContainerDied","Data":"e0e25d1cfcccd16107c450b807bce387101723badc21c6b97bd05a31fd631fee"} Jan 22 13:58:22 crc kubenswrapper[4743]: I0122 13:58:22.612666 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-ln28w_a11f3169-f731-464a-a7d4-9dea61d28398/console/0.log" Jan 22 13:58:22 crc kubenswrapper[4743]: I0122 13:58:22.612721 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-ln28w" Jan 22 13:58:22 crc kubenswrapper[4743]: I0122 13:58:22.726839 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a11f3169-f731-464a-a7d4-9dea61d28398-console-serving-cert\") pod \"a11f3169-f731-464a-a7d4-9dea61d28398\" (UID: \"a11f3169-f731-464a-a7d4-9dea61d28398\") " Jan 22 13:58:22 crc kubenswrapper[4743]: I0122 13:58:22.727284 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a11f3169-f731-464a-a7d4-9dea61d28398-console-oauth-config\") pod \"a11f3169-f731-464a-a7d4-9dea61d28398\" (UID: \"a11f3169-f731-464a-a7d4-9dea61d28398\") " Jan 22 13:58:22 crc kubenswrapper[4743]: I0122 13:58:22.727326 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a11f3169-f731-464a-a7d4-9dea61d28398-console-config\") pod \"a11f3169-f731-464a-a7d4-9dea61d28398\" (UID: \"a11f3169-f731-464a-a7d4-9dea61d28398\") " Jan 22 13:58:22 crc kubenswrapper[4743]: I0122 13:58:22.727366 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a11f3169-f731-464a-a7d4-9dea61d28398-trusted-ca-bundle\") pod \"a11f3169-f731-464a-a7d4-9dea61d28398\" (UID: \"a11f3169-f731-464a-a7d4-9dea61d28398\") " Jan 22 13:58:22 crc kubenswrapper[4743]: I0122 13:58:22.727414 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a11f3169-f731-464a-a7d4-9dea61d28398-service-ca\") pod \"a11f3169-f731-464a-a7d4-9dea61d28398\" (UID: \"a11f3169-f731-464a-a7d4-9dea61d28398\") " Jan 22 13:58:22 crc kubenswrapper[4743]: I0122 13:58:22.728269 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a11f3169-f731-464a-a7d4-9dea61d28398-console-config" (OuterVolumeSpecName: "console-config") pod "a11f3169-f731-464a-a7d4-9dea61d28398" (UID: "a11f3169-f731-464a-a7d4-9dea61d28398"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:58:22 crc kubenswrapper[4743]: I0122 13:58:22.728325 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a11f3169-f731-464a-a7d4-9dea61d28398-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "a11f3169-f731-464a-a7d4-9dea61d28398" (UID: "a11f3169-f731-464a-a7d4-9dea61d28398"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:58:22 crc kubenswrapper[4743]: I0122 13:58:22.728355 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a11f3169-f731-464a-a7d4-9dea61d28398-service-ca" (OuterVolumeSpecName: "service-ca") pod "a11f3169-f731-464a-a7d4-9dea61d28398" (UID: "a11f3169-f731-464a-a7d4-9dea61d28398"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:58:22 crc kubenswrapper[4743]: I0122 13:58:22.728518 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2fmd\" (UniqueName: \"kubernetes.io/projected/a11f3169-f731-464a-a7d4-9dea61d28398-kube-api-access-m2fmd\") pod \"a11f3169-f731-464a-a7d4-9dea61d28398\" (UID: \"a11f3169-f731-464a-a7d4-9dea61d28398\") " Jan 22 13:58:22 crc kubenswrapper[4743]: I0122 13:58:22.729164 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a11f3169-f731-464a-a7d4-9dea61d28398-oauth-serving-cert\") pod \"a11f3169-f731-464a-a7d4-9dea61d28398\" (UID: \"a11f3169-f731-464a-a7d4-9dea61d28398\") " Jan 22 13:58:22 crc kubenswrapper[4743]: I0122 13:58:22.729896 4743 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a11f3169-f731-464a-a7d4-9dea61d28398-console-config\") on node \"crc\" DevicePath \"\"" Jan 22 13:58:22 crc kubenswrapper[4743]: I0122 13:58:22.729940 4743 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a11f3169-f731-464a-a7d4-9dea61d28398-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 13:58:22 crc kubenswrapper[4743]: I0122 13:58:22.729965 4743 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a11f3169-f731-464a-a7d4-9dea61d28398-service-ca\") on node \"crc\" DevicePath \"\"" Jan 22 13:58:22 crc kubenswrapper[4743]: I0122 13:58:22.729985 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a11f3169-f731-464a-a7d4-9dea61d28398-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "a11f3169-f731-464a-a7d4-9dea61d28398" (UID: "a11f3169-f731-464a-a7d4-9dea61d28398"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 13:58:22 crc kubenswrapper[4743]: I0122 13:58:22.736237 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a11f3169-f731-464a-a7d4-9dea61d28398-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "a11f3169-f731-464a-a7d4-9dea61d28398" (UID: "a11f3169-f731-464a-a7d4-9dea61d28398"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:58:22 crc kubenswrapper[4743]: I0122 13:58:22.736475 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a11f3169-f731-464a-a7d4-9dea61d28398-kube-api-access-m2fmd" (OuterVolumeSpecName: "kube-api-access-m2fmd") pod "a11f3169-f731-464a-a7d4-9dea61d28398" (UID: "a11f3169-f731-464a-a7d4-9dea61d28398"). InnerVolumeSpecName "kube-api-access-m2fmd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:58:22 crc kubenswrapper[4743]: I0122 13:58:22.737480 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a11f3169-f731-464a-a7d4-9dea61d28398-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "a11f3169-f731-464a-a7d4-9dea61d28398" (UID: "a11f3169-f731-464a-a7d4-9dea61d28398"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 13:58:22 crc kubenswrapper[4743]: I0122 13:58:22.834080 4743 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a11f3169-f731-464a-a7d4-9dea61d28398-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 13:58:22 crc kubenswrapper[4743]: I0122 13:58:22.834128 4743 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a11f3169-f731-464a-a7d4-9dea61d28398-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 22 13:58:22 crc kubenswrapper[4743]: I0122 13:58:22.834147 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2fmd\" (UniqueName: \"kubernetes.io/projected/a11f3169-f731-464a-a7d4-9dea61d28398-kube-api-access-m2fmd\") on node \"crc\" DevicePath \"\"" Jan 22 13:58:22 crc kubenswrapper[4743]: I0122 13:58:22.834168 4743 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a11f3169-f731-464a-a7d4-9dea61d28398-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 13:58:23 crc kubenswrapper[4743]: I0122 13:58:23.257102 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-ln28w_a11f3169-f731-464a-a7d4-9dea61d28398/console/0.log" Jan 22 13:58:23 crc kubenswrapper[4743]: I0122 13:58:23.257169 4743 generic.go:334] "Generic (PLEG): container finished" podID="a11f3169-f731-464a-a7d4-9dea61d28398" containerID="b1cdde30398ef03672761f0be92fdf53bf7593a92edd373a4cbea7acd4df7d1a" exitCode=2 Jan 22 13:58:23 crc kubenswrapper[4743]: I0122 13:58:23.257247 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-ln28w" event={"ID":"a11f3169-f731-464a-a7d4-9dea61d28398","Type":"ContainerDied","Data":"b1cdde30398ef03672761f0be92fdf53bf7593a92edd373a4cbea7acd4df7d1a"} Jan 22 13:58:23 crc kubenswrapper[4743]: I0122 13:58:23.257286 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-ln28w" event={"ID":"a11f3169-f731-464a-a7d4-9dea61d28398","Type":"ContainerDied","Data":"8213d799c7281777c4e0021c20e909025170d0aaf1aebfd52c5242a6b1b6c682"} Jan 22 13:58:23 crc kubenswrapper[4743]: I0122 13:58:23.257313 4743 scope.go:117] "RemoveContainer" containerID="b1cdde30398ef03672761f0be92fdf53bf7593a92edd373a4cbea7acd4df7d1a" Jan 22 13:58:23 crc kubenswrapper[4743]: I0122 13:58:23.257468 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-ln28w" Jan 22 13:58:23 crc kubenswrapper[4743]: I0122 13:58:23.263331 4743 generic.go:334] "Generic (PLEG): container finished" podID="f8539c26-c29d-4ff4-91f8-c77fb1aa6a2a" containerID="abfe0c31d64f484b65cf89ac557c94d8c0e8ec74beec5ae7c37efad2df6861da" exitCode=0 Jan 22 13:58:23 crc kubenswrapper[4743]: I0122 13:58:23.263387 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchtrhf" event={"ID":"f8539c26-c29d-4ff4-91f8-c77fb1aa6a2a","Type":"ContainerDied","Data":"abfe0c31d64f484b65cf89ac557c94d8c0e8ec74beec5ae7c37efad2df6861da"} Jan 22 13:58:23 crc kubenswrapper[4743]: I0122 13:58:23.281495 4743 scope.go:117] "RemoveContainer" containerID="b1cdde30398ef03672761f0be92fdf53bf7593a92edd373a4cbea7acd4df7d1a" Jan 22 13:58:23 crc kubenswrapper[4743]: E0122 13:58:23.282514 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1cdde30398ef03672761f0be92fdf53bf7593a92edd373a4cbea7acd4df7d1a\": container with ID starting with b1cdde30398ef03672761f0be92fdf53bf7593a92edd373a4cbea7acd4df7d1a not found: ID does not exist" containerID="b1cdde30398ef03672761f0be92fdf53bf7593a92edd373a4cbea7acd4df7d1a" Jan 22 13:58:23 crc kubenswrapper[4743]: I0122 13:58:23.282644 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1cdde30398ef03672761f0be92fdf53bf7593a92edd373a4cbea7acd4df7d1a"} err="failed to get container status \"b1cdde30398ef03672761f0be92fdf53bf7593a92edd373a4cbea7acd4df7d1a\": rpc error: code = NotFound desc = could not find container \"b1cdde30398ef03672761f0be92fdf53bf7593a92edd373a4cbea7acd4df7d1a\": container with ID starting with b1cdde30398ef03672761f0be92fdf53bf7593a92edd373a4cbea7acd4df7d1a not found: ID does not exist" Jan 22 13:58:23 crc kubenswrapper[4743]: I0122 13:58:23.304545 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-ln28w"] Jan 22 13:58:23 crc kubenswrapper[4743]: I0122 13:58:23.310112 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-ln28w"] Jan 22 13:58:23 crc kubenswrapper[4743]: I0122 13:58:23.763432 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a11f3169-f731-464a-a7d4-9dea61d28398" path="/var/lib/kubelet/pods/a11f3169-f731-464a-a7d4-9dea61d28398/volumes" Jan 22 13:58:24 crc kubenswrapper[4743]: I0122 13:58:24.540416 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchtrhf" Jan 22 13:58:24 crc kubenswrapper[4743]: I0122 13:58:24.658503 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l58qh\" (UniqueName: \"kubernetes.io/projected/f8539c26-c29d-4ff4-91f8-c77fb1aa6a2a-kube-api-access-l58qh\") pod \"f8539c26-c29d-4ff4-91f8-c77fb1aa6a2a\" (UID: \"f8539c26-c29d-4ff4-91f8-c77fb1aa6a2a\") " Jan 22 13:58:24 crc kubenswrapper[4743]: I0122 13:58:24.658585 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f8539c26-c29d-4ff4-91f8-c77fb1aa6a2a-bundle\") pod \"f8539c26-c29d-4ff4-91f8-c77fb1aa6a2a\" (UID: \"f8539c26-c29d-4ff4-91f8-c77fb1aa6a2a\") " Jan 22 13:58:24 crc kubenswrapper[4743]: I0122 13:58:24.658620 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f8539c26-c29d-4ff4-91f8-c77fb1aa6a2a-util\") pod \"f8539c26-c29d-4ff4-91f8-c77fb1aa6a2a\" (UID: \"f8539c26-c29d-4ff4-91f8-c77fb1aa6a2a\") " Jan 22 13:58:24 crc kubenswrapper[4743]: I0122 13:58:24.659978 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8539c26-c29d-4ff4-91f8-c77fb1aa6a2a-bundle" (OuterVolumeSpecName: "bundle") pod "f8539c26-c29d-4ff4-91f8-c77fb1aa6a2a" (UID: "f8539c26-c29d-4ff4-91f8-c77fb1aa6a2a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 13:58:24 crc kubenswrapper[4743]: I0122 13:58:24.664037 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8539c26-c29d-4ff4-91f8-c77fb1aa6a2a-kube-api-access-l58qh" (OuterVolumeSpecName: "kube-api-access-l58qh") pod "f8539c26-c29d-4ff4-91f8-c77fb1aa6a2a" (UID: "f8539c26-c29d-4ff4-91f8-c77fb1aa6a2a"). InnerVolumeSpecName "kube-api-access-l58qh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 13:58:24 crc kubenswrapper[4743]: I0122 13:58:24.671680 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8539c26-c29d-4ff4-91f8-c77fb1aa6a2a-util" (OuterVolumeSpecName: "util") pod "f8539c26-c29d-4ff4-91f8-c77fb1aa6a2a" (UID: "f8539c26-c29d-4ff4-91f8-c77fb1aa6a2a"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 13:58:24 crc kubenswrapper[4743]: I0122 13:58:24.760209 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l58qh\" (UniqueName: \"kubernetes.io/projected/f8539c26-c29d-4ff4-91f8-c77fb1aa6a2a-kube-api-access-l58qh\") on node \"crc\" DevicePath \"\"" Jan 22 13:58:24 crc kubenswrapper[4743]: I0122 13:58:24.760267 4743 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f8539c26-c29d-4ff4-91f8-c77fb1aa6a2a-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 13:58:24 crc kubenswrapper[4743]: I0122 13:58:24.760286 4743 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f8539c26-c29d-4ff4-91f8-c77fb1aa6a2a-util\") on node \"crc\" DevicePath \"\"" Jan 22 13:58:25 crc kubenswrapper[4743]: I0122 13:58:25.282957 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchtrhf" event={"ID":"f8539c26-c29d-4ff4-91f8-c77fb1aa6a2a","Type":"ContainerDied","Data":"089a545e3433690cda3cfa2185f2322b3db7ec1d8c1b9cd72b1778dc20636ad1"} Jan 22 13:58:25 crc kubenswrapper[4743]: I0122 13:58:25.283043 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="089a545e3433690cda3cfa2185f2322b3db7ec1d8c1b9cd72b1778dc20636ad1" Jan 22 13:58:25 crc kubenswrapper[4743]: I0122 13:58:25.283036 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchtrhf" Jan 22 13:58:30 crc kubenswrapper[4743]: I0122 13:58:30.050081 4743 patch_prober.go:28] interesting pod/machine-config-daemon-hqgk7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 13:58:30 crc kubenswrapper[4743]: I0122 13:58:30.051147 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 13:58:30 crc kubenswrapper[4743]: I0122 13:58:30.051241 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" Jan 22 13:58:30 crc kubenswrapper[4743]: I0122 13:58:30.052573 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"58ca9bbd26d5eab47a0ae4b9a18e996aaf71b3e08e86fac81e851949e21bd947"} pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 22 13:58:30 crc kubenswrapper[4743]: I0122 13:58:30.052706 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerName="machine-config-daemon" containerID="cri-o://58ca9bbd26d5eab47a0ae4b9a18e996aaf71b3e08e86fac81e851949e21bd947" gracePeriod=600 Jan 22 13:58:30 crc kubenswrapper[4743]: I0122 13:58:30.322856 4743 generic.go:334] "Generic (PLEG): container finished" 
podID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerID="58ca9bbd26d5eab47a0ae4b9a18e996aaf71b3e08e86fac81e851949e21bd947" exitCode=0 Jan 22 13:58:30 crc kubenswrapper[4743]: I0122 13:58:30.322944 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" event={"ID":"3aeba9ba-3a5a-4885-8540-d295aadb311b","Type":"ContainerDied","Data":"58ca9bbd26d5eab47a0ae4b9a18e996aaf71b3e08e86fac81e851949e21bd947"} Jan 22 13:58:30 crc kubenswrapper[4743]: I0122 13:58:30.323263 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" event={"ID":"3aeba9ba-3a5a-4885-8540-d295aadb311b","Type":"ContainerStarted","Data":"81047d739858b8f95f8165563bcec3db2c5fc125137b4bc67b44c536e91297dc"} Jan 22 13:58:30 crc kubenswrapper[4743]: I0122 13:58:30.323288 4743 scope.go:117] "RemoveContainer" containerID="4be76e895eacb2fd2b5388927ab0ebc428a547f5a9c9290e40e5eb162b110894" Jan 22 13:58:37 crc kubenswrapper[4743]: I0122 13:58:37.526203 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-6494f4f8f8-zbvgv"] Jan 22 13:58:37 crc kubenswrapper[4743]: E0122 13:58:37.526821 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8539c26-c29d-4ff4-91f8-c77fb1aa6a2a" containerName="pull" Jan 22 13:58:37 crc kubenswrapper[4743]: I0122 13:58:37.526858 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8539c26-c29d-4ff4-91f8-c77fb1aa6a2a" containerName="pull" Jan 22 13:58:37 crc kubenswrapper[4743]: E0122 13:58:37.526876 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8539c26-c29d-4ff4-91f8-c77fb1aa6a2a" containerName="util" Jan 22 13:58:37 crc kubenswrapper[4743]: I0122 13:58:37.527153 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8539c26-c29d-4ff4-91f8-c77fb1aa6a2a" containerName="util" Jan 22 13:58:37 crc kubenswrapper[4743]: E0122 13:58:37.527173 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8539c26-c29d-4ff4-91f8-c77fb1aa6a2a" containerName="extract" Jan 22 13:58:37 crc kubenswrapper[4743]: I0122 13:58:37.527181 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8539c26-c29d-4ff4-91f8-c77fb1aa6a2a" containerName="extract" Jan 22 13:58:37 crc kubenswrapper[4743]: E0122 13:58:37.527192 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a11f3169-f731-464a-a7d4-9dea61d28398" containerName="console" Jan 22 13:58:37 crc kubenswrapper[4743]: I0122 13:58:37.527199 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="a11f3169-f731-464a-a7d4-9dea61d28398" containerName="console" Jan 22 13:58:37 crc kubenswrapper[4743]: I0122 13:58:37.527330 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8539c26-c29d-4ff4-91f8-c77fb1aa6a2a" containerName="extract" Jan 22 13:58:37 crc kubenswrapper[4743]: I0122 13:58:37.527348 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="a11f3169-f731-464a-a7d4-9dea61d28398" containerName="console" Jan 22 13:58:37 crc kubenswrapper[4743]: I0122 13:58:37.527829 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6494f4f8f8-zbvgv" Jan 22 13:58:37 crc kubenswrapper[4743]: I0122 13:58:37.529498 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Jan 22 13:58:37 crc kubenswrapper[4743]: I0122 13:58:37.529911 4743 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Jan 22 13:58:37 crc kubenswrapper[4743]: I0122 13:58:37.529998 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Jan 22 13:58:37 crc kubenswrapper[4743]: I0122 13:58:37.530275 4743 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Jan 22 13:58:37 crc kubenswrapper[4743]: I0122 13:58:37.530841 4743 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-d46n7" Jan 22 13:58:37 crc kubenswrapper[4743]: I0122 13:58:37.543399 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6494f4f8f8-zbvgv"] Jan 22 13:58:37 crc kubenswrapper[4743]: I0122 13:58:37.642057 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/dcd68957-0356-4eda-a65f-77e770aae844-webhook-cert\") pod \"metallb-operator-controller-manager-6494f4f8f8-zbvgv\" (UID: \"dcd68957-0356-4eda-a65f-77e770aae844\") " pod="metallb-system/metallb-operator-controller-manager-6494f4f8f8-zbvgv" Jan 22 13:58:37 crc kubenswrapper[4743]: I0122 13:58:37.642180 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/dcd68957-0356-4eda-a65f-77e770aae844-apiservice-cert\") pod \"metallb-operator-controller-manager-6494f4f8f8-zbvgv\" (UID: \"dcd68957-0356-4eda-a65f-77e770aae844\") " pod="metallb-system/metallb-operator-controller-manager-6494f4f8f8-zbvgv" Jan 22 13:58:37 crc kubenswrapper[4743]: I0122 13:58:37.642212 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrtqr\" (UniqueName: \"kubernetes.io/projected/dcd68957-0356-4eda-a65f-77e770aae844-kube-api-access-jrtqr\") pod \"metallb-operator-controller-manager-6494f4f8f8-zbvgv\" (UID: \"dcd68957-0356-4eda-a65f-77e770aae844\") " pod="metallb-system/metallb-operator-controller-manager-6494f4f8f8-zbvgv" Jan 22 13:58:37 crc kubenswrapper[4743]: I0122 13:58:37.743641 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/dcd68957-0356-4eda-a65f-77e770aae844-webhook-cert\") pod \"metallb-operator-controller-manager-6494f4f8f8-zbvgv\" (UID: \"dcd68957-0356-4eda-a65f-77e770aae844\") " pod="metallb-system/metallb-operator-controller-manager-6494f4f8f8-zbvgv" Jan 22 13:58:37 crc kubenswrapper[4743]: I0122 13:58:37.743729 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/dcd68957-0356-4eda-a65f-77e770aae844-apiservice-cert\") pod \"metallb-operator-controller-manager-6494f4f8f8-zbvgv\" (UID: \"dcd68957-0356-4eda-a65f-77e770aae844\") " pod="metallb-system/metallb-operator-controller-manager-6494f4f8f8-zbvgv" Jan 22 13:58:37 crc kubenswrapper[4743]: I0122 13:58:37.743754 4743 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrtqr\" (UniqueName: \"kubernetes.io/projected/dcd68957-0356-4eda-a65f-77e770aae844-kube-api-access-jrtqr\") pod \"metallb-operator-controller-manager-6494f4f8f8-zbvgv\" (UID: \"dcd68957-0356-4eda-a65f-77e770aae844\") " pod="metallb-system/metallb-operator-controller-manager-6494f4f8f8-zbvgv" Jan 22 13:58:37 crc kubenswrapper[4743]: I0122 13:58:37.752995 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/dcd68957-0356-4eda-a65f-77e770aae844-webhook-cert\") pod \"metallb-operator-controller-manager-6494f4f8f8-zbvgv\" (UID: \"dcd68957-0356-4eda-a65f-77e770aae844\") " pod="metallb-system/metallb-operator-controller-manager-6494f4f8f8-zbvgv" Jan 22 13:58:37 crc kubenswrapper[4743]: I0122 13:58:37.756723 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/dcd68957-0356-4eda-a65f-77e770aae844-apiservice-cert\") pod \"metallb-operator-controller-manager-6494f4f8f8-zbvgv\" (UID: \"dcd68957-0356-4eda-a65f-77e770aae844\") " pod="metallb-system/metallb-operator-controller-manager-6494f4f8f8-zbvgv" Jan 22 13:58:37 crc kubenswrapper[4743]: I0122 13:58:37.771980 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrtqr\" (UniqueName: \"kubernetes.io/projected/dcd68957-0356-4eda-a65f-77e770aae844-kube-api-access-jrtqr\") pod \"metallb-operator-controller-manager-6494f4f8f8-zbvgv\" (UID: \"dcd68957-0356-4eda-a65f-77e770aae844\") " pod="metallb-system/metallb-operator-controller-manager-6494f4f8f8-zbvgv" Jan 22 13:58:37 crc kubenswrapper[4743]: I0122 13:58:37.791860 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-65d5b677d7-mdls4"] Jan 22 13:58:37 crc kubenswrapper[4743]: I0122 13:58:37.793110 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-65d5b677d7-mdls4" Jan 22 13:58:37 crc kubenswrapper[4743]: I0122 13:58:37.795326 4743 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Jan 22 13:58:37 crc kubenswrapper[4743]: I0122 13:58:37.795503 4743 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Jan 22 13:58:37 crc kubenswrapper[4743]: I0122 13:58:37.797666 4743 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-rsvn2" Jan 22 13:58:37 crc kubenswrapper[4743]: I0122 13:58:37.821296 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-65d5b677d7-mdls4"] Jan 22 13:58:37 crc kubenswrapper[4743]: I0122 13:58:37.845732 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6494f4f8f8-zbvgv" Jan 22 13:58:37 crc kubenswrapper[4743]: I0122 13:58:37.945294 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmzzp\" (UniqueName: \"kubernetes.io/projected/fe771c71-01ce-4513-bfa8-2393f3f055f2-kube-api-access-wmzzp\") pod \"metallb-operator-webhook-server-65d5b677d7-mdls4\" (UID: \"fe771c71-01ce-4513-bfa8-2393f3f055f2\") " pod="metallb-system/metallb-operator-webhook-server-65d5b677d7-mdls4" Jan 22 13:58:37 crc kubenswrapper[4743]: I0122 13:58:37.945405 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fe771c71-01ce-4513-bfa8-2393f3f055f2-apiservice-cert\") pod \"metallb-operator-webhook-server-65d5b677d7-mdls4\" (UID: \"fe771c71-01ce-4513-bfa8-2393f3f055f2\") " pod="metallb-system/metallb-operator-webhook-server-65d5b677d7-mdls4" Jan 22 13:58:37 crc kubenswrapper[4743]: I0122 13:58:37.945472 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fe771c71-01ce-4513-bfa8-2393f3f055f2-webhook-cert\") pod \"metallb-operator-webhook-server-65d5b677d7-mdls4\" (UID: \"fe771c71-01ce-4513-bfa8-2393f3f055f2\") " pod="metallb-system/metallb-operator-webhook-server-65d5b677d7-mdls4" Jan 22 13:58:38 crc kubenswrapper[4743]: I0122 13:58:38.046822 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fe771c71-01ce-4513-bfa8-2393f3f055f2-apiservice-cert\") pod \"metallb-operator-webhook-server-65d5b677d7-mdls4\" (UID: \"fe771c71-01ce-4513-bfa8-2393f3f055f2\") " pod="metallb-system/metallb-operator-webhook-server-65d5b677d7-mdls4" Jan 22 13:58:38 crc kubenswrapper[4743]: I0122 13:58:38.047213 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fe771c71-01ce-4513-bfa8-2393f3f055f2-webhook-cert\") pod \"metallb-operator-webhook-server-65d5b677d7-mdls4\" (UID: \"fe771c71-01ce-4513-bfa8-2393f3f055f2\") " pod="metallb-system/metallb-operator-webhook-server-65d5b677d7-mdls4" Jan 22 13:58:38 crc kubenswrapper[4743]: I0122 13:58:38.047239 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmzzp\" (UniqueName: \"kubernetes.io/projected/fe771c71-01ce-4513-bfa8-2393f3f055f2-kube-api-access-wmzzp\") pod \"metallb-operator-webhook-server-65d5b677d7-mdls4\" (UID: \"fe771c71-01ce-4513-bfa8-2393f3f055f2\") " pod="metallb-system/metallb-operator-webhook-server-65d5b677d7-mdls4" Jan 22 13:58:38 crc kubenswrapper[4743]: I0122 13:58:38.052257 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fe771c71-01ce-4513-bfa8-2393f3f055f2-webhook-cert\") pod \"metallb-operator-webhook-server-65d5b677d7-mdls4\" (UID: \"fe771c71-01ce-4513-bfa8-2393f3f055f2\") " pod="metallb-system/metallb-operator-webhook-server-65d5b677d7-mdls4" Jan 22 13:58:38 crc kubenswrapper[4743]: I0122 13:58:38.052451 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fe771c71-01ce-4513-bfa8-2393f3f055f2-apiservice-cert\") pod \"metallb-operator-webhook-server-65d5b677d7-mdls4\" (UID: \"fe771c71-01ce-4513-bfa8-2393f3f055f2\") " 
pod="metallb-system/metallb-operator-webhook-server-65d5b677d7-mdls4" Jan 22 13:58:38 crc kubenswrapper[4743]: I0122 13:58:38.062437 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmzzp\" (UniqueName: \"kubernetes.io/projected/fe771c71-01ce-4513-bfa8-2393f3f055f2-kube-api-access-wmzzp\") pod \"metallb-operator-webhook-server-65d5b677d7-mdls4\" (UID: \"fe771c71-01ce-4513-bfa8-2393f3f055f2\") " pod="metallb-system/metallb-operator-webhook-server-65d5b677d7-mdls4" Jan 22 13:58:38 crc kubenswrapper[4743]: I0122 13:58:38.121180 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-65d5b677d7-mdls4" Jan 22 13:58:38 crc kubenswrapper[4743]: I0122 13:58:38.343020 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-65d5b677d7-mdls4"] Jan 22 13:58:38 crc kubenswrapper[4743]: W0122 13:58:38.353051 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe771c71_01ce_4513_bfa8_2393f3f055f2.slice/crio-224a74a1cb6f404ebcb885fa7cf908cb39b987f96e803c9391239028fcbe7bee WatchSource:0}: Error finding container 224a74a1cb6f404ebcb885fa7cf908cb39b987f96e803c9391239028fcbe7bee: Status 404 returned error can't find the container with id 224a74a1cb6f404ebcb885fa7cf908cb39b987f96e803c9391239028fcbe7bee Jan 22 13:58:38 crc kubenswrapper[4743]: W0122 13:58:38.366600 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddcd68957_0356_4eda_a65f_77e770aae844.slice/crio-d64e82ec33b0f2abb8584323c345d109557486577e471d35c2db393cf37755f9 WatchSource:0}: Error finding container d64e82ec33b0f2abb8584323c345d109557486577e471d35c2db393cf37755f9: Status 404 returned error can't find the container with id d64e82ec33b0f2abb8584323c345d109557486577e471d35c2db393cf37755f9 Jan 22 13:58:38 crc kubenswrapper[4743]: I0122 13:58:38.370991 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6494f4f8f8-zbvgv"] Jan 22 13:58:38 crc kubenswrapper[4743]: I0122 13:58:38.376734 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6494f4f8f8-zbvgv" event={"ID":"dcd68957-0356-4eda-a65f-77e770aae844","Type":"ContainerStarted","Data":"d64e82ec33b0f2abb8584323c345d109557486577e471d35c2db393cf37755f9"} Jan 22 13:58:38 crc kubenswrapper[4743]: I0122 13:58:38.378243 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-65d5b677d7-mdls4" event={"ID":"fe771c71-01ce-4513-bfa8-2393f3f055f2","Type":"ContainerStarted","Data":"224a74a1cb6f404ebcb885fa7cf908cb39b987f96e803c9391239028fcbe7bee"} Jan 22 13:58:43 crc kubenswrapper[4743]: I0122 13:58:43.415746 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-65d5b677d7-mdls4" event={"ID":"fe771c71-01ce-4513-bfa8-2393f3f055f2","Type":"ContainerStarted","Data":"556c42a8950b44ef97607854322eaa213375e1c5120a0f8e781854b26f56fcbd"} Jan 22 13:58:43 crc kubenswrapper[4743]: I0122 13:58:43.418656 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6494f4f8f8-zbvgv" event={"ID":"dcd68957-0356-4eda-a65f-77e770aae844","Type":"ContainerStarted","Data":"5ebd8deed31da45562f797a3d65e0bb7c8fc83e1a34404f2c49fd3ee8fbb3a2d"} Jan 
22 13:58:43 crc kubenswrapper[4743]: I0122 13:58:43.418825 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-6494f4f8f8-zbvgv" Jan 22 13:58:43 crc kubenswrapper[4743]: I0122 13:58:43.436693 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-65d5b677d7-mdls4" podStartSLOduration=1.90088587 podStartE2EDuration="6.436669159s" podCreationTimestamp="2026-01-22 13:58:37 +0000 UTC" firstStartedPulling="2026-01-22 13:58:38.354922519 +0000 UTC m=+754.909965682" lastFinishedPulling="2026-01-22 13:58:42.890705808 +0000 UTC m=+759.445748971" observedRunningTime="2026-01-22 13:58:43.43482851 +0000 UTC m=+759.989871683" watchObservedRunningTime="2026-01-22 13:58:43.436669159 +0000 UTC m=+759.991712342" Jan 22 13:58:43 crc kubenswrapper[4743]: I0122 13:58:43.460817 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-6494f4f8f8-zbvgv" podStartSLOduration=1.960214638 podStartE2EDuration="6.46076489s" podCreationTimestamp="2026-01-22 13:58:37 +0000 UTC" firstStartedPulling="2026-01-22 13:58:38.370620467 +0000 UTC m=+754.925663630" lastFinishedPulling="2026-01-22 13:58:42.871170719 +0000 UTC m=+759.426213882" observedRunningTime="2026-01-22 13:58:43.45735843 +0000 UTC m=+760.012401593" watchObservedRunningTime="2026-01-22 13:58:43.46076489 +0000 UTC m=+760.015808053" Jan 22 13:58:44 crc kubenswrapper[4743]: I0122 13:58:44.424558 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-65d5b677d7-mdls4" Jan 22 13:58:58 crc kubenswrapper[4743]: I0122 13:58:58.132486 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-65d5b677d7-mdls4" Jan 22 13:59:17 crc kubenswrapper[4743]: I0122 13:59:17.848650 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-6494f4f8f8-zbvgv" Jan 22 13:59:18 crc kubenswrapper[4743]: I0122 13:59:18.765996 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-st65g"] Jan 22 13:59:18 crc kubenswrapper[4743]: I0122 13:59:18.769157 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-st65g" Jan 22 13:59:18 crc kubenswrapper[4743]: I0122 13:59:18.781274 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-hmp5v"] Jan 22 13:59:18 crc kubenswrapper[4743]: I0122 13:59:18.782450 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Jan 22 13:59:18 crc kubenswrapper[4743]: I0122 13:59:18.783079 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-hmp5v" Jan 22 13:59:18 crc kubenswrapper[4743]: I0122 13:59:18.794762 4743 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Jan 22 13:59:18 crc kubenswrapper[4743]: I0122 13:59:18.795076 4743 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-75n6l" Jan 22 13:59:18 crc kubenswrapper[4743]: I0122 13:59:18.798931 4743 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Jan 22 13:59:18 crc kubenswrapper[4743]: I0122 13:59:18.809928 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-hmp5v"] Jan 22 13:59:18 crc kubenswrapper[4743]: I0122 13:59:18.878355 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-tl9sw"] Jan 22 13:59:18 crc kubenswrapper[4743]: I0122 13:59:18.879359 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-tl9sw" Jan 22 13:59:18 crc kubenswrapper[4743]: I0122 13:59:18.883599 4743 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Jan 22 13:59:18 crc kubenswrapper[4743]: I0122 13:59:18.883817 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Jan 22 13:59:18 crc kubenswrapper[4743]: I0122 13:59:18.883874 4743 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Jan 22 13:59:18 crc kubenswrapper[4743]: I0122 13:59:18.884087 4743 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-wsbwk" Jan 22 13:59:18 crc kubenswrapper[4743]: I0122 13:59:18.906636 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b-metrics-certs\") pod \"frr-k8s-st65g\" (UID: \"2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b\") " pod="metallb-system/frr-k8s-st65g" Jan 22 13:59:18 crc kubenswrapper[4743]: I0122 13:59:18.906707 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fa2773b4-4a56-40e4-a2a9-6188bb40964f-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-hmp5v\" (UID: \"fa2773b4-4a56-40e4-a2a9-6188bb40964f\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-hmp5v" Jan 22 13:59:18 crc kubenswrapper[4743]: I0122 13:59:18.906734 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b-frr-conf\") pod \"frr-k8s-st65g\" (UID: \"2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b\") " pod="metallb-system/frr-k8s-st65g" Jan 22 13:59:18 crc kubenswrapper[4743]: I0122 13:59:18.906759 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b-metrics\") pod \"frr-k8s-st65g\" (UID: \"2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b\") " pod="metallb-system/frr-k8s-st65g" Jan 22 13:59:18 crc kubenswrapper[4743]: I0122 13:59:18.906808 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbc5n\" (UniqueName: 
\"kubernetes.io/projected/fa2773b4-4a56-40e4-a2a9-6188bb40964f-kube-api-access-nbc5n\") pod \"frr-k8s-webhook-server-7df86c4f6c-hmp5v\" (UID: \"fa2773b4-4a56-40e4-a2a9-6188bb40964f\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-hmp5v" Jan 22 13:59:18 crc kubenswrapper[4743]: I0122 13:59:18.906837 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f17b4fff-f244-477f-912d-c2e93321094e-memberlist\") pod \"speaker-tl9sw\" (UID: \"f17b4fff-f244-477f-912d-c2e93321094e\") " pod="metallb-system/speaker-tl9sw" Jan 22 13:59:18 crc kubenswrapper[4743]: I0122 13:59:18.906852 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5bz6\" (UniqueName: \"kubernetes.io/projected/f17b4fff-f244-477f-912d-c2e93321094e-kube-api-access-f5bz6\") pod \"speaker-tl9sw\" (UID: \"f17b4fff-f244-477f-912d-c2e93321094e\") " pod="metallb-system/speaker-tl9sw" Jan 22 13:59:18 crc kubenswrapper[4743]: I0122 13:59:18.906870 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/f17b4fff-f244-477f-912d-c2e93321094e-metallb-excludel2\") pod \"speaker-tl9sw\" (UID: \"f17b4fff-f244-477f-912d-c2e93321094e\") " pod="metallb-system/speaker-tl9sw" Jan 22 13:59:18 crc kubenswrapper[4743]: I0122 13:59:18.906914 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b-reloader\") pod \"frr-k8s-st65g\" (UID: \"2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b\") " pod="metallb-system/frr-k8s-st65g" Jan 22 13:59:18 crc kubenswrapper[4743]: I0122 13:59:18.906929 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b-frr-sockets\") pod \"frr-k8s-st65g\" (UID: \"2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b\") " pod="metallb-system/frr-k8s-st65g" Jan 22 13:59:18 crc kubenswrapper[4743]: I0122 13:59:18.906955 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcfct\" (UniqueName: \"kubernetes.io/projected/2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b-kube-api-access-pcfct\") pod \"frr-k8s-st65g\" (UID: \"2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b\") " pod="metallb-system/frr-k8s-st65g" Jan 22 13:59:18 crc kubenswrapper[4743]: I0122 13:59:18.906973 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b-frr-startup\") pod \"frr-k8s-st65g\" (UID: \"2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b\") " pod="metallb-system/frr-k8s-st65g" Jan 22 13:59:18 crc kubenswrapper[4743]: I0122 13:59:18.907016 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f17b4fff-f244-477f-912d-c2e93321094e-metrics-certs\") pod \"speaker-tl9sw\" (UID: \"f17b4fff-f244-477f-912d-c2e93321094e\") " pod="metallb-system/speaker-tl9sw" Jan 22 13:59:18 crc kubenswrapper[4743]: I0122 13:59:18.911781 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-6968d8fdc4-4kwcx"] Jan 22 13:59:18 crc kubenswrapper[4743]: I0122 13:59:18.912852 4743 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-6968d8fdc4-4kwcx" Jan 22 13:59:18 crc kubenswrapper[4743]: I0122 13:59:18.914848 4743 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Jan 22 13:59:18 crc kubenswrapper[4743]: I0122 13:59:18.919143 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-4kwcx"] Jan 22 13:59:19 crc kubenswrapper[4743]: I0122 13:59:19.007963 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b-frr-startup\") pod \"frr-k8s-st65g\" (UID: \"2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b\") " pod="metallb-system/frr-k8s-st65g" Jan 22 13:59:19 crc kubenswrapper[4743]: I0122 13:59:19.008003 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f17b4fff-f244-477f-912d-c2e93321094e-metrics-certs\") pod \"speaker-tl9sw\" (UID: \"f17b4fff-f244-477f-912d-c2e93321094e\") " pod="metallb-system/speaker-tl9sw" Jan 22 13:59:19 crc kubenswrapper[4743]: I0122 13:59:19.008027 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b-metrics-certs\") pod \"frr-k8s-st65g\" (UID: \"2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b\") " pod="metallb-system/frr-k8s-st65g" Jan 22 13:59:19 crc kubenswrapper[4743]: I0122 13:59:19.008054 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkfrc\" (UniqueName: \"kubernetes.io/projected/9a99dab2-57e0-4830-8dc7-1bf40627f408-kube-api-access-xkfrc\") pod \"controller-6968d8fdc4-4kwcx\" (UID: \"9a99dab2-57e0-4830-8dc7-1bf40627f408\") " pod="metallb-system/controller-6968d8fdc4-4kwcx" Jan 22 13:59:19 crc kubenswrapper[4743]: I0122 13:59:19.008078 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fa2773b4-4a56-40e4-a2a9-6188bb40964f-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-hmp5v\" (UID: \"fa2773b4-4a56-40e4-a2a9-6188bb40964f\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-hmp5v" Jan 22 13:59:19 crc kubenswrapper[4743]: I0122 13:59:19.008102 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b-frr-conf\") pod \"frr-k8s-st65g\" (UID: \"2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b\") " pod="metallb-system/frr-k8s-st65g" Jan 22 13:59:19 crc kubenswrapper[4743]: I0122 13:59:19.008117 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9a99dab2-57e0-4830-8dc7-1bf40627f408-metrics-certs\") pod \"controller-6968d8fdc4-4kwcx\" (UID: \"9a99dab2-57e0-4830-8dc7-1bf40627f408\") " pod="metallb-system/controller-6968d8fdc4-4kwcx" Jan 22 13:59:19 crc kubenswrapper[4743]: I0122 13:59:19.008133 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b-metrics\") pod \"frr-k8s-st65g\" (UID: \"2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b\") " pod="metallb-system/frr-k8s-st65g" Jan 22 13:59:19 crc kubenswrapper[4743]: I0122 13:59:19.008159 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-nbc5n\" (UniqueName: \"kubernetes.io/projected/fa2773b4-4a56-40e4-a2a9-6188bb40964f-kube-api-access-nbc5n\") pod \"frr-k8s-webhook-server-7df86c4f6c-hmp5v\" (UID: \"fa2773b4-4a56-40e4-a2a9-6188bb40964f\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-hmp5v" Jan 22 13:59:19 crc kubenswrapper[4743]: I0122 13:59:19.008175 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f17b4fff-f244-477f-912d-c2e93321094e-memberlist\") pod \"speaker-tl9sw\" (UID: \"f17b4fff-f244-477f-912d-c2e93321094e\") " pod="metallb-system/speaker-tl9sw" Jan 22 13:59:19 crc kubenswrapper[4743]: I0122 13:59:19.008190 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5bz6\" (UniqueName: \"kubernetes.io/projected/f17b4fff-f244-477f-912d-c2e93321094e-kube-api-access-f5bz6\") pod \"speaker-tl9sw\" (UID: \"f17b4fff-f244-477f-912d-c2e93321094e\") " pod="metallb-system/speaker-tl9sw" Jan 22 13:59:19 crc kubenswrapper[4743]: I0122 13:59:19.008208 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/f17b4fff-f244-477f-912d-c2e93321094e-metallb-excludel2\") pod \"speaker-tl9sw\" (UID: \"f17b4fff-f244-477f-912d-c2e93321094e\") " pod="metallb-system/speaker-tl9sw" Jan 22 13:59:19 crc kubenswrapper[4743]: I0122 13:59:19.008235 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9a99dab2-57e0-4830-8dc7-1bf40627f408-cert\") pod \"controller-6968d8fdc4-4kwcx\" (UID: \"9a99dab2-57e0-4830-8dc7-1bf40627f408\") " pod="metallb-system/controller-6968d8fdc4-4kwcx" Jan 22 13:59:19 crc kubenswrapper[4743]: I0122 13:59:19.008250 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b-reloader\") pod \"frr-k8s-st65g\" (UID: \"2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b\") " pod="metallb-system/frr-k8s-st65g" Jan 22 13:59:19 crc kubenswrapper[4743]: I0122 13:59:19.008264 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b-frr-sockets\") pod \"frr-k8s-st65g\" (UID: \"2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b\") " pod="metallb-system/frr-k8s-st65g" Jan 22 13:59:19 crc kubenswrapper[4743]: I0122 13:59:19.008279 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcfct\" (UniqueName: \"kubernetes.io/projected/2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b-kube-api-access-pcfct\") pod \"frr-k8s-st65g\" (UID: \"2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b\") " pod="metallb-system/frr-k8s-st65g" Jan 22 13:59:19 crc kubenswrapper[4743]: E0122 13:59:19.008297 4743 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Jan 22 13:59:19 crc kubenswrapper[4743]: E0122 13:59:19.008398 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f17b4fff-f244-477f-912d-c2e93321094e-metrics-certs podName:f17b4fff-f244-477f-912d-c2e93321094e nodeName:}" failed. No retries permitted until 2026-01-22 13:59:19.508362293 +0000 UTC m=+796.063405496 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f17b4fff-f244-477f-912d-c2e93321094e-metrics-certs") pod "speaker-tl9sw" (UID: "f17b4fff-f244-477f-912d-c2e93321094e") : secret "speaker-certs-secret" not found Jan 22 13:59:19 crc kubenswrapper[4743]: E0122 13:59:19.009473 4743 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Jan 22 13:59:19 crc kubenswrapper[4743]: E0122 13:59:19.009472 4743 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Jan 22 13:59:19 crc kubenswrapper[4743]: E0122 13:59:19.009510 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa2773b4-4a56-40e4-a2a9-6188bb40964f-cert podName:fa2773b4-4a56-40e4-a2a9-6188bb40964f nodeName:}" failed. No retries permitted until 2026-01-22 13:59:19.509499633 +0000 UTC m=+796.064542796 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fa2773b4-4a56-40e4-a2a9-6188bb40964f-cert") pod "frr-k8s-webhook-server-7df86c4f6c-hmp5v" (UID: "fa2773b4-4a56-40e4-a2a9-6188bb40964f") : secret "frr-k8s-webhook-server-cert" not found Jan 22 13:59:19 crc kubenswrapper[4743]: E0122 13:59:19.009586 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f17b4fff-f244-477f-912d-c2e93321094e-memberlist podName:f17b4fff-f244-477f-912d-c2e93321094e nodeName:}" failed. No retries permitted until 2026-01-22 13:59:19.509567545 +0000 UTC m=+796.064610708 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/f17b4fff-f244-477f-912d-c2e93321094e-memberlist") pod "speaker-tl9sw" (UID: "f17b4fff-f244-477f-912d-c2e93321094e") : secret "metallb-memberlist" not found Jan 22 13:59:19 crc kubenswrapper[4743]: I0122 13:59:19.009761 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b-reloader\") pod \"frr-k8s-st65g\" (UID: \"2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b\") " pod="metallb-system/frr-k8s-st65g" Jan 22 13:59:19 crc kubenswrapper[4743]: I0122 13:59:19.009899 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b-frr-conf\") pod \"frr-k8s-st65g\" (UID: \"2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b\") " pod="metallb-system/frr-k8s-st65g" Jan 22 13:59:19 crc kubenswrapper[4743]: I0122 13:59:19.009943 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b-metrics\") pod \"frr-k8s-st65g\" (UID: \"2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b\") " pod="metallb-system/frr-k8s-st65g" Jan 22 13:59:19 crc kubenswrapper[4743]: I0122 13:59:19.009942 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b-frr-sockets\") pod \"frr-k8s-st65g\" (UID: \"2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b\") " pod="metallb-system/frr-k8s-st65g" Jan 22 13:59:19 crc kubenswrapper[4743]: I0122 13:59:19.010125 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/f17b4fff-f244-477f-912d-c2e93321094e-metallb-excludel2\") pod \"speaker-tl9sw\" (UID: 
\"f17b4fff-f244-477f-912d-c2e93321094e\") " pod="metallb-system/speaker-tl9sw" Jan 22 13:59:19 crc kubenswrapper[4743]: I0122 13:59:19.011014 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b-frr-startup\") pod \"frr-k8s-st65g\" (UID: \"2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b\") " pod="metallb-system/frr-k8s-st65g" Jan 22 13:59:19 crc kubenswrapper[4743]: I0122 13:59:19.022549 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b-metrics-certs\") pod \"frr-k8s-st65g\" (UID: \"2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b\") " pod="metallb-system/frr-k8s-st65g" Jan 22 13:59:19 crc kubenswrapper[4743]: I0122 13:59:19.026848 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcfct\" (UniqueName: \"kubernetes.io/projected/2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b-kube-api-access-pcfct\") pod \"frr-k8s-st65g\" (UID: \"2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b\") " pod="metallb-system/frr-k8s-st65g" Jan 22 13:59:19 crc kubenswrapper[4743]: I0122 13:59:19.051133 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbc5n\" (UniqueName: \"kubernetes.io/projected/fa2773b4-4a56-40e4-a2a9-6188bb40964f-kube-api-access-nbc5n\") pod \"frr-k8s-webhook-server-7df86c4f6c-hmp5v\" (UID: \"fa2773b4-4a56-40e4-a2a9-6188bb40964f\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-hmp5v" Jan 22 13:59:19 crc kubenswrapper[4743]: I0122 13:59:19.061210 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5bz6\" (UniqueName: \"kubernetes.io/projected/f17b4fff-f244-477f-912d-c2e93321094e-kube-api-access-f5bz6\") pod \"speaker-tl9sw\" (UID: \"f17b4fff-f244-477f-912d-c2e93321094e\") " pod="metallb-system/speaker-tl9sw" Jan 22 13:59:19 crc kubenswrapper[4743]: I0122 13:59:19.089114 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-st65g" Jan 22 13:59:19 crc kubenswrapper[4743]: I0122 13:59:19.109107 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9a99dab2-57e0-4830-8dc7-1bf40627f408-cert\") pod \"controller-6968d8fdc4-4kwcx\" (UID: \"9a99dab2-57e0-4830-8dc7-1bf40627f408\") " pod="metallb-system/controller-6968d8fdc4-4kwcx" Jan 22 13:59:19 crc kubenswrapper[4743]: I0122 13:59:19.109216 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkfrc\" (UniqueName: \"kubernetes.io/projected/9a99dab2-57e0-4830-8dc7-1bf40627f408-kube-api-access-xkfrc\") pod \"controller-6968d8fdc4-4kwcx\" (UID: \"9a99dab2-57e0-4830-8dc7-1bf40627f408\") " pod="metallb-system/controller-6968d8fdc4-4kwcx" Jan 22 13:59:19 crc kubenswrapper[4743]: I0122 13:59:19.109271 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9a99dab2-57e0-4830-8dc7-1bf40627f408-metrics-certs\") pod \"controller-6968d8fdc4-4kwcx\" (UID: \"9a99dab2-57e0-4830-8dc7-1bf40627f408\") " pod="metallb-system/controller-6968d8fdc4-4kwcx" Jan 22 13:59:19 crc kubenswrapper[4743]: I0122 13:59:19.111403 4743 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Jan 22 13:59:19 crc kubenswrapper[4743]: I0122 13:59:19.113440 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9a99dab2-57e0-4830-8dc7-1bf40627f408-metrics-certs\") pod \"controller-6968d8fdc4-4kwcx\" (UID: \"9a99dab2-57e0-4830-8dc7-1bf40627f408\") " pod="metallb-system/controller-6968d8fdc4-4kwcx" Jan 22 13:59:19 crc kubenswrapper[4743]: I0122 13:59:19.122514 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9a99dab2-57e0-4830-8dc7-1bf40627f408-cert\") pod \"controller-6968d8fdc4-4kwcx\" (UID: \"9a99dab2-57e0-4830-8dc7-1bf40627f408\") " pod="metallb-system/controller-6968d8fdc4-4kwcx" Jan 22 13:59:19 crc kubenswrapper[4743]: I0122 13:59:19.128301 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkfrc\" (UniqueName: \"kubernetes.io/projected/9a99dab2-57e0-4830-8dc7-1bf40627f408-kube-api-access-xkfrc\") pod \"controller-6968d8fdc4-4kwcx\" (UID: \"9a99dab2-57e0-4830-8dc7-1bf40627f408\") " pod="metallb-system/controller-6968d8fdc4-4kwcx" Jan 22 13:59:19 crc kubenswrapper[4743]: I0122 13:59:19.235735 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-6968d8fdc4-4kwcx" Jan 22 13:59:19 crc kubenswrapper[4743]: I0122 13:59:19.522923 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fa2773b4-4a56-40e4-a2a9-6188bb40964f-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-hmp5v\" (UID: \"fa2773b4-4a56-40e4-a2a9-6188bb40964f\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-hmp5v" Jan 22 13:59:19 crc kubenswrapper[4743]: I0122 13:59:19.523343 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f17b4fff-f244-477f-912d-c2e93321094e-memberlist\") pod \"speaker-tl9sw\" (UID: \"f17b4fff-f244-477f-912d-c2e93321094e\") " pod="metallb-system/speaker-tl9sw" Jan 22 13:59:19 crc kubenswrapper[4743]: I0122 13:59:19.523400 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f17b4fff-f244-477f-912d-c2e93321094e-metrics-certs\") pod \"speaker-tl9sw\" (UID: \"f17b4fff-f244-477f-912d-c2e93321094e\") " pod="metallb-system/speaker-tl9sw" Jan 22 13:59:19 crc kubenswrapper[4743]: E0122 13:59:19.524106 4743 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Jan 22 13:59:19 crc kubenswrapper[4743]: E0122 13:59:19.524159 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f17b4fff-f244-477f-912d-c2e93321094e-memberlist podName:f17b4fff-f244-477f-912d-c2e93321094e nodeName:}" failed. No retries permitted until 2026-01-22 13:59:20.524146351 +0000 UTC m=+797.079189504 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/f17b4fff-f244-477f-912d-c2e93321094e-memberlist") pod "speaker-tl9sw" (UID: "f17b4fff-f244-477f-912d-c2e93321094e") : secret "metallb-memberlist" not found Jan 22 13:59:19 crc kubenswrapper[4743]: I0122 13:59:19.529207 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fa2773b4-4a56-40e4-a2a9-6188bb40964f-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-hmp5v\" (UID: \"fa2773b4-4a56-40e4-a2a9-6188bb40964f\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-hmp5v" Jan 22 13:59:19 crc kubenswrapper[4743]: I0122 13:59:19.529664 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f17b4fff-f244-477f-912d-c2e93321094e-metrics-certs\") pod \"speaker-tl9sw\" (UID: \"f17b4fff-f244-477f-912d-c2e93321094e\") " pod="metallb-system/speaker-tl9sw" Jan 22 13:59:19 crc kubenswrapper[4743]: I0122 13:59:19.632661 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-st65g" event={"ID":"2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b","Type":"ContainerStarted","Data":"d196632b529ea430fe81d136712926a68f728215fc1bf2f433c65273671b376a"} Jan 22 13:59:19 crc kubenswrapper[4743]: I0122 13:59:19.632889 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-4kwcx"] Jan 22 13:59:19 crc kubenswrapper[4743]: W0122 13:59:19.639318 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a99dab2_57e0_4830_8dc7_1bf40627f408.slice/crio-64d0ecd865860fdedee7b1b63c9f3f8a2fbc43cb68491a5c19a0fa53324ca540 WatchSource:0}: Error finding container 
64d0ecd865860fdedee7b1b63c9f3f8a2fbc43cb68491a5c19a0fa53324ca540: Status 404 returned error can't find the container with id 64d0ecd865860fdedee7b1b63c9f3f8a2fbc43cb68491a5c19a0fa53324ca540 Jan 22 13:59:19 crc kubenswrapper[4743]: I0122 13:59:19.700276 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-hmp5v" Jan 22 13:59:19 crc kubenswrapper[4743]: I0122 13:59:19.953541 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-hmp5v"] Jan 22 13:59:19 crc kubenswrapper[4743]: W0122 13:59:19.963330 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa2773b4_4a56_40e4_a2a9_6188bb40964f.slice/crio-1c5126d5263fd0806ee69a83300e5332c490fbd5e298a876c82594977a7b985d WatchSource:0}: Error finding container 1c5126d5263fd0806ee69a83300e5332c490fbd5e298a876c82594977a7b985d: Status 404 returned error can't find the container with id 1c5126d5263fd0806ee69a83300e5332c490fbd5e298a876c82594977a7b985d Jan 22 13:59:20 crc kubenswrapper[4743]: I0122 13:59:20.534448 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f17b4fff-f244-477f-912d-c2e93321094e-memberlist\") pod \"speaker-tl9sw\" (UID: \"f17b4fff-f244-477f-912d-c2e93321094e\") " pod="metallb-system/speaker-tl9sw" Jan 22 13:59:20 crc kubenswrapper[4743]: I0122 13:59:20.539961 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f17b4fff-f244-477f-912d-c2e93321094e-memberlist\") pod \"speaker-tl9sw\" (UID: \"f17b4fff-f244-477f-912d-c2e93321094e\") " pod="metallb-system/speaker-tl9sw" Jan 22 13:59:20 crc kubenswrapper[4743]: I0122 13:59:20.641976 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-hmp5v" event={"ID":"fa2773b4-4a56-40e4-a2a9-6188bb40964f","Type":"ContainerStarted","Data":"1c5126d5263fd0806ee69a83300e5332c490fbd5e298a876c82594977a7b985d"} Jan 22 13:59:20 crc kubenswrapper[4743]: I0122 13:59:20.644502 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-4kwcx" event={"ID":"9a99dab2-57e0-4830-8dc7-1bf40627f408","Type":"ContainerStarted","Data":"fd77b471db128d74a8e0a5e73579ffcc226781fbd326b866d0cda7d52738877a"} Jan 22 13:59:20 crc kubenswrapper[4743]: I0122 13:59:20.644549 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-4kwcx" event={"ID":"9a99dab2-57e0-4830-8dc7-1bf40627f408","Type":"ContainerStarted","Data":"333ac14b30bed729af3f2a39ae0de1b1eddd32b441f87c4dc37ac7a8cbd7d9ef"} Jan 22 13:59:20 crc kubenswrapper[4743]: I0122 13:59:20.644563 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-4kwcx" event={"ID":"9a99dab2-57e0-4830-8dc7-1bf40627f408","Type":"ContainerStarted","Data":"64d0ecd865860fdedee7b1b63c9f3f8a2fbc43cb68491a5c19a0fa53324ca540"} Jan 22 13:59:20 crc kubenswrapper[4743]: I0122 13:59:20.644624 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-6968d8fdc4-4kwcx" Jan 22 13:59:20 crc kubenswrapper[4743]: I0122 13:59:20.667273 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-6968d8fdc4-4kwcx" podStartSLOduration=2.667252844 podStartE2EDuration="2.667252844s" 
podCreationTimestamp="2026-01-22 13:59:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:59:20.662538358 +0000 UTC m=+797.217581521" watchObservedRunningTime="2026-01-22 13:59:20.667252844 +0000 UTC m=+797.222296007" Jan 22 13:59:20 crc kubenswrapper[4743]: I0122 13:59:20.697489 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-tl9sw" Jan 22 13:59:20 crc kubenswrapper[4743]: W0122 13:59:20.741100 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf17b4fff_f244_477f_912d_c2e93321094e.slice/crio-b3b69f14e69ce5f5a91361c8ad5615a3ff655d3f637a6698867c7eb318841ff7 WatchSource:0}: Error finding container b3b69f14e69ce5f5a91361c8ad5615a3ff655d3f637a6698867c7eb318841ff7: Status 404 returned error can't find the container with id b3b69f14e69ce5f5a91361c8ad5615a3ff655d3f637a6698867c7eb318841ff7 Jan 22 13:59:21 crc kubenswrapper[4743]: I0122 13:59:21.656097 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-tl9sw" event={"ID":"f17b4fff-f244-477f-912d-c2e93321094e","Type":"ContainerStarted","Data":"e1d7118c8ea0ab725c944ecb9a1305700c7459618432513009caaf05b63d1fe3"} Jan 22 13:59:21 crc kubenswrapper[4743]: I0122 13:59:21.656483 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-tl9sw" event={"ID":"f17b4fff-f244-477f-912d-c2e93321094e","Type":"ContainerStarted","Data":"f3eb7552d1b68f637034339f002f009adba1ab126ef721ac55408d841e987eca"} Jan 22 13:59:21 crc kubenswrapper[4743]: I0122 13:59:21.656500 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-tl9sw" event={"ID":"f17b4fff-f244-477f-912d-c2e93321094e","Type":"ContainerStarted","Data":"b3b69f14e69ce5f5a91361c8ad5615a3ff655d3f637a6698867c7eb318841ff7"} Jan 22 13:59:21 crc kubenswrapper[4743]: I0122 13:59:21.656758 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-tl9sw" Jan 22 13:59:21 crc kubenswrapper[4743]: I0122 13:59:21.676682 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-tl9sw" podStartSLOduration=3.676661191 podStartE2EDuration="3.676661191s" podCreationTimestamp="2026-01-22 13:59:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 13:59:21.674899745 +0000 UTC m=+798.229942918" watchObservedRunningTime="2026-01-22 13:59:21.676661191 +0000 UTC m=+798.231704354" Jan 22 13:59:27 crc kubenswrapper[4743]: I0122 13:59:27.701845 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-hmp5v" event={"ID":"fa2773b4-4a56-40e4-a2a9-6188bb40964f","Type":"ContainerStarted","Data":"975742bd14606592bbd39e3d337d76ec97d0727082494cc1397f572983fdb918"} Jan 22 13:59:27 crc kubenswrapper[4743]: I0122 13:59:27.702715 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-hmp5v" Jan 22 13:59:27 crc kubenswrapper[4743]: I0122 13:59:27.703952 4743 generic.go:334] "Generic (PLEG): container finished" podID="2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b" containerID="9b24ad2c54cc70bae6ad9013ffdea0b3c86eae4c26cb5754140047482434d3f2" exitCode=0 Jan 22 13:59:27 crc kubenswrapper[4743]: I0122 13:59:27.703983 4743 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="metallb-system/frr-k8s-st65g" event={"ID":"2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b","Type":"ContainerDied","Data":"9b24ad2c54cc70bae6ad9013ffdea0b3c86eae4c26cb5754140047482434d3f2"} Jan 22 13:59:27 crc kubenswrapper[4743]: I0122 13:59:27.726119 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-hmp5v" podStartSLOduration=2.728209015 podStartE2EDuration="9.726084638s" podCreationTimestamp="2026-01-22 13:59:18 +0000 UTC" firstStartedPulling="2026-01-22 13:59:19.966488426 +0000 UTC m=+796.521531599" lastFinishedPulling="2026-01-22 13:59:26.964364059 +0000 UTC m=+803.519407222" observedRunningTime="2026-01-22 13:59:27.72127206 +0000 UTC m=+804.276315263" watchObservedRunningTime="2026-01-22 13:59:27.726084638 +0000 UTC m=+804.281127841" Jan 22 13:59:28 crc kubenswrapper[4743]: I0122 13:59:28.714456 4743 generic.go:334] "Generic (PLEG): container finished" podID="2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b" containerID="c3f375a1b4731678672e17b042e0de2d0f7f0abbd6a0fad400cadac9014bda7f" exitCode=0 Jan 22 13:59:28 crc kubenswrapper[4743]: I0122 13:59:28.714618 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-st65g" event={"ID":"2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b","Type":"ContainerDied","Data":"c3f375a1b4731678672e17b042e0de2d0f7f0abbd6a0fad400cadac9014bda7f"} Jan 22 13:59:29 crc kubenswrapper[4743]: I0122 13:59:29.240028 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-6968d8fdc4-4kwcx" Jan 22 13:59:29 crc kubenswrapper[4743]: I0122 13:59:29.722149 4743 generic.go:334] "Generic (PLEG): container finished" podID="2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b" containerID="9c494f10e61fadeda338ccd3e43e4b105808065df6201d0ea17c15b57cb8d596" exitCode=0 Jan 22 13:59:29 crc kubenswrapper[4743]: I0122 13:59:29.723264 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-st65g" event={"ID":"2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b","Type":"ContainerDied","Data":"9c494f10e61fadeda338ccd3e43e4b105808065df6201d0ea17c15b57cb8d596"} Jan 22 13:59:30 crc kubenswrapper[4743]: I0122 13:59:30.741350 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-st65g" event={"ID":"2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b","Type":"ContainerStarted","Data":"4326789c155846fb8c82df402b75ae54ad68bff5667214b56873bfc99c7dad6c"} Jan 22 13:59:30 crc kubenswrapper[4743]: I0122 13:59:30.741829 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-st65g" event={"ID":"2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b","Type":"ContainerStarted","Data":"f3baee2c18a1172b507d12e684b9a11243e2eb84f4e33f7b67f272876344123a"} Jan 22 13:59:30 crc kubenswrapper[4743]: I0122 13:59:30.741846 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-st65g" event={"ID":"2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b","Type":"ContainerStarted","Data":"af4123ca0ff207598f6977763eb3ee4397e57c51c345e4782e7f276890f44c57"} Jan 22 13:59:30 crc kubenswrapper[4743]: I0122 13:59:30.741859 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-st65g" event={"ID":"2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b","Type":"ContainerStarted","Data":"432cadae8e3f78e3cd8083f1d8f6430e48427d327bf84fc1b27e3d0c69ff1429"} Jan 22 13:59:30 crc kubenswrapper[4743]: I0122 13:59:30.741895 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-st65g" 
event={"ID":"2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b","Type":"ContainerStarted","Data":"ed0eb131fe3ce0a1f47fc30c07b79167b002b6995556f5d61e5559cc7580dadf"} Jan 22 13:59:31 crc kubenswrapper[4743]: I0122 13:59:31.754113 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-st65g" event={"ID":"2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b","Type":"ContainerStarted","Data":"ed4fd101bbb87845ac22771c7a3e48073a7db5a294edd07f9383e6319212fac5"} Jan 22 13:59:31 crc kubenswrapper[4743]: I0122 13:59:31.754156 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-st65g" Jan 22 13:59:31 crc kubenswrapper[4743]: I0122 13:59:31.778036 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-st65g" podStartSLOduration=6.055880731 podStartE2EDuration="13.778020938s" podCreationTimestamp="2026-01-22 13:59:18 +0000 UTC" firstStartedPulling="2026-01-22 13:59:19.256874962 +0000 UTC m=+795.811918125" lastFinishedPulling="2026-01-22 13:59:26.979015169 +0000 UTC m=+803.534058332" observedRunningTime="2026-01-22 13:59:31.773402625 +0000 UTC m=+808.328445798" watchObservedRunningTime="2026-01-22 13:59:31.778020938 +0000 UTC m=+808.333064101" Jan 22 13:59:34 crc kubenswrapper[4743]: I0122 13:59:34.089601 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-st65g" Jan 22 13:59:34 crc kubenswrapper[4743]: I0122 13:59:34.126877 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-st65g" Jan 22 13:59:39 crc kubenswrapper[4743]: I0122 13:59:39.704964 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-hmp5v" Jan 22 13:59:40 crc kubenswrapper[4743]: I0122 13:59:40.706612 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-tl9sw" Jan 22 13:59:47 crc kubenswrapper[4743]: I0122 13:59:47.278417 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-dsm72"] Jan 22 13:59:47 crc kubenswrapper[4743]: I0122 13:59:47.280072 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-dsm72" Jan 22 13:59:47 crc kubenswrapper[4743]: I0122 13:59:47.284590 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Jan 22 13:59:47 crc kubenswrapper[4743]: I0122 13:59:47.284712 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-57g7q" Jan 22 13:59:47 crc kubenswrapper[4743]: I0122 13:59:47.284867 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Jan 22 13:59:47 crc kubenswrapper[4743]: I0122 13:59:47.286633 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-dsm72"] Jan 22 13:59:47 crc kubenswrapper[4743]: I0122 13:59:47.355835 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4wm5\" (UniqueName: \"kubernetes.io/projected/2ac18d53-2c89-4de9-8665-29d227f67a09-kube-api-access-k4wm5\") pod \"openstack-operator-index-dsm72\" (UID: \"2ac18d53-2c89-4de9-8665-29d227f67a09\") " pod="openstack-operators/openstack-operator-index-dsm72" Jan 22 13:59:47 crc kubenswrapper[4743]: I0122 13:59:47.457017 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4wm5\" (UniqueName: \"kubernetes.io/projected/2ac18d53-2c89-4de9-8665-29d227f67a09-kube-api-access-k4wm5\") pod \"openstack-operator-index-dsm72\" (UID: \"2ac18d53-2c89-4de9-8665-29d227f67a09\") " pod="openstack-operators/openstack-operator-index-dsm72" Jan 22 13:59:47 crc kubenswrapper[4743]: I0122 13:59:47.475432 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4wm5\" (UniqueName: \"kubernetes.io/projected/2ac18d53-2c89-4de9-8665-29d227f67a09-kube-api-access-k4wm5\") pod \"openstack-operator-index-dsm72\" (UID: \"2ac18d53-2c89-4de9-8665-29d227f67a09\") " pod="openstack-operators/openstack-operator-index-dsm72" Jan 22 13:59:47 crc kubenswrapper[4743]: I0122 13:59:47.603513 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-dsm72" Jan 22 13:59:48 crc kubenswrapper[4743]: I0122 13:59:48.068700 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-dsm72"] Jan 22 13:59:48 crc kubenswrapper[4743]: I0122 13:59:48.892116 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-dsm72" event={"ID":"2ac18d53-2c89-4de9-8665-29d227f67a09","Type":"ContainerStarted","Data":"35a0d3672f25f8ad5801eab1d5e33dfaf533fdd9e599f777c38010a386fbf209"} Jan 22 13:59:49 crc kubenswrapper[4743]: I0122 13:59:49.099328 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-st65g" Jan 22 13:59:50 crc kubenswrapper[4743]: I0122 13:59:50.908353 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-dsm72" event={"ID":"2ac18d53-2c89-4de9-8665-29d227f67a09","Type":"ContainerStarted","Data":"bcf5797d98da9373ecc5fa82bb0d4d7a44372ddbf2b3f75a6f46d47f8efe14f0"} Jan 22 13:59:50 crc kubenswrapper[4743]: I0122 13:59:50.924900 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-dsm72" podStartSLOduration=1.913628316 podStartE2EDuration="3.924879009s" podCreationTimestamp="2026-01-22 13:59:47 +0000 UTC" firstStartedPulling="2026-01-22 13:59:48.075816463 +0000 UTC m=+824.630859626" lastFinishedPulling="2026-01-22 13:59:50.087067116 +0000 UTC m=+826.642110319" observedRunningTime="2026-01-22 13:59:50.921377756 +0000 UTC m=+827.476420919" watchObservedRunningTime="2026-01-22 13:59:50.924879009 +0000 UTC m=+827.479922172" Jan 22 13:59:53 crc kubenswrapper[4743]: I0122 13:59:53.680860 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fxfpr"] Jan 22 13:59:53 crc kubenswrapper[4743]: I0122 13:59:53.683972 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fxfpr" Jan 22 13:59:53 crc kubenswrapper[4743]: I0122 13:59:53.695271 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fxfpr"] Jan 22 13:59:53 crc kubenswrapper[4743]: I0122 13:59:53.848322 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fb9nh\" (UniqueName: \"kubernetes.io/projected/eb2437f4-6442-4fab-9b01-b69ad68324bd-kube-api-access-fb9nh\") pod \"certified-operators-fxfpr\" (UID: \"eb2437f4-6442-4fab-9b01-b69ad68324bd\") " pod="openshift-marketplace/certified-operators-fxfpr" Jan 22 13:59:53 crc kubenswrapper[4743]: I0122 13:59:53.848618 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb2437f4-6442-4fab-9b01-b69ad68324bd-utilities\") pod \"certified-operators-fxfpr\" (UID: \"eb2437f4-6442-4fab-9b01-b69ad68324bd\") " pod="openshift-marketplace/certified-operators-fxfpr" Jan 22 13:59:53 crc kubenswrapper[4743]: I0122 13:59:53.848834 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb2437f4-6442-4fab-9b01-b69ad68324bd-catalog-content\") pod \"certified-operators-fxfpr\" (UID: \"eb2437f4-6442-4fab-9b01-b69ad68324bd\") " pod="openshift-marketplace/certified-operators-fxfpr" Jan 22 13:59:53 crc kubenswrapper[4743]: I0122 13:59:53.950628 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb2437f4-6442-4fab-9b01-b69ad68324bd-utilities\") pod \"certified-operators-fxfpr\" (UID: \"eb2437f4-6442-4fab-9b01-b69ad68324bd\") " pod="openshift-marketplace/certified-operators-fxfpr" Jan 22 13:59:53 crc kubenswrapper[4743]: I0122 13:59:53.950967 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb2437f4-6442-4fab-9b01-b69ad68324bd-catalog-content\") pod \"certified-operators-fxfpr\" (UID: \"eb2437f4-6442-4fab-9b01-b69ad68324bd\") " pod="openshift-marketplace/certified-operators-fxfpr" Jan 22 13:59:53 crc kubenswrapper[4743]: I0122 13:59:53.951019 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fb9nh\" (UniqueName: \"kubernetes.io/projected/eb2437f4-6442-4fab-9b01-b69ad68324bd-kube-api-access-fb9nh\") pod \"certified-operators-fxfpr\" (UID: \"eb2437f4-6442-4fab-9b01-b69ad68324bd\") " pod="openshift-marketplace/certified-operators-fxfpr" Jan 22 13:59:53 crc kubenswrapper[4743]: I0122 13:59:53.951235 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb2437f4-6442-4fab-9b01-b69ad68324bd-utilities\") pod \"certified-operators-fxfpr\" (UID: \"eb2437f4-6442-4fab-9b01-b69ad68324bd\") " pod="openshift-marketplace/certified-operators-fxfpr" Jan 22 13:59:53 crc kubenswrapper[4743]: I0122 13:59:53.951738 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb2437f4-6442-4fab-9b01-b69ad68324bd-catalog-content\") pod \"certified-operators-fxfpr\" (UID: \"eb2437f4-6442-4fab-9b01-b69ad68324bd\") " pod="openshift-marketplace/certified-operators-fxfpr" Jan 22 13:59:53 crc kubenswrapper[4743]: I0122 13:59:53.984566 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-fb9nh\" (UniqueName: \"kubernetes.io/projected/eb2437f4-6442-4fab-9b01-b69ad68324bd-kube-api-access-fb9nh\") pod \"certified-operators-fxfpr\" (UID: \"eb2437f4-6442-4fab-9b01-b69ad68324bd\") " pod="openshift-marketplace/certified-operators-fxfpr" Jan 22 13:59:54 crc kubenswrapper[4743]: I0122 13:59:54.006352 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fxfpr" Jan 22 13:59:54 crc kubenswrapper[4743]: I0122 13:59:54.504906 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fxfpr"] Jan 22 13:59:54 crc kubenswrapper[4743]: W0122 13:59:54.516245 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb2437f4_6442_4fab_9b01_b69ad68324bd.slice/crio-cebd41b5b0a7976591176e66f138df62ed020c4042811fbaabccc6ea6d19a063 WatchSource:0}: Error finding container cebd41b5b0a7976591176e66f138df62ed020c4042811fbaabccc6ea6d19a063: Status 404 returned error can't find the container with id cebd41b5b0a7976591176e66f138df62ed020c4042811fbaabccc6ea6d19a063 Jan 22 13:59:54 crc kubenswrapper[4743]: I0122 13:59:54.934122 4743 generic.go:334] "Generic (PLEG): container finished" podID="eb2437f4-6442-4fab-9b01-b69ad68324bd" containerID="f03921aea0020b285c4cb55c79e58ad3897c2637441ca4bde9e659e9d3fbe0d9" exitCode=0 Jan 22 13:59:54 crc kubenswrapper[4743]: I0122 13:59:54.934164 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fxfpr" event={"ID":"eb2437f4-6442-4fab-9b01-b69ad68324bd","Type":"ContainerDied","Data":"f03921aea0020b285c4cb55c79e58ad3897c2637441ca4bde9e659e9d3fbe0d9"} Jan 22 13:59:54 crc kubenswrapper[4743]: I0122 13:59:54.934189 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fxfpr" event={"ID":"eb2437f4-6442-4fab-9b01-b69ad68324bd","Type":"ContainerStarted","Data":"cebd41b5b0a7976591176e66f138df62ed020c4042811fbaabccc6ea6d19a063"} Jan 22 13:59:56 crc kubenswrapper[4743]: I0122 13:59:56.946730 4743 generic.go:334] "Generic (PLEG): container finished" podID="eb2437f4-6442-4fab-9b01-b69ad68324bd" containerID="8713c527e39d946fe856a93f6de2ba5468eaa7e2c7a4fa10d4af5e68aa5f508f" exitCode=0 Jan 22 13:59:56 crc kubenswrapper[4743]: I0122 13:59:56.947044 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fxfpr" event={"ID":"eb2437f4-6442-4fab-9b01-b69ad68324bd","Type":"ContainerDied","Data":"8713c527e39d946fe856a93f6de2ba5468eaa7e2c7a4fa10d4af5e68aa5f508f"} Jan 22 13:59:57 crc kubenswrapper[4743]: I0122 13:59:57.604849 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-dsm72" Jan 22 13:59:57 crc kubenswrapper[4743]: I0122 13:59:57.605241 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-dsm72" Jan 22 13:59:57 crc kubenswrapper[4743]: I0122 13:59:57.665063 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-dsm72" Jan 22 13:59:57 crc kubenswrapper[4743]: I0122 13:59:57.954893 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fxfpr" 
event={"ID":"eb2437f4-6442-4fab-9b01-b69ad68324bd","Type":"ContainerStarted","Data":"974b35966d9d9f38664fd75f4fc35b7f6432b030766e33a2a4435282e9d1407d"} Jan 22 13:59:57 crc kubenswrapper[4743]: I0122 13:59:57.971701 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fxfpr" podStartSLOduration=2.331129174 podStartE2EDuration="4.971687244s" podCreationTimestamp="2026-01-22 13:59:53 +0000 UTC" firstStartedPulling="2026-01-22 13:59:54.936384584 +0000 UTC m=+831.491427747" lastFinishedPulling="2026-01-22 13:59:57.576942614 +0000 UTC m=+834.131985817" observedRunningTime="2026-01-22 13:59:57.968845058 +0000 UTC m=+834.523888221" watchObservedRunningTime="2026-01-22 13:59:57.971687244 +0000 UTC m=+834.526730407" Jan 22 13:59:57 crc kubenswrapper[4743]: I0122 13:59:57.988134 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-dsm72" Jan 22 13:59:59 crc kubenswrapper[4743]: I0122 13:59:59.954497 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/fb228b644d9a880fdedde3cfb113973472d7d7039cbb0f6e8a8e58fb345dhsd"] Jan 22 13:59:59 crc kubenswrapper[4743]: I0122 13:59:59.956263 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/fb228b644d9a880fdedde3cfb113973472d7d7039cbb0f6e8a8e58fb345dhsd" Jan 22 13:59:59 crc kubenswrapper[4743]: I0122 13:59:59.959239 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-2bpvm" Jan 22 14:00:00 crc kubenswrapper[4743]: I0122 14:00:00.000040 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/fb228b644d9a880fdedde3cfb113973472d7d7039cbb0f6e8a8e58fb345dhsd"] Jan 22 14:00:00 crc kubenswrapper[4743]: I0122 14:00:00.133758 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6cb6\" (UniqueName: \"kubernetes.io/projected/3c8fbd52-c53e-4cb9-9087-483276a7c607-kube-api-access-z6cb6\") pod \"fb228b644d9a880fdedde3cfb113973472d7d7039cbb0f6e8a8e58fb345dhsd\" (UID: \"3c8fbd52-c53e-4cb9-9087-483276a7c607\") " pod="openstack-operators/fb228b644d9a880fdedde3cfb113973472d7d7039cbb0f6e8a8e58fb345dhsd" Jan 22 14:00:00 crc kubenswrapper[4743]: I0122 14:00:00.133943 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3c8fbd52-c53e-4cb9-9087-483276a7c607-bundle\") pod \"fb228b644d9a880fdedde3cfb113973472d7d7039cbb0f6e8a8e58fb345dhsd\" (UID: \"3c8fbd52-c53e-4cb9-9087-483276a7c607\") " pod="openstack-operators/fb228b644d9a880fdedde3cfb113973472d7d7039cbb0f6e8a8e58fb345dhsd" Jan 22 14:00:00 crc kubenswrapper[4743]: I0122 14:00:00.133984 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3c8fbd52-c53e-4cb9-9087-483276a7c607-util\") pod \"fb228b644d9a880fdedde3cfb113973472d7d7039cbb0f6e8a8e58fb345dhsd\" (UID: \"3c8fbd52-c53e-4cb9-9087-483276a7c607\") " pod="openstack-operators/fb228b644d9a880fdedde3cfb113973472d7d7039cbb0f6e8a8e58fb345dhsd" Jan 22 14:00:00 crc kubenswrapper[4743]: I0122 14:00:00.143193 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29484840-d7hn6"] Jan 22 14:00:00 crc kubenswrapper[4743]: I0122 14:00:00.144080 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29484840-d7hn6" Jan 22 14:00:00 crc kubenswrapper[4743]: I0122 14:00:00.146605 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 22 14:00:00 crc kubenswrapper[4743]: I0122 14:00:00.146963 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 22 14:00:00 crc kubenswrapper[4743]: I0122 14:00:00.152711 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29484840-d7hn6"] Jan 22 14:00:00 crc kubenswrapper[4743]: I0122 14:00:00.236072 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e831f897-bb92-4058-98de-256be3386b9f-secret-volume\") pod \"collect-profiles-29484840-d7hn6\" (UID: \"e831f897-bb92-4058-98de-256be3386b9f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484840-d7hn6" Jan 22 14:00:00 crc kubenswrapper[4743]: I0122 14:00:00.236152 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6cb6\" (UniqueName: \"kubernetes.io/projected/3c8fbd52-c53e-4cb9-9087-483276a7c607-kube-api-access-z6cb6\") pod \"fb228b644d9a880fdedde3cfb113973472d7d7039cbb0f6e8a8e58fb345dhsd\" (UID: \"3c8fbd52-c53e-4cb9-9087-483276a7c607\") " pod="openstack-operators/fb228b644d9a880fdedde3cfb113973472d7d7039cbb0f6e8a8e58fb345dhsd" Jan 22 14:00:00 crc kubenswrapper[4743]: I0122 14:00:00.236205 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e831f897-bb92-4058-98de-256be3386b9f-config-volume\") pod \"collect-profiles-29484840-d7hn6\" (UID: \"e831f897-bb92-4058-98de-256be3386b9f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484840-d7hn6" Jan 22 14:00:00 crc kubenswrapper[4743]: I0122 14:00:00.236286 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7gxb\" (UniqueName: \"kubernetes.io/projected/e831f897-bb92-4058-98de-256be3386b9f-kube-api-access-x7gxb\") pod \"collect-profiles-29484840-d7hn6\" (UID: \"e831f897-bb92-4058-98de-256be3386b9f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484840-d7hn6" Jan 22 14:00:00 crc kubenswrapper[4743]: I0122 14:00:00.236329 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3c8fbd52-c53e-4cb9-9087-483276a7c607-bundle\") pod \"fb228b644d9a880fdedde3cfb113973472d7d7039cbb0f6e8a8e58fb345dhsd\" (UID: \"3c8fbd52-c53e-4cb9-9087-483276a7c607\") " pod="openstack-operators/fb228b644d9a880fdedde3cfb113973472d7d7039cbb0f6e8a8e58fb345dhsd" Jan 22 14:00:00 crc kubenswrapper[4743]: I0122 14:00:00.236360 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3c8fbd52-c53e-4cb9-9087-483276a7c607-util\") pod \"fb228b644d9a880fdedde3cfb113973472d7d7039cbb0f6e8a8e58fb345dhsd\" (UID: \"3c8fbd52-c53e-4cb9-9087-483276a7c607\") " pod="openstack-operators/fb228b644d9a880fdedde3cfb113973472d7d7039cbb0f6e8a8e58fb345dhsd" Jan 22 14:00:00 crc kubenswrapper[4743]: I0122 14:00:00.237026 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"util\" (UniqueName: \"kubernetes.io/empty-dir/3c8fbd52-c53e-4cb9-9087-483276a7c607-util\") pod \"fb228b644d9a880fdedde3cfb113973472d7d7039cbb0f6e8a8e58fb345dhsd\" (UID: \"3c8fbd52-c53e-4cb9-9087-483276a7c607\") " pod="openstack-operators/fb228b644d9a880fdedde3cfb113973472d7d7039cbb0f6e8a8e58fb345dhsd" Jan 22 14:00:00 crc kubenswrapper[4743]: I0122 14:00:00.237059 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3c8fbd52-c53e-4cb9-9087-483276a7c607-bundle\") pod \"fb228b644d9a880fdedde3cfb113973472d7d7039cbb0f6e8a8e58fb345dhsd\" (UID: \"3c8fbd52-c53e-4cb9-9087-483276a7c607\") " pod="openstack-operators/fb228b644d9a880fdedde3cfb113973472d7d7039cbb0f6e8a8e58fb345dhsd" Jan 22 14:00:00 crc kubenswrapper[4743]: I0122 14:00:00.258866 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6cb6\" (UniqueName: \"kubernetes.io/projected/3c8fbd52-c53e-4cb9-9087-483276a7c607-kube-api-access-z6cb6\") pod \"fb228b644d9a880fdedde3cfb113973472d7d7039cbb0f6e8a8e58fb345dhsd\" (UID: \"3c8fbd52-c53e-4cb9-9087-483276a7c607\") " pod="openstack-operators/fb228b644d9a880fdedde3cfb113973472d7d7039cbb0f6e8a8e58fb345dhsd" Jan 22 14:00:00 crc kubenswrapper[4743]: I0122 14:00:00.273879 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/fb228b644d9a880fdedde3cfb113973472d7d7039cbb0f6e8a8e58fb345dhsd" Jan 22 14:00:00 crc kubenswrapper[4743]: I0122 14:00:00.337622 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e831f897-bb92-4058-98de-256be3386b9f-config-volume\") pod \"collect-profiles-29484840-d7hn6\" (UID: \"e831f897-bb92-4058-98de-256be3386b9f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484840-d7hn6" Jan 22 14:00:00 crc kubenswrapper[4743]: I0122 14:00:00.338196 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7gxb\" (UniqueName: \"kubernetes.io/projected/e831f897-bb92-4058-98de-256be3386b9f-kube-api-access-x7gxb\") pod \"collect-profiles-29484840-d7hn6\" (UID: \"e831f897-bb92-4058-98de-256be3386b9f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484840-d7hn6" Jan 22 14:00:00 crc kubenswrapper[4743]: I0122 14:00:00.338358 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e831f897-bb92-4058-98de-256be3386b9f-secret-volume\") pod \"collect-profiles-29484840-d7hn6\" (UID: \"e831f897-bb92-4058-98de-256be3386b9f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484840-d7hn6" Jan 22 14:00:00 crc kubenswrapper[4743]: I0122 14:00:00.338863 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e831f897-bb92-4058-98de-256be3386b9f-config-volume\") pod \"collect-profiles-29484840-d7hn6\" (UID: \"e831f897-bb92-4058-98de-256be3386b9f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484840-d7hn6" Jan 22 14:00:00 crc kubenswrapper[4743]: I0122 14:00:00.343335 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e831f897-bb92-4058-98de-256be3386b9f-secret-volume\") pod \"collect-profiles-29484840-d7hn6\" (UID: \"e831f897-bb92-4058-98de-256be3386b9f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484840-d7hn6" Jan 22 
14:00:00 crc kubenswrapper[4743]: I0122 14:00:00.360912 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7gxb\" (UniqueName: \"kubernetes.io/projected/e831f897-bb92-4058-98de-256be3386b9f-kube-api-access-x7gxb\") pod \"collect-profiles-29484840-d7hn6\" (UID: \"e831f897-bb92-4058-98de-256be3386b9f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484840-d7hn6" Jan 22 14:00:00 crc kubenswrapper[4743]: I0122 14:00:00.461921 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29484840-d7hn6" Jan 22 14:00:00 crc kubenswrapper[4743]: I0122 14:00:00.715058 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/fb228b644d9a880fdedde3cfb113973472d7d7039cbb0f6e8a8e58fb345dhsd"] Jan 22 14:00:00 crc kubenswrapper[4743]: I0122 14:00:00.859368 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29484840-d7hn6"] Jan 22 14:00:00 crc kubenswrapper[4743]: W0122 14:00:00.860263 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode831f897_bb92_4058_98de_256be3386b9f.slice/crio-2c73467c73b989fa333e74bb01baf2e254186a55a07078401b918dafef94e55e WatchSource:0}: Error finding container 2c73467c73b989fa333e74bb01baf2e254186a55a07078401b918dafef94e55e: Status 404 returned error can't find the container with id 2c73467c73b989fa333e74bb01baf2e254186a55a07078401b918dafef94e55e Jan 22 14:00:00 crc kubenswrapper[4743]: I0122 14:00:00.977448 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/fb228b644d9a880fdedde3cfb113973472d7d7039cbb0f6e8a8e58fb345dhsd" event={"ID":"3c8fbd52-c53e-4cb9-9087-483276a7c607","Type":"ContainerStarted","Data":"1b6e3b4375931ba6351c031570af5f821b0b2bfde7e5f62eb53db69bedc3af8b"} Jan 22 14:00:00 crc kubenswrapper[4743]: I0122 14:00:00.978571 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29484840-d7hn6" event={"ID":"e831f897-bb92-4058-98de-256be3386b9f","Type":"ContainerStarted","Data":"2c73467c73b989fa333e74bb01baf2e254186a55a07078401b918dafef94e55e"} Jan 22 14:00:01 crc kubenswrapper[4743]: I0122 14:00:01.985949 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/fb228b644d9a880fdedde3cfb113973472d7d7039cbb0f6e8a8e58fb345dhsd" event={"ID":"3c8fbd52-c53e-4cb9-9087-483276a7c607","Type":"ContainerStarted","Data":"58258178a8cd88934c83546d57b57127b3cc3688752244898c13b036c9cf9bb8"} Jan 22 14:00:01 crc kubenswrapper[4743]: I0122 14:00:01.987309 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29484840-d7hn6" event={"ID":"e831f897-bb92-4058-98de-256be3386b9f","Type":"ContainerStarted","Data":"3df41680a87ee87067e2b60fbe46d5e4249f256f97129553264ba47462e1e5fa"} Jan 22 14:00:02 crc kubenswrapper[4743]: I0122 14:00:02.994661 4743 generic.go:334] "Generic (PLEG): container finished" podID="3c8fbd52-c53e-4cb9-9087-483276a7c607" containerID="58258178a8cd88934c83546d57b57127b3cc3688752244898c13b036c9cf9bb8" exitCode=0 Jan 22 14:00:02 crc kubenswrapper[4743]: I0122 14:00:02.994730 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/fb228b644d9a880fdedde3cfb113973472d7d7039cbb0f6e8a8e58fb345dhsd" 
event={"ID":"3c8fbd52-c53e-4cb9-9087-483276a7c607","Type":"ContainerDied","Data":"58258178a8cd88934c83546d57b57127b3cc3688752244898c13b036c9cf9bb8"} Jan 22 14:00:02 crc kubenswrapper[4743]: I0122 14:00:02.996134 4743 generic.go:334] "Generic (PLEG): container finished" podID="e831f897-bb92-4058-98de-256be3386b9f" containerID="3df41680a87ee87067e2b60fbe46d5e4249f256f97129553264ba47462e1e5fa" exitCode=0 Jan 22 14:00:02 crc kubenswrapper[4743]: I0122 14:00:02.996177 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29484840-d7hn6" event={"ID":"e831f897-bb92-4058-98de-256be3386b9f","Type":"ContainerDied","Data":"3df41680a87ee87067e2b60fbe46d5e4249f256f97129553264ba47462e1e5fa"} Jan 22 14:00:03 crc kubenswrapper[4743]: I0122 14:00:03.013105 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29484840-d7hn6" podStartSLOduration=3.0130726 podStartE2EDuration="3.0130726s" podCreationTimestamp="2026-01-22 14:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:00:02.021546468 +0000 UTC m=+838.576589641" watchObservedRunningTime="2026-01-22 14:00:03.0130726 +0000 UTC m=+839.568115763" Jan 22 14:00:04 crc kubenswrapper[4743]: I0122 14:00:04.004279 4743 generic.go:334] "Generic (PLEG): container finished" podID="3c8fbd52-c53e-4cb9-9087-483276a7c607" containerID="e32058b1f7ead4673c14037e9928740135fc1375f2a1fb3552821c631803bd59" exitCode=0 Jan 22 14:00:04 crc kubenswrapper[4743]: I0122 14:00:04.004370 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/fb228b644d9a880fdedde3cfb113973472d7d7039cbb0f6e8a8e58fb345dhsd" event={"ID":"3c8fbd52-c53e-4cb9-9087-483276a7c607","Type":"ContainerDied","Data":"e32058b1f7ead4673c14037e9928740135fc1375f2a1fb3552821c631803bd59"} Jan 22 14:00:04 crc kubenswrapper[4743]: I0122 14:00:04.006621 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fxfpr" Jan 22 14:00:04 crc kubenswrapper[4743]: I0122 14:00:04.006727 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fxfpr" Jan 22 14:00:04 crc kubenswrapper[4743]: I0122 14:00:04.088560 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fxfpr" Jan 22 14:00:04 crc kubenswrapper[4743]: I0122 14:00:04.278436 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29484840-d7hn6" Jan 22 14:00:04 crc kubenswrapper[4743]: I0122 14:00:04.397203 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e831f897-bb92-4058-98de-256be3386b9f-config-volume\") pod \"e831f897-bb92-4058-98de-256be3386b9f\" (UID: \"e831f897-bb92-4058-98de-256be3386b9f\") " Jan 22 14:00:04 crc kubenswrapper[4743]: I0122 14:00:04.397473 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e831f897-bb92-4058-98de-256be3386b9f-secret-volume\") pod \"e831f897-bb92-4058-98de-256be3386b9f\" (UID: \"e831f897-bb92-4058-98de-256be3386b9f\") " Jan 22 14:00:04 crc kubenswrapper[4743]: I0122 14:00:04.397517 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7gxb\" (UniqueName: \"kubernetes.io/projected/e831f897-bb92-4058-98de-256be3386b9f-kube-api-access-x7gxb\") pod \"e831f897-bb92-4058-98de-256be3386b9f\" (UID: \"e831f897-bb92-4058-98de-256be3386b9f\") " Jan 22 14:00:04 crc kubenswrapper[4743]: I0122 14:00:04.400280 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e831f897-bb92-4058-98de-256be3386b9f-config-volume" (OuterVolumeSpecName: "config-volume") pod "e831f897-bb92-4058-98de-256be3386b9f" (UID: "e831f897-bb92-4058-98de-256be3386b9f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:00:04 crc kubenswrapper[4743]: I0122 14:00:04.405812 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e831f897-bb92-4058-98de-256be3386b9f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e831f897-bb92-4058-98de-256be3386b9f" (UID: "e831f897-bb92-4058-98de-256be3386b9f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:00:04 crc kubenswrapper[4743]: I0122 14:00:04.407025 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e831f897-bb92-4058-98de-256be3386b9f-kube-api-access-x7gxb" (OuterVolumeSpecName: "kube-api-access-x7gxb") pod "e831f897-bb92-4058-98de-256be3386b9f" (UID: "e831f897-bb92-4058-98de-256be3386b9f"). InnerVolumeSpecName "kube-api-access-x7gxb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:00:04 crc kubenswrapper[4743]: I0122 14:00:04.499731 4743 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e831f897-bb92-4058-98de-256be3386b9f-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 22 14:00:04 crc kubenswrapper[4743]: I0122 14:00:04.500133 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7gxb\" (UniqueName: \"kubernetes.io/projected/e831f897-bb92-4058-98de-256be3386b9f-kube-api-access-x7gxb\") on node \"crc\" DevicePath \"\"" Jan 22 14:00:04 crc kubenswrapper[4743]: I0122 14:00:04.500204 4743 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e831f897-bb92-4058-98de-256be3386b9f-config-volume\") on node \"crc\" DevicePath \"\"" Jan 22 14:00:05 crc kubenswrapper[4743]: I0122 14:00:05.016282 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29484840-d7hn6" event={"ID":"e831f897-bb92-4058-98de-256be3386b9f","Type":"ContainerDied","Data":"2c73467c73b989fa333e74bb01baf2e254186a55a07078401b918dafef94e55e"} Jan 22 14:00:05 crc kubenswrapper[4743]: I0122 14:00:05.016606 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c73467c73b989fa333e74bb01baf2e254186a55a07078401b918dafef94e55e" Jan 22 14:00:05 crc kubenswrapper[4743]: I0122 14:00:05.016365 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29484840-d7hn6" Jan 22 14:00:05 crc kubenswrapper[4743]: I0122 14:00:05.019835 4743 generic.go:334] "Generic (PLEG): container finished" podID="3c8fbd52-c53e-4cb9-9087-483276a7c607" containerID="c3f6bdd73a9a45cd5c0eea66b73cf385ccd39d4a1cf1cd0a9b0e6c78e81bcdf1" exitCode=0 Jan 22 14:00:05 crc kubenswrapper[4743]: I0122 14:00:05.019966 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/fb228b644d9a880fdedde3cfb113973472d7d7039cbb0f6e8a8e58fb345dhsd" event={"ID":"3c8fbd52-c53e-4cb9-9087-483276a7c607","Type":"ContainerDied","Data":"c3f6bdd73a9a45cd5c0eea66b73cf385ccd39d4a1cf1cd0a9b0e6c78e81bcdf1"} Jan 22 14:00:05 crc kubenswrapper[4743]: I0122 14:00:05.075089 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fxfpr" Jan 22 14:00:06 crc kubenswrapper[4743]: I0122 14:00:06.302210 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/fb228b644d9a880fdedde3cfb113973472d7d7039cbb0f6e8a8e58fb345dhsd" Jan 22 14:00:06 crc kubenswrapper[4743]: I0122 14:00:06.426414 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6cb6\" (UniqueName: \"kubernetes.io/projected/3c8fbd52-c53e-4cb9-9087-483276a7c607-kube-api-access-z6cb6\") pod \"3c8fbd52-c53e-4cb9-9087-483276a7c607\" (UID: \"3c8fbd52-c53e-4cb9-9087-483276a7c607\") " Jan 22 14:00:06 crc kubenswrapper[4743]: I0122 14:00:06.426499 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3c8fbd52-c53e-4cb9-9087-483276a7c607-bundle\") pod \"3c8fbd52-c53e-4cb9-9087-483276a7c607\" (UID: \"3c8fbd52-c53e-4cb9-9087-483276a7c607\") " Jan 22 14:00:06 crc kubenswrapper[4743]: I0122 14:00:06.426623 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3c8fbd52-c53e-4cb9-9087-483276a7c607-util\") pod \"3c8fbd52-c53e-4cb9-9087-483276a7c607\" (UID: \"3c8fbd52-c53e-4cb9-9087-483276a7c607\") " Jan 22 14:00:06 crc kubenswrapper[4743]: I0122 14:00:06.428856 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c8fbd52-c53e-4cb9-9087-483276a7c607-bundle" (OuterVolumeSpecName: "bundle") pod "3c8fbd52-c53e-4cb9-9087-483276a7c607" (UID: "3c8fbd52-c53e-4cb9-9087-483276a7c607"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:00:06 crc kubenswrapper[4743]: I0122 14:00:06.434087 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c8fbd52-c53e-4cb9-9087-483276a7c607-kube-api-access-z6cb6" (OuterVolumeSpecName: "kube-api-access-z6cb6") pod "3c8fbd52-c53e-4cb9-9087-483276a7c607" (UID: "3c8fbd52-c53e-4cb9-9087-483276a7c607"). InnerVolumeSpecName "kube-api-access-z6cb6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:00:06 crc kubenswrapper[4743]: I0122 14:00:06.440079 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c8fbd52-c53e-4cb9-9087-483276a7c607-util" (OuterVolumeSpecName: "util") pod "3c8fbd52-c53e-4cb9-9087-483276a7c607" (UID: "3c8fbd52-c53e-4cb9-9087-483276a7c607"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:00:06 crc kubenswrapper[4743]: I0122 14:00:06.527960 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6cb6\" (UniqueName: \"kubernetes.io/projected/3c8fbd52-c53e-4cb9-9087-483276a7c607-kube-api-access-z6cb6\") on node \"crc\" DevicePath \"\"" Jan 22 14:00:06 crc kubenswrapper[4743]: I0122 14:00:06.528000 4743 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3c8fbd52-c53e-4cb9-9087-483276a7c607-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 14:00:06 crc kubenswrapper[4743]: I0122 14:00:06.528012 4743 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3c8fbd52-c53e-4cb9-9087-483276a7c607-util\") on node \"crc\" DevicePath \"\"" Jan 22 14:00:06 crc kubenswrapper[4743]: I0122 14:00:06.666839 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fxfpr"] Jan 22 14:00:07 crc kubenswrapper[4743]: I0122 14:00:07.034321 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/fb228b644d9a880fdedde3cfb113973472d7d7039cbb0f6e8a8e58fb345dhsd" event={"ID":"3c8fbd52-c53e-4cb9-9087-483276a7c607","Type":"ContainerDied","Data":"1b6e3b4375931ba6351c031570af5f821b0b2bfde7e5f62eb53db69bedc3af8b"} Jan 22 14:00:07 crc kubenswrapper[4743]: I0122 14:00:07.034679 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b6e3b4375931ba6351c031570af5f821b0b2bfde7e5f62eb53db69bedc3af8b" Jan 22 14:00:07 crc kubenswrapper[4743]: I0122 14:00:07.034495 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-fxfpr" podUID="eb2437f4-6442-4fab-9b01-b69ad68324bd" containerName="registry-server" containerID="cri-o://974b35966d9d9f38664fd75f4fc35b7f6432b030766e33a2a4435282e9d1407d" gracePeriod=2 Jan 22 14:00:07 crc kubenswrapper[4743]: I0122 14:00:07.034399 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/fb228b644d9a880fdedde3cfb113973472d7d7039cbb0f6e8a8e58fb345dhsd" Jan 22 14:00:07 crc kubenswrapper[4743]: I0122 14:00:07.444023 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fxfpr" Jan 22 14:00:07 crc kubenswrapper[4743]: I0122 14:00:07.645792 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb2437f4-6442-4fab-9b01-b69ad68324bd-catalog-content\") pod \"eb2437f4-6442-4fab-9b01-b69ad68324bd\" (UID: \"eb2437f4-6442-4fab-9b01-b69ad68324bd\") " Jan 22 14:00:07 crc kubenswrapper[4743]: I0122 14:00:07.646234 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb2437f4-6442-4fab-9b01-b69ad68324bd-utilities\") pod \"eb2437f4-6442-4fab-9b01-b69ad68324bd\" (UID: \"eb2437f4-6442-4fab-9b01-b69ad68324bd\") " Jan 22 14:00:07 crc kubenswrapper[4743]: I0122 14:00:07.646320 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fb9nh\" (UniqueName: \"kubernetes.io/projected/eb2437f4-6442-4fab-9b01-b69ad68324bd-kube-api-access-fb9nh\") pod \"eb2437f4-6442-4fab-9b01-b69ad68324bd\" (UID: \"eb2437f4-6442-4fab-9b01-b69ad68324bd\") " Jan 22 14:00:07 crc kubenswrapper[4743]: I0122 14:00:07.646896 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb2437f4-6442-4fab-9b01-b69ad68324bd-utilities" (OuterVolumeSpecName: "utilities") pod "eb2437f4-6442-4fab-9b01-b69ad68324bd" (UID: "eb2437f4-6442-4fab-9b01-b69ad68324bd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:00:07 crc kubenswrapper[4743]: I0122 14:00:07.650980 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb2437f4-6442-4fab-9b01-b69ad68324bd-kube-api-access-fb9nh" (OuterVolumeSpecName: "kube-api-access-fb9nh") pod "eb2437f4-6442-4fab-9b01-b69ad68324bd" (UID: "eb2437f4-6442-4fab-9b01-b69ad68324bd"). InnerVolumeSpecName "kube-api-access-fb9nh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:00:07 crc kubenswrapper[4743]: I0122 14:00:07.701482 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb2437f4-6442-4fab-9b01-b69ad68324bd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eb2437f4-6442-4fab-9b01-b69ad68324bd" (UID: "eb2437f4-6442-4fab-9b01-b69ad68324bd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:00:07 crc kubenswrapper[4743]: I0122 14:00:07.747657 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb2437f4-6442-4fab-9b01-b69ad68324bd-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 14:00:07 crc kubenswrapper[4743]: I0122 14:00:07.747705 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb2437f4-6442-4fab-9b01-b69ad68324bd-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 14:00:07 crc kubenswrapper[4743]: I0122 14:00:07.747719 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fb9nh\" (UniqueName: \"kubernetes.io/projected/eb2437f4-6442-4fab-9b01-b69ad68324bd-kube-api-access-fb9nh\") on node \"crc\" DevicePath \"\"" Jan 22 14:00:08 crc kubenswrapper[4743]: I0122 14:00:08.041837 4743 generic.go:334] "Generic (PLEG): container finished" podID="eb2437f4-6442-4fab-9b01-b69ad68324bd" containerID="974b35966d9d9f38664fd75f4fc35b7f6432b030766e33a2a4435282e9d1407d" exitCode=0 Jan 22 14:00:08 crc kubenswrapper[4743]: I0122 14:00:08.041883 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fxfpr" event={"ID":"eb2437f4-6442-4fab-9b01-b69ad68324bd","Type":"ContainerDied","Data":"974b35966d9d9f38664fd75f4fc35b7f6432b030766e33a2a4435282e9d1407d"} Jan 22 14:00:08 crc kubenswrapper[4743]: I0122 14:00:08.041912 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fxfpr" event={"ID":"eb2437f4-6442-4fab-9b01-b69ad68324bd","Type":"ContainerDied","Data":"cebd41b5b0a7976591176e66f138df62ed020c4042811fbaabccc6ea6d19a063"} Jan 22 14:00:08 crc kubenswrapper[4743]: I0122 14:00:08.041927 4743 scope.go:117] "RemoveContainer" containerID="974b35966d9d9f38664fd75f4fc35b7f6432b030766e33a2a4435282e9d1407d" Jan 22 14:00:08 crc kubenswrapper[4743]: I0122 14:00:08.042073 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fxfpr" Jan 22 14:00:08 crc kubenswrapper[4743]: I0122 14:00:08.060020 4743 scope.go:117] "RemoveContainer" containerID="8713c527e39d946fe856a93f6de2ba5468eaa7e2c7a4fa10d4af5e68aa5f508f" Jan 22 14:00:08 crc kubenswrapper[4743]: I0122 14:00:08.065382 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fxfpr"] Jan 22 14:00:08 crc kubenswrapper[4743]: I0122 14:00:08.071646 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-fxfpr"] Jan 22 14:00:08 crc kubenswrapper[4743]: I0122 14:00:08.098606 4743 scope.go:117] "RemoveContainer" containerID="f03921aea0020b285c4cb55c79e58ad3897c2637441ca4bde9e659e9d3fbe0d9" Jan 22 14:00:08 crc kubenswrapper[4743]: I0122 14:00:08.122902 4743 scope.go:117] "RemoveContainer" containerID="974b35966d9d9f38664fd75f4fc35b7f6432b030766e33a2a4435282e9d1407d" Jan 22 14:00:08 crc kubenswrapper[4743]: E0122 14:00:08.123343 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"974b35966d9d9f38664fd75f4fc35b7f6432b030766e33a2a4435282e9d1407d\": container with ID starting with 974b35966d9d9f38664fd75f4fc35b7f6432b030766e33a2a4435282e9d1407d not found: ID does not exist" containerID="974b35966d9d9f38664fd75f4fc35b7f6432b030766e33a2a4435282e9d1407d" Jan 22 14:00:08 crc kubenswrapper[4743]: I0122 14:00:08.123478 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"974b35966d9d9f38664fd75f4fc35b7f6432b030766e33a2a4435282e9d1407d"} err="failed to get container status \"974b35966d9d9f38664fd75f4fc35b7f6432b030766e33a2a4435282e9d1407d\": rpc error: code = NotFound desc = could not find container \"974b35966d9d9f38664fd75f4fc35b7f6432b030766e33a2a4435282e9d1407d\": container with ID starting with 974b35966d9d9f38664fd75f4fc35b7f6432b030766e33a2a4435282e9d1407d not found: ID does not exist" Jan 22 14:00:08 crc kubenswrapper[4743]: I0122 14:00:08.123579 4743 scope.go:117] "RemoveContainer" containerID="8713c527e39d946fe856a93f6de2ba5468eaa7e2c7a4fa10d4af5e68aa5f508f" Jan 22 14:00:08 crc kubenswrapper[4743]: E0122 14:00:08.124089 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8713c527e39d946fe856a93f6de2ba5468eaa7e2c7a4fa10d4af5e68aa5f508f\": container with ID starting with 8713c527e39d946fe856a93f6de2ba5468eaa7e2c7a4fa10d4af5e68aa5f508f not found: ID does not exist" containerID="8713c527e39d946fe856a93f6de2ba5468eaa7e2c7a4fa10d4af5e68aa5f508f" Jan 22 14:00:08 crc kubenswrapper[4743]: I0122 14:00:08.124121 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8713c527e39d946fe856a93f6de2ba5468eaa7e2c7a4fa10d4af5e68aa5f508f"} err="failed to get container status \"8713c527e39d946fe856a93f6de2ba5468eaa7e2c7a4fa10d4af5e68aa5f508f\": rpc error: code = NotFound desc = could not find container \"8713c527e39d946fe856a93f6de2ba5468eaa7e2c7a4fa10d4af5e68aa5f508f\": container with ID starting with 8713c527e39d946fe856a93f6de2ba5468eaa7e2c7a4fa10d4af5e68aa5f508f not found: ID does not exist" Jan 22 14:00:08 crc kubenswrapper[4743]: I0122 14:00:08.124141 4743 scope.go:117] "RemoveContainer" containerID="f03921aea0020b285c4cb55c79e58ad3897c2637441ca4bde9e659e9d3fbe0d9" Jan 22 14:00:08 crc kubenswrapper[4743]: E0122 14:00:08.124397 4743 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"f03921aea0020b285c4cb55c79e58ad3897c2637441ca4bde9e659e9d3fbe0d9\": container with ID starting with f03921aea0020b285c4cb55c79e58ad3897c2637441ca4bde9e659e9d3fbe0d9 not found: ID does not exist" containerID="f03921aea0020b285c4cb55c79e58ad3897c2637441ca4bde9e659e9d3fbe0d9" Jan 22 14:00:08 crc kubenswrapper[4743]: I0122 14:00:08.124424 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f03921aea0020b285c4cb55c79e58ad3897c2637441ca4bde9e659e9d3fbe0d9"} err="failed to get container status \"f03921aea0020b285c4cb55c79e58ad3897c2637441ca4bde9e659e9d3fbe0d9\": rpc error: code = NotFound desc = could not find container \"f03921aea0020b285c4cb55c79e58ad3897c2637441ca4bde9e659e9d3fbe0d9\": container with ID starting with f03921aea0020b285c4cb55c79e58ad3897c2637441ca4bde9e659e9d3fbe0d9 not found: ID does not exist" Jan 22 14:00:09 crc kubenswrapper[4743]: I0122 14:00:09.757710 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb2437f4-6442-4fab-9b01-b69ad68324bd" path="/var/lib/kubelet/pods/eb2437f4-6442-4fab-9b01-b69ad68324bd/volumes" Jan 22 14:00:11 crc kubenswrapper[4743]: I0122 14:00:11.391249 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-6ddb855d8-zmpc6"] Jan 22 14:00:11 crc kubenswrapper[4743]: E0122 14:00:11.391538 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c8fbd52-c53e-4cb9-9087-483276a7c607" containerName="extract" Jan 22 14:00:11 crc kubenswrapper[4743]: I0122 14:00:11.391555 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c8fbd52-c53e-4cb9-9087-483276a7c607" containerName="extract" Jan 22 14:00:11 crc kubenswrapper[4743]: E0122 14:00:11.391567 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb2437f4-6442-4fab-9b01-b69ad68324bd" containerName="extract-content" Jan 22 14:00:11 crc kubenswrapper[4743]: I0122 14:00:11.391575 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb2437f4-6442-4fab-9b01-b69ad68324bd" containerName="extract-content" Jan 22 14:00:11 crc kubenswrapper[4743]: E0122 14:00:11.391587 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb2437f4-6442-4fab-9b01-b69ad68324bd" containerName="extract-utilities" Jan 22 14:00:11 crc kubenswrapper[4743]: I0122 14:00:11.391594 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb2437f4-6442-4fab-9b01-b69ad68324bd" containerName="extract-utilities" Jan 22 14:00:11 crc kubenswrapper[4743]: E0122 14:00:11.391606 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c8fbd52-c53e-4cb9-9087-483276a7c607" containerName="pull" Jan 22 14:00:11 crc kubenswrapper[4743]: I0122 14:00:11.391615 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c8fbd52-c53e-4cb9-9087-483276a7c607" containerName="pull" Jan 22 14:00:11 crc kubenswrapper[4743]: E0122 14:00:11.391629 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb2437f4-6442-4fab-9b01-b69ad68324bd" containerName="registry-server" Jan 22 14:00:11 crc kubenswrapper[4743]: I0122 14:00:11.391635 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb2437f4-6442-4fab-9b01-b69ad68324bd" containerName="registry-server" Jan 22 14:00:11 crc kubenswrapper[4743]: E0122 14:00:11.391648 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e831f897-bb92-4058-98de-256be3386b9f" containerName="collect-profiles" Jan 22 
14:00:11 crc kubenswrapper[4743]: I0122 14:00:11.391656 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e831f897-bb92-4058-98de-256be3386b9f" containerName="collect-profiles" Jan 22 14:00:11 crc kubenswrapper[4743]: E0122 14:00:11.391669 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c8fbd52-c53e-4cb9-9087-483276a7c607" containerName="util" Jan 22 14:00:11 crc kubenswrapper[4743]: I0122 14:00:11.391677 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c8fbd52-c53e-4cb9-9087-483276a7c607" containerName="util" Jan 22 14:00:11 crc kubenswrapper[4743]: I0122 14:00:11.391826 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="e831f897-bb92-4058-98de-256be3386b9f" containerName="collect-profiles" Jan 22 14:00:11 crc kubenswrapper[4743]: I0122 14:00:11.391847 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c8fbd52-c53e-4cb9-9087-483276a7c607" containerName="extract" Jan 22 14:00:11 crc kubenswrapper[4743]: I0122 14:00:11.391858 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb2437f4-6442-4fab-9b01-b69ad68324bd" containerName="registry-server" Jan 22 14:00:11 crc kubenswrapper[4743]: I0122 14:00:11.392342 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6ddb855d8-zmpc6" Jan 22 14:00:11 crc kubenswrapper[4743]: I0122 14:00:11.394696 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-jfs5x" Jan 22 14:00:11 crc kubenswrapper[4743]: I0122 14:00:11.413503 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6ddb855d8-zmpc6"] Jan 22 14:00:11 crc kubenswrapper[4743]: I0122 14:00:11.523383 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7764t\" (UniqueName: \"kubernetes.io/projected/80c8233a-0396-4d24-8212-53346af8d405-kube-api-access-7764t\") pod \"openstack-operator-controller-init-6ddb855d8-zmpc6\" (UID: \"80c8233a-0396-4d24-8212-53346af8d405\") " pod="openstack-operators/openstack-operator-controller-init-6ddb855d8-zmpc6" Jan 22 14:00:11 crc kubenswrapper[4743]: I0122 14:00:11.626097 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7764t\" (UniqueName: \"kubernetes.io/projected/80c8233a-0396-4d24-8212-53346af8d405-kube-api-access-7764t\") pod \"openstack-operator-controller-init-6ddb855d8-zmpc6\" (UID: \"80c8233a-0396-4d24-8212-53346af8d405\") " pod="openstack-operators/openstack-operator-controller-init-6ddb855d8-zmpc6" Jan 22 14:00:11 crc kubenswrapper[4743]: I0122 14:00:11.648627 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7764t\" (UniqueName: \"kubernetes.io/projected/80c8233a-0396-4d24-8212-53346af8d405-kube-api-access-7764t\") pod \"openstack-operator-controller-init-6ddb855d8-zmpc6\" (UID: \"80c8233a-0396-4d24-8212-53346af8d405\") " pod="openstack-operators/openstack-operator-controller-init-6ddb855d8-zmpc6" Jan 22 14:00:11 crc kubenswrapper[4743]: I0122 14:00:11.709557 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6ddb855d8-zmpc6" Jan 22 14:00:11 crc kubenswrapper[4743]: I0122 14:00:11.875558 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-v7cnb"] Jan 22 14:00:11 crc kubenswrapper[4743]: I0122 14:00:11.878792 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v7cnb" Jan 22 14:00:11 crc kubenswrapper[4743]: I0122 14:00:11.900192 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-v7cnb"] Jan 22 14:00:11 crc kubenswrapper[4743]: I0122 14:00:11.975928 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6ddb855d8-zmpc6"] Jan 22 14:00:12 crc kubenswrapper[4743]: I0122 14:00:12.032183 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ae39979-4adc-468d-a4f1-2d544221e822-utilities\") pod \"redhat-marketplace-v7cnb\" (UID: \"4ae39979-4adc-468d-a4f1-2d544221e822\") " pod="openshift-marketplace/redhat-marketplace-v7cnb" Jan 22 14:00:12 crc kubenswrapper[4743]: I0122 14:00:12.032232 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v75ds\" (UniqueName: \"kubernetes.io/projected/4ae39979-4adc-468d-a4f1-2d544221e822-kube-api-access-v75ds\") pod \"redhat-marketplace-v7cnb\" (UID: \"4ae39979-4adc-468d-a4f1-2d544221e822\") " pod="openshift-marketplace/redhat-marketplace-v7cnb" Jan 22 14:00:12 crc kubenswrapper[4743]: I0122 14:00:12.032252 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ae39979-4adc-468d-a4f1-2d544221e822-catalog-content\") pod \"redhat-marketplace-v7cnb\" (UID: \"4ae39979-4adc-468d-a4f1-2d544221e822\") " pod="openshift-marketplace/redhat-marketplace-v7cnb" Jan 22 14:00:12 crc kubenswrapper[4743]: I0122 14:00:12.069989 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6ddb855d8-zmpc6" event={"ID":"80c8233a-0396-4d24-8212-53346af8d405","Type":"ContainerStarted","Data":"f54f811b013e79405379013e0068d7fd99e386a6f6e19f12deaeffe6e73e64f8"} Jan 22 14:00:12 crc kubenswrapper[4743]: I0122 14:00:12.133298 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ae39979-4adc-468d-a4f1-2d544221e822-utilities\") pod \"redhat-marketplace-v7cnb\" (UID: \"4ae39979-4adc-468d-a4f1-2d544221e822\") " pod="openshift-marketplace/redhat-marketplace-v7cnb" Jan 22 14:00:12 crc kubenswrapper[4743]: I0122 14:00:12.133347 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v75ds\" (UniqueName: \"kubernetes.io/projected/4ae39979-4adc-468d-a4f1-2d544221e822-kube-api-access-v75ds\") pod \"redhat-marketplace-v7cnb\" (UID: \"4ae39979-4adc-468d-a4f1-2d544221e822\") " pod="openshift-marketplace/redhat-marketplace-v7cnb" Jan 22 14:00:12 crc kubenswrapper[4743]: I0122 14:00:12.133363 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ae39979-4adc-468d-a4f1-2d544221e822-catalog-content\") pod \"redhat-marketplace-v7cnb\" (UID: \"4ae39979-4adc-468d-a4f1-2d544221e822\") " 
pod="openshift-marketplace/redhat-marketplace-v7cnb" Jan 22 14:00:12 crc kubenswrapper[4743]: I0122 14:00:12.133964 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ae39979-4adc-468d-a4f1-2d544221e822-catalog-content\") pod \"redhat-marketplace-v7cnb\" (UID: \"4ae39979-4adc-468d-a4f1-2d544221e822\") " pod="openshift-marketplace/redhat-marketplace-v7cnb" Jan 22 14:00:12 crc kubenswrapper[4743]: I0122 14:00:12.133997 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ae39979-4adc-468d-a4f1-2d544221e822-utilities\") pod \"redhat-marketplace-v7cnb\" (UID: \"4ae39979-4adc-468d-a4f1-2d544221e822\") " pod="openshift-marketplace/redhat-marketplace-v7cnb" Jan 22 14:00:12 crc kubenswrapper[4743]: I0122 14:00:12.153476 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v75ds\" (UniqueName: \"kubernetes.io/projected/4ae39979-4adc-468d-a4f1-2d544221e822-kube-api-access-v75ds\") pod \"redhat-marketplace-v7cnb\" (UID: \"4ae39979-4adc-468d-a4f1-2d544221e822\") " pod="openshift-marketplace/redhat-marketplace-v7cnb" Jan 22 14:00:12 crc kubenswrapper[4743]: I0122 14:00:12.205268 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v7cnb" Jan 22 14:00:12 crc kubenswrapper[4743]: I0122 14:00:12.408298 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-v7cnb"] Jan 22 14:00:12 crc kubenswrapper[4743]: W0122 14:00:12.423201 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ae39979_4adc_468d_a4f1_2d544221e822.slice/crio-cb922c7dc794122734d77df26c293ff3955e3b9aed774ad6198f0ea61becdcb3 WatchSource:0}: Error finding container cb922c7dc794122734d77df26c293ff3955e3b9aed774ad6198f0ea61becdcb3: Status 404 returned error can't find the container with id cb922c7dc794122734d77df26c293ff3955e3b9aed774ad6198f0ea61becdcb3 Jan 22 14:00:13 crc kubenswrapper[4743]: I0122 14:00:13.077847 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v7cnb" event={"ID":"4ae39979-4adc-468d-a4f1-2d544221e822","Type":"ContainerStarted","Data":"cb922c7dc794122734d77df26c293ff3955e3b9aed774ad6198f0ea61becdcb3"} Jan 22 14:00:14 crc kubenswrapper[4743]: I0122 14:00:14.087852 4743 generic.go:334] "Generic (PLEG): container finished" podID="4ae39979-4adc-468d-a4f1-2d544221e822" containerID="7526a94c92a3c5c82310c9066116773422fad30b3ede613d6cfc91dbf7fdd96d" exitCode=0 Jan 22 14:00:14 crc kubenswrapper[4743]: I0122 14:00:14.087898 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v7cnb" event={"ID":"4ae39979-4adc-468d-a4f1-2d544221e822","Type":"ContainerDied","Data":"7526a94c92a3c5c82310c9066116773422fad30b3ede613d6cfc91dbf7fdd96d"} Jan 22 14:00:19 crc kubenswrapper[4743]: I0122 14:00:19.119138 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6ddb855d8-zmpc6" event={"ID":"80c8233a-0396-4d24-8212-53346af8d405","Type":"ContainerStarted","Data":"ac61338cce5346752b76c6fd27720eca4412d659a03f41356074b5585ae63dcf"} Jan 22 14:00:19 crc kubenswrapper[4743]: I0122 14:00:19.119642 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/openstack-operator-controller-init-6ddb855d8-zmpc6" Jan 22 14:00:19 crc kubenswrapper[4743]: I0122 14:00:19.120706 4743 generic.go:334] "Generic (PLEG): container finished" podID="4ae39979-4adc-468d-a4f1-2d544221e822" containerID="e285d17b619f28e4901b0c70c7471f51fe328a556568329612faf499c9bb7443" exitCode=0 Jan 22 14:00:19 crc kubenswrapper[4743]: I0122 14:00:19.120733 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v7cnb" event={"ID":"4ae39979-4adc-468d-a4f1-2d544221e822","Type":"ContainerDied","Data":"e285d17b619f28e4901b0c70c7471f51fe328a556568329612faf499c9bb7443"} Jan 22 14:00:19 crc kubenswrapper[4743]: I0122 14:00:19.149929 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-6ddb855d8-zmpc6" podStartSLOduration=1.327897001 podStartE2EDuration="8.14991509s" podCreationTimestamp="2026-01-22 14:00:11 +0000 UTC" firstStartedPulling="2026-01-22 14:00:11.979363655 +0000 UTC m=+848.534406818" lastFinishedPulling="2026-01-22 14:00:18.801381754 +0000 UTC m=+855.356424907" observedRunningTime="2026-01-22 14:00:19.144574407 +0000 UTC m=+855.699617590" watchObservedRunningTime="2026-01-22 14:00:19.14991509 +0000 UTC m=+855.704958253" Jan 22 14:00:20 crc kubenswrapper[4743]: I0122 14:00:20.128809 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v7cnb" event={"ID":"4ae39979-4adc-468d-a4f1-2d544221e822","Type":"ContainerStarted","Data":"e3e0a5d2c71efe12e2e20fe4683117ad34d66b85b97edb0fce8c28c0e62ae0ca"} Jan 22 14:00:20 crc kubenswrapper[4743]: I0122 14:00:20.150222 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-v7cnb" podStartSLOduration=3.702149265 podStartE2EDuration="9.150206554s" podCreationTimestamp="2026-01-22 14:00:11 +0000 UTC" firstStartedPulling="2026-01-22 14:00:14.090214312 +0000 UTC m=+850.645257465" lastFinishedPulling="2026-01-22 14:00:19.538271591 +0000 UTC m=+856.093314754" observedRunningTime="2026-01-22 14:00:20.145406835 +0000 UTC m=+856.700449998" watchObservedRunningTime="2026-01-22 14:00:20.150206554 +0000 UTC m=+856.705249707" Jan 22 14:00:22 crc kubenswrapper[4743]: I0122 14:00:22.205607 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-v7cnb" Jan 22 14:00:22 crc kubenswrapper[4743]: I0122 14:00:22.206028 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-v7cnb" Jan 22 14:00:22 crc kubenswrapper[4743]: I0122 14:00:22.256351 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-v7cnb" Jan 22 14:00:25 crc kubenswrapper[4743]: I0122 14:00:25.672534 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7xhnf"] Jan 22 14:00:25 crc kubenswrapper[4743]: I0122 14:00:25.673672 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7xhnf" Jan 22 14:00:25 crc kubenswrapper[4743]: I0122 14:00:25.680973 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7xhnf"] Jan 22 14:00:25 crc kubenswrapper[4743]: I0122 14:00:25.773001 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1f18e46-7670-4214-b2b5-77e0597b44b8-utilities\") pod \"community-operators-7xhnf\" (UID: \"f1f18e46-7670-4214-b2b5-77e0597b44b8\") " pod="openshift-marketplace/community-operators-7xhnf" Jan 22 14:00:25 crc kubenswrapper[4743]: I0122 14:00:25.773347 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bswgg\" (UniqueName: \"kubernetes.io/projected/f1f18e46-7670-4214-b2b5-77e0597b44b8-kube-api-access-bswgg\") pod \"community-operators-7xhnf\" (UID: \"f1f18e46-7670-4214-b2b5-77e0597b44b8\") " pod="openshift-marketplace/community-operators-7xhnf" Jan 22 14:00:25 crc kubenswrapper[4743]: I0122 14:00:25.773474 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1f18e46-7670-4214-b2b5-77e0597b44b8-catalog-content\") pod \"community-operators-7xhnf\" (UID: \"f1f18e46-7670-4214-b2b5-77e0597b44b8\") " pod="openshift-marketplace/community-operators-7xhnf" Jan 22 14:00:25 crc kubenswrapper[4743]: I0122 14:00:25.874512 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1f18e46-7670-4214-b2b5-77e0597b44b8-catalog-content\") pod \"community-operators-7xhnf\" (UID: \"f1f18e46-7670-4214-b2b5-77e0597b44b8\") " pod="openshift-marketplace/community-operators-7xhnf" Jan 22 14:00:25 crc kubenswrapper[4743]: I0122 14:00:25.874622 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1f18e46-7670-4214-b2b5-77e0597b44b8-utilities\") pod \"community-operators-7xhnf\" (UID: \"f1f18e46-7670-4214-b2b5-77e0597b44b8\") " pod="openshift-marketplace/community-operators-7xhnf" Jan 22 14:00:25 crc kubenswrapper[4743]: I0122 14:00:25.874681 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bswgg\" (UniqueName: \"kubernetes.io/projected/f1f18e46-7670-4214-b2b5-77e0597b44b8-kube-api-access-bswgg\") pod \"community-operators-7xhnf\" (UID: \"f1f18e46-7670-4214-b2b5-77e0597b44b8\") " pod="openshift-marketplace/community-operators-7xhnf" Jan 22 14:00:25 crc kubenswrapper[4743]: I0122 14:00:25.875600 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1f18e46-7670-4214-b2b5-77e0597b44b8-utilities\") pod \"community-operators-7xhnf\" (UID: \"f1f18e46-7670-4214-b2b5-77e0597b44b8\") " pod="openshift-marketplace/community-operators-7xhnf" Jan 22 14:00:25 crc kubenswrapper[4743]: I0122 14:00:25.875644 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1f18e46-7670-4214-b2b5-77e0597b44b8-catalog-content\") pod \"community-operators-7xhnf\" (UID: \"f1f18e46-7670-4214-b2b5-77e0597b44b8\") " pod="openshift-marketplace/community-operators-7xhnf" Jan 22 14:00:25 crc kubenswrapper[4743]: I0122 14:00:25.905919 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-bswgg\" (UniqueName: \"kubernetes.io/projected/f1f18e46-7670-4214-b2b5-77e0597b44b8-kube-api-access-bswgg\") pod \"community-operators-7xhnf\" (UID: \"f1f18e46-7670-4214-b2b5-77e0597b44b8\") " pod="openshift-marketplace/community-operators-7xhnf" Jan 22 14:00:26 crc kubenswrapper[4743]: I0122 14:00:26.033298 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7xhnf" Jan 22 14:00:26 crc kubenswrapper[4743]: I0122 14:00:26.511388 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7xhnf"] Jan 22 14:00:27 crc kubenswrapper[4743]: I0122 14:00:27.175680 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7xhnf" event={"ID":"f1f18e46-7670-4214-b2b5-77e0597b44b8","Type":"ContainerStarted","Data":"c5e3e62ead33c3aa2afe6974e09858c67c536d000a57cc61ffba0c1ef24d56d3"} Jan 22 14:00:28 crc kubenswrapper[4743]: I0122 14:00:28.185096 4743 generic.go:334] "Generic (PLEG): container finished" podID="f1f18e46-7670-4214-b2b5-77e0597b44b8" containerID="aa030e04c439a2333459e9fc99fb1aa4b1249fe0d5d8dd676764fb2ea9ed1e00" exitCode=0 Jan 22 14:00:28 crc kubenswrapper[4743]: I0122 14:00:28.185152 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7xhnf" event={"ID":"f1f18e46-7670-4214-b2b5-77e0597b44b8","Type":"ContainerDied","Data":"aa030e04c439a2333459e9fc99fb1aa4b1249fe0d5d8dd676764fb2ea9ed1e00"} Jan 22 14:00:30 crc kubenswrapper[4743]: I0122 14:00:30.049623 4743 patch_prober.go:28] interesting pod/machine-config-daemon-hqgk7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 14:00:30 crc kubenswrapper[4743]: I0122 14:00:30.049688 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 14:00:30 crc kubenswrapper[4743]: I0122 14:00:30.198350 4743 generic.go:334] "Generic (PLEG): container finished" podID="f1f18e46-7670-4214-b2b5-77e0597b44b8" containerID="26c49f3cd81beae7b6ed64febec7af93dc6f5e6bc06cd1dc7626f096677ee092" exitCode=0 Jan 22 14:00:30 crc kubenswrapper[4743]: I0122 14:00:30.198397 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7xhnf" event={"ID":"f1f18e46-7670-4214-b2b5-77e0597b44b8","Type":"ContainerDied","Data":"26c49f3cd81beae7b6ed64febec7af93dc6f5e6bc06cd1dc7626f096677ee092"} Jan 22 14:00:31 crc kubenswrapper[4743]: I0122 14:00:31.206280 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7xhnf" event={"ID":"f1f18e46-7670-4214-b2b5-77e0597b44b8","Type":"ContainerStarted","Data":"ac8fd7a91265240154ddb98bd13182a31fdbd415829aadc819f2f11e28eee118"} Jan 22 14:00:31 crc kubenswrapper[4743]: I0122 14:00:31.226920 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7xhnf" podStartSLOduration=3.821630416 podStartE2EDuration="6.226898081s" podCreationTimestamp="2026-01-22 14:00:25 +0000 UTC" 
firstStartedPulling="2026-01-22 14:00:28.187799737 +0000 UTC m=+864.742842900" lastFinishedPulling="2026-01-22 14:00:30.593067402 +0000 UTC m=+867.148110565" observedRunningTime="2026-01-22 14:00:31.221626629 +0000 UTC m=+867.776669792" watchObservedRunningTime="2026-01-22 14:00:31.226898081 +0000 UTC m=+867.781941244" Jan 22 14:00:31 crc kubenswrapper[4743]: I0122 14:00:31.712816 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-6ddb855d8-zmpc6" Jan 22 14:00:32 crc kubenswrapper[4743]: I0122 14:00:32.246031 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-v7cnb" Jan 22 14:00:36 crc kubenswrapper[4743]: I0122 14:00:36.033694 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7xhnf" Jan 22 14:00:36 crc kubenswrapper[4743]: I0122 14:00:36.034151 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7xhnf" Jan 22 14:00:36 crc kubenswrapper[4743]: I0122 14:00:36.081368 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7xhnf" Jan 22 14:00:36 crc kubenswrapper[4743]: I0122 14:00:36.277823 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7xhnf" Jan 22 14:00:37 crc kubenswrapper[4743]: I0122 14:00:37.071187 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-v7cnb"] Jan 22 14:00:37 crc kubenswrapper[4743]: I0122 14:00:37.071452 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-v7cnb" podUID="4ae39979-4adc-468d-a4f1-2d544221e822" containerName="registry-server" containerID="cri-o://e3e0a5d2c71efe12e2e20fe4683117ad34d66b85b97edb0fce8c28c0e62ae0ca" gracePeriod=2 Jan 22 14:00:37 crc kubenswrapper[4743]: I0122 14:00:37.668384 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7xhnf"] Jan 22 14:00:38 crc kubenswrapper[4743]: I0122 14:00:38.251022 4743 generic.go:334] "Generic (PLEG): container finished" podID="4ae39979-4adc-468d-a4f1-2d544221e822" containerID="e3e0a5d2c71efe12e2e20fe4683117ad34d66b85b97edb0fce8c28c0e62ae0ca" exitCode=0 Jan 22 14:00:38 crc kubenswrapper[4743]: I0122 14:00:38.251117 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v7cnb" event={"ID":"4ae39979-4adc-468d-a4f1-2d544221e822","Type":"ContainerDied","Data":"e3e0a5d2c71efe12e2e20fe4683117ad34d66b85b97edb0fce8c28c0e62ae0ca"} Jan 22 14:00:38 crc kubenswrapper[4743]: I0122 14:00:38.252109 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7xhnf" podUID="f1f18e46-7670-4214-b2b5-77e0597b44b8" containerName="registry-server" containerID="cri-o://ac8fd7a91265240154ddb98bd13182a31fdbd415829aadc819f2f11e28eee118" gracePeriod=2 Jan 22 14:00:38 crc kubenswrapper[4743]: I0122 14:00:38.586665 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v7cnb" Jan 22 14:00:38 crc kubenswrapper[4743]: I0122 14:00:38.719869 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7xhnf" Jan 22 14:00:38 crc kubenswrapper[4743]: I0122 14:00:38.750621 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ae39979-4adc-468d-a4f1-2d544221e822-utilities\") pod \"4ae39979-4adc-468d-a4f1-2d544221e822\" (UID: \"4ae39979-4adc-468d-a4f1-2d544221e822\") " Jan 22 14:00:38 crc kubenswrapper[4743]: I0122 14:00:38.750815 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ae39979-4adc-468d-a4f1-2d544221e822-catalog-content\") pod \"4ae39979-4adc-468d-a4f1-2d544221e822\" (UID: \"4ae39979-4adc-468d-a4f1-2d544221e822\") " Jan 22 14:00:38 crc kubenswrapper[4743]: I0122 14:00:38.750916 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v75ds\" (UniqueName: \"kubernetes.io/projected/4ae39979-4adc-468d-a4f1-2d544221e822-kube-api-access-v75ds\") pod \"4ae39979-4adc-468d-a4f1-2d544221e822\" (UID: \"4ae39979-4adc-468d-a4f1-2d544221e822\") " Jan 22 14:00:38 crc kubenswrapper[4743]: I0122 14:00:38.752839 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ae39979-4adc-468d-a4f1-2d544221e822-utilities" (OuterVolumeSpecName: "utilities") pod "4ae39979-4adc-468d-a4f1-2d544221e822" (UID: "4ae39979-4adc-468d-a4f1-2d544221e822"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:00:38 crc kubenswrapper[4743]: I0122 14:00:38.759202 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ae39979-4adc-468d-a4f1-2d544221e822-kube-api-access-v75ds" (OuterVolumeSpecName: "kube-api-access-v75ds") pod "4ae39979-4adc-468d-a4f1-2d544221e822" (UID: "4ae39979-4adc-468d-a4f1-2d544221e822"). InnerVolumeSpecName "kube-api-access-v75ds". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:00:38 crc kubenswrapper[4743]: I0122 14:00:38.777538 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ae39979-4adc-468d-a4f1-2d544221e822-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4ae39979-4adc-468d-a4f1-2d544221e822" (UID: "4ae39979-4adc-468d-a4f1-2d544221e822"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:00:38 crc kubenswrapper[4743]: I0122 14:00:38.852432 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bswgg\" (UniqueName: \"kubernetes.io/projected/f1f18e46-7670-4214-b2b5-77e0597b44b8-kube-api-access-bswgg\") pod \"f1f18e46-7670-4214-b2b5-77e0597b44b8\" (UID: \"f1f18e46-7670-4214-b2b5-77e0597b44b8\") " Jan 22 14:00:38 crc kubenswrapper[4743]: I0122 14:00:38.852502 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1f18e46-7670-4214-b2b5-77e0597b44b8-catalog-content\") pod \"f1f18e46-7670-4214-b2b5-77e0597b44b8\" (UID: \"f1f18e46-7670-4214-b2b5-77e0597b44b8\") " Jan 22 14:00:38 crc kubenswrapper[4743]: I0122 14:00:38.852524 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1f18e46-7670-4214-b2b5-77e0597b44b8-utilities\") pod \"f1f18e46-7670-4214-b2b5-77e0597b44b8\" (UID: \"f1f18e46-7670-4214-b2b5-77e0597b44b8\") " Jan 22 14:00:38 crc kubenswrapper[4743]: I0122 14:00:38.852912 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ae39979-4adc-468d-a4f1-2d544221e822-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 14:00:38 crc kubenswrapper[4743]: I0122 14:00:38.852927 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v75ds\" (UniqueName: \"kubernetes.io/projected/4ae39979-4adc-468d-a4f1-2d544221e822-kube-api-access-v75ds\") on node \"crc\" DevicePath \"\"" Jan 22 14:00:38 crc kubenswrapper[4743]: I0122 14:00:38.852937 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ae39979-4adc-468d-a4f1-2d544221e822-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 14:00:38 crc kubenswrapper[4743]: I0122 14:00:38.853956 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1f18e46-7670-4214-b2b5-77e0597b44b8-utilities" (OuterVolumeSpecName: "utilities") pod "f1f18e46-7670-4214-b2b5-77e0597b44b8" (UID: "f1f18e46-7670-4214-b2b5-77e0597b44b8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:00:38 crc kubenswrapper[4743]: I0122 14:00:38.857008 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1f18e46-7670-4214-b2b5-77e0597b44b8-kube-api-access-bswgg" (OuterVolumeSpecName: "kube-api-access-bswgg") pod "f1f18e46-7670-4214-b2b5-77e0597b44b8" (UID: "f1f18e46-7670-4214-b2b5-77e0597b44b8"). InnerVolumeSpecName "kube-api-access-bswgg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:00:38 crc kubenswrapper[4743]: I0122 14:00:38.926982 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1f18e46-7670-4214-b2b5-77e0597b44b8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f1f18e46-7670-4214-b2b5-77e0597b44b8" (UID: "f1f18e46-7670-4214-b2b5-77e0597b44b8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:00:38 crc kubenswrapper[4743]: I0122 14:00:38.954299 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bswgg\" (UniqueName: \"kubernetes.io/projected/f1f18e46-7670-4214-b2b5-77e0597b44b8-kube-api-access-bswgg\") on node \"crc\" DevicePath \"\"" Jan 22 14:00:38 crc kubenswrapper[4743]: I0122 14:00:38.954338 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1f18e46-7670-4214-b2b5-77e0597b44b8-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 14:00:38 crc kubenswrapper[4743]: I0122 14:00:38.954349 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1f18e46-7670-4214-b2b5-77e0597b44b8-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 14:00:39 crc kubenswrapper[4743]: I0122 14:00:39.260712 4743 generic.go:334] "Generic (PLEG): container finished" podID="f1f18e46-7670-4214-b2b5-77e0597b44b8" containerID="ac8fd7a91265240154ddb98bd13182a31fdbd415829aadc819f2f11e28eee118" exitCode=0 Jan 22 14:00:39 crc kubenswrapper[4743]: I0122 14:00:39.260777 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7xhnf" Jan 22 14:00:39 crc kubenswrapper[4743]: I0122 14:00:39.260777 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7xhnf" event={"ID":"f1f18e46-7670-4214-b2b5-77e0597b44b8","Type":"ContainerDied","Data":"ac8fd7a91265240154ddb98bd13182a31fdbd415829aadc819f2f11e28eee118"} Jan 22 14:00:39 crc kubenswrapper[4743]: I0122 14:00:39.261025 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7xhnf" event={"ID":"f1f18e46-7670-4214-b2b5-77e0597b44b8","Type":"ContainerDied","Data":"c5e3e62ead33c3aa2afe6974e09858c67c536d000a57cc61ffba0c1ef24d56d3"} Jan 22 14:00:39 crc kubenswrapper[4743]: I0122 14:00:39.261076 4743 scope.go:117] "RemoveContainer" containerID="ac8fd7a91265240154ddb98bd13182a31fdbd415829aadc819f2f11e28eee118" Jan 22 14:00:39 crc kubenswrapper[4743]: I0122 14:00:39.264694 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v7cnb" event={"ID":"4ae39979-4adc-468d-a4f1-2d544221e822","Type":"ContainerDied","Data":"cb922c7dc794122734d77df26c293ff3955e3b9aed774ad6198f0ea61becdcb3"} Jan 22 14:00:39 crc kubenswrapper[4743]: I0122 14:00:39.264882 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v7cnb" Jan 22 14:00:39 crc kubenswrapper[4743]: I0122 14:00:39.290859 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7xhnf"] Jan 22 14:00:39 crc kubenswrapper[4743]: I0122 14:00:39.292197 4743 scope.go:117] "RemoveContainer" containerID="26c49f3cd81beae7b6ed64febec7af93dc6f5e6bc06cd1dc7626f096677ee092" Jan 22 14:00:39 crc kubenswrapper[4743]: I0122 14:00:39.294575 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7xhnf"] Jan 22 14:00:39 crc kubenswrapper[4743]: I0122 14:00:39.310917 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-v7cnb"] Jan 22 14:00:39 crc kubenswrapper[4743]: I0122 14:00:39.317826 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-v7cnb"] Jan 22 14:00:39 crc kubenswrapper[4743]: I0122 14:00:39.321576 4743 scope.go:117] "RemoveContainer" containerID="aa030e04c439a2333459e9fc99fb1aa4b1249fe0d5d8dd676764fb2ea9ed1e00" Jan 22 14:00:39 crc kubenswrapper[4743]: I0122 14:00:39.337119 4743 scope.go:117] "RemoveContainer" containerID="ac8fd7a91265240154ddb98bd13182a31fdbd415829aadc819f2f11e28eee118" Jan 22 14:00:39 crc kubenswrapper[4743]: E0122 14:00:39.337879 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac8fd7a91265240154ddb98bd13182a31fdbd415829aadc819f2f11e28eee118\": container with ID starting with ac8fd7a91265240154ddb98bd13182a31fdbd415829aadc819f2f11e28eee118 not found: ID does not exist" containerID="ac8fd7a91265240154ddb98bd13182a31fdbd415829aadc819f2f11e28eee118" Jan 22 14:00:39 crc kubenswrapper[4743]: I0122 14:00:39.337913 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac8fd7a91265240154ddb98bd13182a31fdbd415829aadc819f2f11e28eee118"} err="failed to get container status \"ac8fd7a91265240154ddb98bd13182a31fdbd415829aadc819f2f11e28eee118\": rpc error: code = NotFound desc = could not find container \"ac8fd7a91265240154ddb98bd13182a31fdbd415829aadc819f2f11e28eee118\": container with ID starting with ac8fd7a91265240154ddb98bd13182a31fdbd415829aadc819f2f11e28eee118 not found: ID does not exist" Jan 22 14:00:39 crc kubenswrapper[4743]: I0122 14:00:39.337935 4743 scope.go:117] "RemoveContainer" containerID="26c49f3cd81beae7b6ed64febec7af93dc6f5e6bc06cd1dc7626f096677ee092" Jan 22 14:00:39 crc kubenswrapper[4743]: E0122 14:00:39.338299 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26c49f3cd81beae7b6ed64febec7af93dc6f5e6bc06cd1dc7626f096677ee092\": container with ID starting with 26c49f3cd81beae7b6ed64febec7af93dc6f5e6bc06cd1dc7626f096677ee092 not found: ID does not exist" containerID="26c49f3cd81beae7b6ed64febec7af93dc6f5e6bc06cd1dc7626f096677ee092" Jan 22 14:00:39 crc kubenswrapper[4743]: I0122 14:00:39.338341 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26c49f3cd81beae7b6ed64febec7af93dc6f5e6bc06cd1dc7626f096677ee092"} err="failed to get container status \"26c49f3cd81beae7b6ed64febec7af93dc6f5e6bc06cd1dc7626f096677ee092\": rpc error: code = NotFound desc = could not find container \"26c49f3cd81beae7b6ed64febec7af93dc6f5e6bc06cd1dc7626f096677ee092\": container with ID starting with 
26c49f3cd81beae7b6ed64febec7af93dc6f5e6bc06cd1dc7626f096677ee092 not found: ID does not exist" Jan 22 14:00:39 crc kubenswrapper[4743]: I0122 14:00:39.338375 4743 scope.go:117] "RemoveContainer" containerID="aa030e04c439a2333459e9fc99fb1aa4b1249fe0d5d8dd676764fb2ea9ed1e00" Jan 22 14:00:39 crc kubenswrapper[4743]: E0122 14:00:39.338706 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa030e04c439a2333459e9fc99fb1aa4b1249fe0d5d8dd676764fb2ea9ed1e00\": container with ID starting with aa030e04c439a2333459e9fc99fb1aa4b1249fe0d5d8dd676764fb2ea9ed1e00 not found: ID does not exist" containerID="aa030e04c439a2333459e9fc99fb1aa4b1249fe0d5d8dd676764fb2ea9ed1e00" Jan 22 14:00:39 crc kubenswrapper[4743]: I0122 14:00:39.338737 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa030e04c439a2333459e9fc99fb1aa4b1249fe0d5d8dd676764fb2ea9ed1e00"} err="failed to get container status \"aa030e04c439a2333459e9fc99fb1aa4b1249fe0d5d8dd676764fb2ea9ed1e00\": rpc error: code = NotFound desc = could not find container \"aa030e04c439a2333459e9fc99fb1aa4b1249fe0d5d8dd676764fb2ea9ed1e00\": container with ID starting with aa030e04c439a2333459e9fc99fb1aa4b1249fe0d5d8dd676764fb2ea9ed1e00 not found: ID does not exist" Jan 22 14:00:39 crc kubenswrapper[4743]: I0122 14:00:39.338753 4743 scope.go:117] "RemoveContainer" containerID="e3e0a5d2c71efe12e2e20fe4683117ad34d66b85b97edb0fce8c28c0e62ae0ca" Jan 22 14:00:39 crc kubenswrapper[4743]: I0122 14:00:39.353435 4743 scope.go:117] "RemoveContainer" containerID="e285d17b619f28e4901b0c70c7471f51fe328a556568329612faf499c9bb7443" Jan 22 14:00:39 crc kubenswrapper[4743]: I0122 14:00:39.398115 4743 scope.go:117] "RemoveContainer" containerID="7526a94c92a3c5c82310c9066116773422fad30b3ede613d6cfc91dbf7fdd96d" Jan 22 14:00:39 crc kubenswrapper[4743]: I0122 14:00:39.764662 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ae39979-4adc-468d-a4f1-2d544221e822" path="/var/lib/kubelet/pods/4ae39979-4adc-468d-a4f1-2d544221e822/volumes" Jan 22 14:00:39 crc kubenswrapper[4743]: I0122 14:00:39.765994 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1f18e46-7670-4214-b2b5-77e0597b44b8" path="/var/lib/kubelet/pods/f1f18e46-7670-4214-b2b5-77e0597b44b8/volumes" Jan 22 14:00:49 crc kubenswrapper[4743]: I0122 14:00:49.787690 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-69cf5d4557-dn2mv"] Jan 22 14:00:50 crc kubenswrapper[4743]: E0122 14:00:49.788514 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1f18e46-7670-4214-b2b5-77e0597b44b8" containerName="extract-content" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:49.788531 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1f18e46-7670-4214-b2b5-77e0597b44b8" containerName="extract-content" Jan 22 14:00:50 crc kubenswrapper[4743]: E0122 14:00:49.788543 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ae39979-4adc-468d-a4f1-2d544221e822" containerName="extract-content" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:49.788551 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ae39979-4adc-468d-a4f1-2d544221e822" containerName="extract-content" Jan 22 14:00:50 crc kubenswrapper[4743]: E0122 14:00:49.788562 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1f18e46-7670-4214-b2b5-77e0597b44b8" 
containerName="registry-server" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:49.788569 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1f18e46-7670-4214-b2b5-77e0597b44b8" containerName="registry-server" Jan 22 14:00:50 crc kubenswrapper[4743]: E0122 14:00:49.788582 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ae39979-4adc-468d-a4f1-2d544221e822" containerName="registry-server" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:49.788590 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ae39979-4adc-468d-a4f1-2d544221e822" containerName="registry-server" Jan 22 14:00:50 crc kubenswrapper[4743]: E0122 14:00:49.788611 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1f18e46-7670-4214-b2b5-77e0597b44b8" containerName="extract-utilities" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:49.788618 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1f18e46-7670-4214-b2b5-77e0597b44b8" containerName="extract-utilities" Jan 22 14:00:50 crc kubenswrapper[4743]: E0122 14:00:49.788631 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ae39979-4adc-468d-a4f1-2d544221e822" containerName="extract-utilities" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:49.788640 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ae39979-4adc-468d-a4f1-2d544221e822" containerName="extract-utilities" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:49.788781 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ae39979-4adc-468d-a4f1-2d544221e822" containerName="registry-server" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:49.788811 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1f18e46-7670-4214-b2b5-77e0597b44b8" containerName="registry-server" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:49.789312 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-69cf5d4557-dn2mv" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:49.794278 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-w9swj" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:49.800687 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59dd8b7cbf-6kvwx"] Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:49.801451 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59dd8b7cbf-6kvwx" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:49.803411 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-bw5lw" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:49.806470 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-b45d7bf98-w8s2s"] Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:49.808984 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-b45d7bf98-w8s2s" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:49.812518 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-69cf5d4557-dn2mv"] Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:49.813559 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-58vls" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:49.822944 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59dd8b7cbf-6kvwx"] Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:49.832309 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-78fdd796fd-mr8bn"] Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:49.833259 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-78fdd796fd-mr8bn" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:49.837044 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-pmxt4" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:49.848235 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-594c8c9d5d-mxdkr"] Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:49.849254 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-mxdkr" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:49.851261 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-ncgh5" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:49.855780 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-b45d7bf98-w8s2s"] Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:49.863182 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-78fdd796fd-mr8bn"] Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:49.864687 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-594c8c9d5d-mxdkr"] Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:49.885922 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-77d5c5b54f-f7dhp"] Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:49.886657 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-f7dhp" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:49.897145 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-skfk9" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:49.900873 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzzgr\" (UniqueName: \"kubernetes.io/projected/4eb53c43-8c71-4c15-862a-134fa6eb85d6-kube-api-access-mzzgr\") pod \"barbican-operator-controller-manager-59dd8b7cbf-6kvwx\" (UID: \"4eb53c43-8c71-4c15-862a-134fa6eb85d6\") " pod="openstack-operators/barbican-operator-controller-manager-59dd8b7cbf-6kvwx" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:49.900911 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2r69k\" (UniqueName: \"kubernetes.io/projected/aff36600-9c00-4a26-b311-a3d743333b0e-kube-api-access-2r69k\") pod \"designate-operator-controller-manager-b45d7bf98-w8s2s\" (UID: \"aff36600-9c00-4a26-b311-a3d743333b0e\") " pod="openstack-operators/designate-operator-controller-manager-b45d7bf98-w8s2s" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:49.900931 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kdxm\" (UniqueName: \"kubernetes.io/projected/f6b9f418-b721-4fce-881e-791eceb6b0ef-kube-api-access-9kdxm\") pod \"cinder-operator-controller-manager-69cf5d4557-dn2mv\" (UID: \"f6b9f418-b721-4fce-881e-791eceb6b0ef\") " pod="openstack-operators/cinder-operator-controller-manager-69cf5d4557-dn2mv" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:49.904288 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-54ccf4f85d-8xxtr"] Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:49.905404 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-8xxtr" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:49.909323 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:49.909949 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-7q6zn" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:49.924719 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-69d6c9f5b8-nqr46"] Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:49.925716 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-69d6c9f5b8-nqr46" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:49.930668 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-msh6n" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:49.949490 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-77d5c5b54f-f7dhp"] Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:49.966970 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-54ccf4f85d-8xxtr"] Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:49.985333 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-69d6c9f5b8-nqr46"] Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.000812 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b8b6d4659-24n5k"] Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.001613 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b8b6d4659-24n5k" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.003478 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2r69k\" (UniqueName: \"kubernetes.io/projected/aff36600-9c00-4a26-b311-a3d743333b0e-kube-api-access-2r69k\") pod \"designate-operator-controller-manager-b45d7bf98-w8s2s\" (UID: \"aff36600-9c00-4a26-b311-a3d743333b0e\") " pod="openstack-operators/designate-operator-controller-manager-b45d7bf98-w8s2s" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.003528 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kdxm\" (UniqueName: \"kubernetes.io/projected/f6b9f418-b721-4fce-881e-791eceb6b0ef-kube-api-access-9kdxm\") pod \"cinder-operator-controller-manager-69cf5d4557-dn2mv\" (UID: \"f6b9f418-b721-4fce-881e-791eceb6b0ef\") " pod="openstack-operators/cinder-operator-controller-manager-69cf5d4557-dn2mv" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.003574 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7df36228-9543-4bb1-a0a7-d2ca51ac35a5-cert\") pod \"infra-operator-controller-manager-54ccf4f85d-8xxtr\" (UID: \"7df36228-9543-4bb1-a0a7-d2ca51ac35a5\") " pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-8xxtr" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.003609 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4v5l\" (UniqueName: \"kubernetes.io/projected/bad16498-5eda-4791-8577-6cf6ef07ca2a-kube-api-access-p4v5l\") pod \"heat-operator-controller-manager-594c8c9d5d-mxdkr\" (UID: \"bad16498-5eda-4791-8577-6cf6ef07ca2a\") " pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-mxdkr" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.003641 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5j8k\" (UniqueName: \"kubernetes.io/projected/1fcc87bc-de60-44e2-b8b9-88c97eb2aec4-kube-api-access-b5j8k\") pod \"horizon-operator-controller-manager-77d5c5b54f-f7dhp\" (UID: \"1fcc87bc-de60-44e2-b8b9-88c97eb2aec4\") " 
pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-f7dhp" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.003678 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxp8w\" (UniqueName: \"kubernetes.io/projected/7df36228-9543-4bb1-a0a7-d2ca51ac35a5-kube-api-access-rxp8w\") pod \"infra-operator-controller-manager-54ccf4f85d-8xxtr\" (UID: \"7df36228-9543-4bb1-a0a7-d2ca51ac35a5\") " pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-8xxtr" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.003702 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lv2tx\" (UniqueName: \"kubernetes.io/projected/3454a999-851a-47d1-ba12-64f77de4bd6a-kube-api-access-lv2tx\") pod \"glance-operator-controller-manager-78fdd796fd-mr8bn\" (UID: \"3454a999-851a-47d1-ba12-64f77de4bd6a\") " pod="openstack-operators/glance-operator-controller-manager-78fdd796fd-mr8bn" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.003768 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzzgr\" (UniqueName: \"kubernetes.io/projected/4eb53c43-8c71-4c15-862a-134fa6eb85d6-kube-api-access-mzzgr\") pod \"barbican-operator-controller-manager-59dd8b7cbf-6kvwx\" (UID: \"4eb53c43-8c71-4c15-862a-134fa6eb85d6\") " pod="openstack-operators/barbican-operator-controller-manager-59dd8b7cbf-6kvwx" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.042875 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b8b6d4659-24n5k"] Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.061423 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-2t62b" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.089697 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2r69k\" (UniqueName: \"kubernetes.io/projected/aff36600-9c00-4a26-b311-a3d743333b0e-kube-api-access-2r69k\") pod \"designate-operator-controller-manager-b45d7bf98-w8s2s\" (UID: \"aff36600-9c00-4a26-b311-a3d743333b0e\") " pod="openstack-operators/designate-operator-controller-manager-b45d7bf98-w8s2s" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.089703 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kdxm\" (UniqueName: \"kubernetes.io/projected/f6b9f418-b721-4fce-881e-791eceb6b0ef-kube-api-access-9kdxm\") pod \"cinder-operator-controller-manager-69cf5d4557-dn2mv\" (UID: \"f6b9f418-b721-4fce-881e-791eceb6b0ef\") " pod="openstack-operators/cinder-operator-controller-manager-69cf5d4557-dn2mv" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.098511 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzzgr\" (UniqueName: \"kubernetes.io/projected/4eb53c43-8c71-4c15-862a-134fa6eb85d6-kube-api-access-mzzgr\") pod \"barbican-operator-controller-manager-59dd8b7cbf-6kvwx\" (UID: \"4eb53c43-8c71-4c15-862a-134fa6eb85d6\") " pod="openstack-operators/barbican-operator-controller-manager-59dd8b7cbf-6kvwx" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.106850 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llrrs\" (UniqueName: \"kubernetes.io/projected/6221bb17-765b-4d72-8a74-70cdbc3447d9-kube-api-access-llrrs\") 
pod \"ironic-operator-controller-manager-69d6c9f5b8-nqr46\" (UID: \"6221bb17-765b-4d72-8a74-70cdbc3447d9\") " pod="openstack-operators/ironic-operator-controller-manager-69d6c9f5b8-nqr46" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.106904 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8lvf\" (UniqueName: \"kubernetes.io/projected/3e262e2d-6d13-4c04-9826-14ed89dde8ea-kube-api-access-w8lvf\") pod \"keystone-operator-controller-manager-b8b6d4659-24n5k\" (UID: \"3e262e2d-6d13-4c04-9826-14ed89dde8ea\") " pod="openstack-operators/keystone-operator-controller-manager-b8b6d4659-24n5k" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.106932 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7df36228-9543-4bb1-a0a7-d2ca51ac35a5-cert\") pod \"infra-operator-controller-manager-54ccf4f85d-8xxtr\" (UID: \"7df36228-9543-4bb1-a0a7-d2ca51ac35a5\") " pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-8xxtr" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.106958 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4v5l\" (UniqueName: \"kubernetes.io/projected/bad16498-5eda-4791-8577-6cf6ef07ca2a-kube-api-access-p4v5l\") pod \"heat-operator-controller-manager-594c8c9d5d-mxdkr\" (UID: \"bad16498-5eda-4791-8577-6cf6ef07ca2a\") " pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-mxdkr" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.106980 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5j8k\" (UniqueName: \"kubernetes.io/projected/1fcc87bc-de60-44e2-b8b9-88c97eb2aec4-kube-api-access-b5j8k\") pod \"horizon-operator-controller-manager-77d5c5b54f-f7dhp\" (UID: \"1fcc87bc-de60-44e2-b8b9-88c97eb2aec4\") " pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-f7dhp" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.107010 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxp8w\" (UniqueName: \"kubernetes.io/projected/7df36228-9543-4bb1-a0a7-d2ca51ac35a5-kube-api-access-rxp8w\") pod \"infra-operator-controller-manager-54ccf4f85d-8xxtr\" (UID: \"7df36228-9543-4bb1-a0a7-d2ca51ac35a5\") " pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-8xxtr" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.107033 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lv2tx\" (UniqueName: \"kubernetes.io/projected/3454a999-851a-47d1-ba12-64f77de4bd6a-kube-api-access-lv2tx\") pod \"glance-operator-controller-manager-78fdd796fd-mr8bn\" (UID: \"3454a999-851a-47d1-ba12-64f77de4bd6a\") " pod="openstack-operators/glance-operator-controller-manager-78fdd796fd-mr8bn" Jan 22 14:00:50 crc kubenswrapper[4743]: E0122 14:00:50.107251 4743 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 22 14:00:50 crc kubenswrapper[4743]: E0122 14:00:50.107316 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7df36228-9543-4bb1-a0a7-d2ca51ac35a5-cert podName:7df36228-9543-4bb1-a0a7-d2ca51ac35a5 nodeName:}" failed. No retries permitted until 2026-01-22 14:00:50.607297661 +0000 UTC m=+887.162340824 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7df36228-9543-4bb1-a0a7-d2ca51ac35a5-cert") pod "infra-operator-controller-manager-54ccf4f85d-8xxtr" (UID: "7df36228-9543-4bb1-a0a7-d2ca51ac35a5") : secret "infra-operator-webhook-server-cert" not found Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.131543 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-c87fff755-jtwg6"] Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.133908 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-jtwg6" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.140135 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-69cf5d4557-dn2mv" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.141112 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-42kkx" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.149563 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59dd8b7cbf-6kvwx" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.157438 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-78c6999f6f-b77x5"] Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.158278 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-78c6999f6f-b77x5" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.160075 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4v5l\" (UniqueName: \"kubernetes.io/projected/bad16498-5eda-4791-8577-6cf6ef07ca2a-kube-api-access-p4v5l\") pod \"heat-operator-controller-manager-594c8c9d5d-mxdkr\" (UID: \"bad16498-5eda-4791-8577-6cf6ef07ca2a\") " pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-mxdkr" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.160825 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5j8k\" (UniqueName: \"kubernetes.io/projected/1fcc87bc-de60-44e2-b8b9-88c97eb2aec4-kube-api-access-b5j8k\") pod \"horizon-operator-controller-manager-77d5c5b54f-f7dhp\" (UID: \"1fcc87bc-de60-44e2-b8b9-88c97eb2aec4\") " pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-f7dhp" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.171871 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-c87fff755-jtwg6"] Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.173106 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-b45d7bf98-w8s2s" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.173707 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-xwfzs" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.173900 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxp8w\" (UniqueName: \"kubernetes.io/projected/7df36228-9543-4bb1-a0a7-d2ca51ac35a5-kube-api-access-rxp8w\") pod \"infra-operator-controller-manager-54ccf4f85d-8xxtr\" (UID: \"7df36228-9543-4bb1-a0a7-d2ca51ac35a5\") " pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-8xxtr" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.173906 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lv2tx\" (UniqueName: \"kubernetes.io/projected/3454a999-851a-47d1-ba12-64f77de4bd6a-kube-api-access-lv2tx\") pod \"glance-operator-controller-manager-78fdd796fd-mr8bn\" (UID: \"3454a999-851a-47d1-ba12-64f77de4bd6a\") " pod="openstack-operators/glance-operator-controller-manager-78fdd796fd-mr8bn" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.185162 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-78fdd796fd-mr8bn" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.208904 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bggj\" (UniqueName: \"kubernetes.io/projected/96859e4c-bbb4-424b-bc02-2bd6e3b03484-kube-api-access-5bggj\") pod \"manila-operator-controller-manager-78c6999f6f-b77x5\" (UID: \"96859e4c-bbb4-424b-bc02-2bd6e3b03484\") " pod="openstack-operators/manila-operator-controller-manager-78c6999f6f-b77x5" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.208956 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8lvf\" (UniqueName: \"kubernetes.io/projected/3e262e2d-6d13-4c04-9826-14ed89dde8ea-kube-api-access-w8lvf\") pod \"keystone-operator-controller-manager-b8b6d4659-24n5k\" (UID: \"3e262e2d-6d13-4c04-9826-14ed89dde8ea\") " pod="openstack-operators/keystone-operator-controller-manager-b8b6d4659-24n5k" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.209007 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfx6n\" (UniqueName: \"kubernetes.io/projected/924b89fa-b3de-46d6-b9c8-5be5e6d4795c-kube-api-access-dfx6n\") pod \"mariadb-operator-controller-manager-c87fff755-jtwg6\" (UID: \"924b89fa-b3de-46d6-b9c8-5be5e6d4795c\") " pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-jtwg6" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.209115 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llrrs\" (UniqueName: \"kubernetes.io/projected/6221bb17-765b-4d72-8a74-70cdbc3447d9-kube-api-access-llrrs\") pod \"ironic-operator-controller-manager-69d6c9f5b8-nqr46\" (UID: \"6221bb17-765b-4d72-8a74-70cdbc3447d9\") " pod="openstack-operators/ironic-operator-controller-manager-69d6c9f5b8-nqr46" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.209198 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-mxdkr" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.218682 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-78c6999f6f-b77x5"] Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.235582 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-f7dhp" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.252646 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5d8f59fb49-mhq65"] Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.253371 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5d8f59fb49-mhq65" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.263275 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-jbcl9" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.266016 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llrrs\" (UniqueName: \"kubernetes.io/projected/6221bb17-765b-4d72-8a74-70cdbc3447d9-kube-api-access-llrrs\") pod \"ironic-operator-controller-manager-69d6c9f5b8-nqr46\" (UID: \"6221bb17-765b-4d72-8a74-70cdbc3447d9\") " pod="openstack-operators/ironic-operator-controller-manager-69d6c9f5b8-nqr46" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.269755 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-69d6c9f5b8-nqr46" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.289042 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-6b8bc8d87d-mppfg"] Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.289450 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8lvf\" (UniqueName: \"kubernetes.io/projected/3e262e2d-6d13-4c04-9826-14ed89dde8ea-kube-api-access-w8lvf\") pod \"keystone-operator-controller-manager-b8b6d4659-24n5k\" (UID: \"3e262e2d-6d13-4c04-9826-14ed89dde8ea\") " pod="openstack-operators/keystone-operator-controller-manager-b8b6d4659-24n5k" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.291212 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-6b8bc8d87d-mppfg" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.300660 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5d8f59fb49-mhq65"] Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.300954 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-lfjg4" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.312556 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djk6v\" (UniqueName: \"kubernetes.io/projected/064ac5cb-7d15-4502-b174-54236cdd0d51-kube-api-access-djk6v\") pod \"neutron-operator-controller-manager-5d8f59fb49-mhq65\" (UID: \"064ac5cb-7d15-4502-b174-54236cdd0d51\") " pod="openstack-operators/neutron-operator-controller-manager-5d8f59fb49-mhq65" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.312645 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bggj\" (UniqueName: \"kubernetes.io/projected/96859e4c-bbb4-424b-bc02-2bd6e3b03484-kube-api-access-5bggj\") pod \"manila-operator-controller-manager-78c6999f6f-b77x5\" (UID: \"96859e4c-bbb4-424b-bc02-2bd6e3b03484\") " pod="openstack-operators/manila-operator-controller-manager-78c6999f6f-b77x5" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.312698 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfx6n\" (UniqueName: \"kubernetes.io/projected/924b89fa-b3de-46d6-b9c8-5be5e6d4795c-kube-api-access-dfx6n\") pod \"mariadb-operator-controller-manager-c87fff755-jtwg6\" (UID: \"924b89fa-b3de-46d6-b9c8-5be5e6d4795c\") " pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-jtwg6" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.323026 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b8b6d4659-24n5k" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.343952 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7bd9774b6-qfb2q"] Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.344300 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfx6n\" (UniqueName: \"kubernetes.io/projected/924b89fa-b3de-46d6-b9c8-5be5e6d4795c-kube-api-access-dfx6n\") pod \"mariadb-operator-controller-manager-c87fff755-jtwg6\" (UID: \"924b89fa-b3de-46d6-b9c8-5be5e6d4795c\") " pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-jtwg6" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.344949 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7bd9774b6-qfb2q" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.346854 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bggj\" (UniqueName: \"kubernetes.io/projected/96859e4c-bbb4-424b-bc02-2bd6e3b03484-kube-api-access-5bggj\") pod \"manila-operator-controller-manager-78c6999f6f-b77x5\" (UID: \"96859e4c-bbb4-424b-bc02-2bd6e3b03484\") " pod="openstack-operators/manila-operator-controller-manager-78c6999f6f-b77x5" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.355444 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-84tt4" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.388607 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-6b8bc8d87d-mppfg"] Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.410755 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7bd9774b6-qfb2q"] Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.414806 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5s98\" (UniqueName: \"kubernetes.io/projected/d1e52325-1801-4650-86bd-c1eb8f076714-kube-api-access-w5s98\") pod \"nova-operator-controller-manager-6b8bc8d87d-mppfg\" (UID: \"d1e52325-1801-4650-86bd-c1eb8f076714\") " pod="openstack-operators/nova-operator-controller-manager-6b8bc8d87d-mppfg" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.414928 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ctrp\" (UniqueName: \"kubernetes.io/projected/3b59a905-2607-4445-abee-ba43a1bdf41c-kube-api-access-6ctrp\") pod \"octavia-operator-controller-manager-7bd9774b6-qfb2q\" (UID: \"3b59a905-2607-4445-abee-ba43a1bdf41c\") " pod="openstack-operators/octavia-operator-controller-manager-7bd9774b6-qfb2q" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.415000 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djk6v\" (UniqueName: \"kubernetes.io/projected/064ac5cb-7d15-4502-b174-54236cdd0d51-kube-api-access-djk6v\") pod \"neutron-operator-controller-manager-5d8f59fb49-mhq65\" (UID: \"064ac5cb-7d15-4502-b174-54236cdd0d51\") " pod="openstack-operators/neutron-operator-controller-manager-5d8f59fb49-mhq65" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.439956 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-55db956ddc-995fd"] Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.441068 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djk6v\" (UniqueName: \"kubernetes.io/projected/064ac5cb-7d15-4502-b174-54236cdd0d51-kube-api-access-djk6v\") pod \"neutron-operator-controller-manager-5d8f59fb49-mhq65\" (UID: \"064ac5cb-7d15-4502-b174-54236cdd0d51\") " pod="openstack-operators/neutron-operator-controller-manager-5d8f59fb49-mhq65" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.442117 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-995fd" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.445119 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-6tx64" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.455083 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854q5znv"] Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.455997 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854q5znv" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.458443 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-f2qf4" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.458587 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.461345 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5d646b7d76-wsnt4"] Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.462462 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5d646b7d76-wsnt4" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.466169 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-5h6n8" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.469048 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-55db956ddc-995fd"] Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.486087 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5d646b7d76-wsnt4"] Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.512377 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854q5znv"] Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.516217 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwgcs\" (UniqueName: \"kubernetes.io/projected/ee1a4864-faf3-49da-ac1a-ab864c677803-kube-api-access-cwgcs\") pod \"placement-operator-controller-manager-5d646b7d76-wsnt4\" (UID: \"ee1a4864-faf3-49da-ac1a-ab864c677803\") " pod="openstack-operators/placement-operator-controller-manager-5d646b7d76-wsnt4" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.516264 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4a00e91a-fbd8-496e-96e0-4fb25d7841fe-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854q5znv\" (UID: \"4a00e91a-fbd8-496e-96e0-4fb25d7841fe\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854q5znv" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.516292 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ctrp\" (UniqueName: 
\"kubernetes.io/projected/3b59a905-2607-4445-abee-ba43a1bdf41c-kube-api-access-6ctrp\") pod \"octavia-operator-controller-manager-7bd9774b6-qfb2q\" (UID: \"3b59a905-2607-4445-abee-ba43a1bdf41c\") " pod="openstack-operators/octavia-operator-controller-manager-7bd9774b6-qfb2q" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.516351 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khb2m\" (UniqueName: \"kubernetes.io/projected/6b4ae3c8-6f7f-4b76-91c0-3652f86422a6-kube-api-access-khb2m\") pod \"ovn-operator-controller-manager-55db956ddc-995fd\" (UID: \"6b4ae3c8-6f7f-4b76-91c0-3652f86422a6\") " pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-995fd" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.516380 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79kms\" (UniqueName: \"kubernetes.io/projected/4a00e91a-fbd8-496e-96e0-4fb25d7841fe-kube-api-access-79kms\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854q5znv\" (UID: \"4a00e91a-fbd8-496e-96e0-4fb25d7841fe\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854q5znv" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.516406 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5s98\" (UniqueName: \"kubernetes.io/projected/d1e52325-1801-4650-86bd-c1eb8f076714-kube-api-access-w5s98\") pod \"nova-operator-controller-manager-6b8bc8d87d-mppfg\" (UID: \"d1e52325-1801-4650-86bd-c1eb8f076714\") " pod="openstack-operators/nova-operator-controller-manager-6b8bc8d87d-mppfg" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.519964 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-547cbdb99f-nck8q"] Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.520844 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-nck8q" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.526060 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-kdj4n" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.531009 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-85cd9769bb-srnbc"] Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.531840 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-85cd9769bb-srnbc" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.533125 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-6fkps" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.559637 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ctrp\" (UniqueName: \"kubernetes.io/projected/3b59a905-2607-4445-abee-ba43a1bdf41c-kube-api-access-6ctrp\") pod \"octavia-operator-controller-manager-7bd9774b6-qfb2q\" (UID: \"3b59a905-2607-4445-abee-ba43a1bdf41c\") " pod="openstack-operators/octavia-operator-controller-manager-7bd9774b6-qfb2q" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.560501 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5s98\" (UniqueName: \"kubernetes.io/projected/d1e52325-1801-4650-86bd-c1eb8f076714-kube-api-access-w5s98\") pod \"nova-operator-controller-manager-6b8bc8d87d-mppfg\" (UID: \"d1e52325-1801-4650-86bd-c1eb8f076714\") " pod="openstack-operators/nova-operator-controller-manager-6b8bc8d87d-mppfg" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.562608 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-jtwg6" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.562837 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-85cd9769bb-srnbc"] Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.569863 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-547cbdb99f-nck8q"] Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.599398 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-78c6999f6f-b77x5" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.605908 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-69797bbcbd-b9bq2"] Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.606782 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-69797bbcbd-b9bq2" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.608496 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-69797bbcbd-b9bq2"] Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.609853 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-xxzkp" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.621546 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7df36228-9543-4bb1-a0a7-d2ca51ac35a5-cert\") pod \"infra-operator-controller-manager-54ccf4f85d-8xxtr\" (UID: \"7df36228-9543-4bb1-a0a7-d2ca51ac35a5\") " pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-8xxtr" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.621584 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwgcs\" (UniqueName: \"kubernetes.io/projected/ee1a4864-faf3-49da-ac1a-ab864c677803-kube-api-access-cwgcs\") pod \"placement-operator-controller-manager-5d646b7d76-wsnt4\" (UID: \"ee1a4864-faf3-49da-ac1a-ab864c677803\") " pod="openstack-operators/placement-operator-controller-manager-5d646b7d76-wsnt4" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.621631 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4a00e91a-fbd8-496e-96e0-4fb25d7841fe-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854q5znv\" (UID: \"4a00e91a-fbd8-496e-96e0-4fb25d7841fe\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854q5znv" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.621656 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8wws\" (UniqueName: \"kubernetes.io/projected/9d13aa32-eef2-427d-9398-507957b4c81c-kube-api-access-s8wws\") pod \"telemetry-operator-controller-manager-85cd9769bb-srnbc\" (UID: \"9d13aa32-eef2-427d-9398-507957b4c81c\") " pod="openstack-operators/telemetry-operator-controller-manager-85cd9769bb-srnbc" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.621713 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4gxk\" (UniqueName: \"kubernetes.io/projected/cf7c633b-f013-4be6-a794-888a816a2ec2-kube-api-access-s4gxk\") pod \"swift-operator-controller-manager-547cbdb99f-nck8q\" (UID: \"cf7c633b-f013-4be6-a794-888a816a2ec2\") " pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-nck8q" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.621730 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khb2m\" (UniqueName: \"kubernetes.io/projected/6b4ae3c8-6f7f-4b76-91c0-3652f86422a6-kube-api-access-khb2m\") pod \"ovn-operator-controller-manager-55db956ddc-995fd\" (UID: \"6b4ae3c8-6f7f-4b76-91c0-3652f86422a6\") " pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-995fd" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.621757 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79kms\" (UniqueName: \"kubernetes.io/projected/4a00e91a-fbd8-496e-96e0-4fb25d7841fe-kube-api-access-79kms\") pod 
\"openstack-baremetal-operator-controller-manager-6b68b8b854q5znv\" (UID: \"4a00e91a-fbd8-496e-96e0-4fb25d7841fe\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854q5znv" Jan 22 14:00:50 crc kubenswrapper[4743]: E0122 14:00:50.622146 4743 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 22 14:00:50 crc kubenswrapper[4743]: E0122 14:00:50.622224 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a00e91a-fbd8-496e-96e0-4fb25d7841fe-cert podName:4a00e91a-fbd8-496e-96e0-4fb25d7841fe nodeName:}" failed. No retries permitted until 2026-01-22 14:00:51.122199828 +0000 UTC m=+887.677242991 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4a00e91a-fbd8-496e-96e0-4fb25d7841fe-cert") pod "openstack-baremetal-operator-controller-manager-6b68b8b854q5znv" (UID: "4a00e91a-fbd8-496e-96e0-4fb25d7841fe") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 22 14:00:50 crc kubenswrapper[4743]: E0122 14:00:50.622371 4743 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 22 14:00:50 crc kubenswrapper[4743]: E0122 14:00:50.622398 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7df36228-9543-4bb1-a0a7-d2ca51ac35a5-cert podName:7df36228-9543-4bb1-a0a7-d2ca51ac35a5 nodeName:}" failed. No retries permitted until 2026-01-22 14:00:51.622388713 +0000 UTC m=+888.177431876 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7df36228-9543-4bb1-a0a7-d2ca51ac35a5-cert") pod "infra-operator-controller-manager-54ccf4f85d-8xxtr" (UID: "7df36228-9543-4bb1-a0a7-d2ca51ac35a5") : secret "infra-operator-webhook-server-cert" not found Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.633168 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5d8f59fb49-mhq65" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.639152 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5ffb9c6597-w4cch"] Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.641030 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5ffb9c6597-w4cch" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.645545 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-2b7q7" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.651758 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79kms\" (UniqueName: \"kubernetes.io/projected/4a00e91a-fbd8-496e-96e0-4fb25d7841fe-kube-api-access-79kms\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854q5znv\" (UID: \"4a00e91a-fbd8-496e-96e0-4fb25d7841fe\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854q5znv" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.655943 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khb2m\" (UniqueName: \"kubernetes.io/projected/6b4ae3c8-6f7f-4b76-91c0-3652f86422a6-kube-api-access-khb2m\") pod \"ovn-operator-controller-manager-55db956ddc-995fd\" (UID: \"6b4ae3c8-6f7f-4b76-91c0-3652f86422a6\") " pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-995fd" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.656890 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwgcs\" (UniqueName: \"kubernetes.io/projected/ee1a4864-faf3-49da-ac1a-ab864c677803-kube-api-access-cwgcs\") pod \"placement-operator-controller-manager-5d646b7d76-wsnt4\" (UID: \"ee1a4864-faf3-49da-ac1a-ab864c677803\") " pod="openstack-operators/placement-operator-controller-manager-5d646b7d76-wsnt4" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.676292 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-6b8bc8d87d-mppfg" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.681319 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7bd9774b6-qfb2q" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.707332 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5ffb9c6597-w4cch"] Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.723141 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8wws\" (UniqueName: \"kubernetes.io/projected/9d13aa32-eef2-427d-9398-507957b4c81c-kube-api-access-s8wws\") pod \"telemetry-operator-controller-manager-85cd9769bb-srnbc\" (UID: \"9d13aa32-eef2-427d-9398-507957b4c81c\") " pod="openstack-operators/telemetry-operator-controller-manager-85cd9769bb-srnbc" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.723626 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4gxk\" (UniqueName: \"kubernetes.io/projected/cf7c633b-f013-4be6-a794-888a816a2ec2-kube-api-access-s4gxk\") pod \"swift-operator-controller-manager-547cbdb99f-nck8q\" (UID: \"cf7c633b-f013-4be6-a794-888a816a2ec2\") " pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-nck8q" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.723715 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrt9h\" (UniqueName: \"kubernetes.io/projected/1cd779f7-75c7-4a5b-82f1-15a26703ed29-kube-api-access-qrt9h\") pod \"watcher-operator-controller-manager-5ffb9c6597-w4cch\" (UID: \"1cd779f7-75c7-4a5b-82f1-15a26703ed29\") " pod="openstack-operators/watcher-operator-controller-manager-5ffb9c6597-w4cch" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.723769 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbdjc\" (UniqueName: \"kubernetes.io/projected/d70fab64-d6ec-42ab-93ef-e882fc4d3f84-kube-api-access-kbdjc\") pod \"test-operator-controller-manager-69797bbcbd-b9bq2\" (UID: \"d70fab64-d6ec-42ab-93ef-e882fc4d3f84\") " pod="openstack-operators/test-operator-controller-manager-69797bbcbd-b9bq2" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.724186 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-cdc5d4c7b-hk8dd"] Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.724989 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-cdc5d4c7b-hk8dd" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.729346 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.729408 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-hvsxp" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.735181 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.735648 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-cdc5d4c7b-hk8dd"] Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.744110 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8wws\" (UniqueName: \"kubernetes.io/projected/9d13aa32-eef2-427d-9398-507957b4c81c-kube-api-access-s8wws\") pod \"telemetry-operator-controller-manager-85cd9769bb-srnbc\" (UID: \"9d13aa32-eef2-427d-9398-507957b4c81c\") " pod="openstack-operators/telemetry-operator-controller-manager-85cd9769bb-srnbc" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.746375 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jzmrn"] Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.747292 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jzmrn" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.748133 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4gxk\" (UniqueName: \"kubernetes.io/projected/cf7c633b-f013-4be6-a794-888a816a2ec2-kube-api-access-s4gxk\") pod \"swift-operator-controller-manager-547cbdb99f-nck8q\" (UID: \"cf7c633b-f013-4be6-a794-888a816a2ec2\") " pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-nck8q" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.748768 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-jpbbz" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.785202 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jzmrn"] Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.785862 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-995fd" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.804577 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5d646b7d76-wsnt4" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.825782 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0855131d-976e-4cb5-83bb-9e47417d78f5-metrics-certs\") pod \"openstack-operator-controller-manager-cdc5d4c7b-hk8dd\" (UID: \"0855131d-976e-4cb5-83bb-9e47417d78f5\") " pod="openstack-operators/openstack-operator-controller-manager-cdc5d4c7b-hk8dd" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.825869 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0855131d-976e-4cb5-83bb-9e47417d78f5-webhook-certs\") pod \"openstack-operator-controller-manager-cdc5d4c7b-hk8dd\" (UID: \"0855131d-976e-4cb5-83bb-9e47417d78f5\") " pod="openstack-operators/openstack-operator-controller-manager-cdc5d4c7b-hk8dd" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.825919 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k427s\" (UniqueName: \"kubernetes.io/projected/36be36d1-45a3-4e18-ba83-e4ae61363409-kube-api-access-k427s\") pod \"rabbitmq-cluster-operator-manager-668c99d594-jzmrn\" (UID: \"36be36d1-45a3-4e18-ba83-e4ae61363409\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jzmrn" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.825953 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrt9h\" (UniqueName: \"kubernetes.io/projected/1cd779f7-75c7-4a5b-82f1-15a26703ed29-kube-api-access-qrt9h\") pod \"watcher-operator-controller-manager-5ffb9c6597-w4cch\" (UID: \"1cd779f7-75c7-4a5b-82f1-15a26703ed29\") " pod="openstack-operators/watcher-operator-controller-manager-5ffb9c6597-w4cch" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.826024 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbdjc\" (UniqueName: \"kubernetes.io/projected/d70fab64-d6ec-42ab-93ef-e882fc4d3f84-kube-api-access-kbdjc\") pod \"test-operator-controller-manager-69797bbcbd-b9bq2\" (UID: \"d70fab64-d6ec-42ab-93ef-e882fc4d3f84\") " pod="openstack-operators/test-operator-controller-manager-69797bbcbd-b9bq2" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.826050 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpxhf\" (UniqueName: \"kubernetes.io/projected/0855131d-976e-4cb5-83bb-9e47417d78f5-kube-api-access-zpxhf\") pod \"openstack-operator-controller-manager-cdc5d4c7b-hk8dd\" (UID: \"0855131d-976e-4cb5-83bb-9e47417d78f5\") " pod="openstack-operators/openstack-operator-controller-manager-cdc5d4c7b-hk8dd" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.854242 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrt9h\" (UniqueName: \"kubernetes.io/projected/1cd779f7-75c7-4a5b-82f1-15a26703ed29-kube-api-access-qrt9h\") pod \"watcher-operator-controller-manager-5ffb9c6597-w4cch\" (UID: \"1cd779f7-75c7-4a5b-82f1-15a26703ed29\") " pod="openstack-operators/watcher-operator-controller-manager-5ffb9c6597-w4cch" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.859835 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbdjc\" (UniqueName: 
\"kubernetes.io/projected/d70fab64-d6ec-42ab-93ef-e882fc4d3f84-kube-api-access-kbdjc\") pod \"test-operator-controller-manager-69797bbcbd-b9bq2\" (UID: \"d70fab64-d6ec-42ab-93ef-e882fc4d3f84\") " pod="openstack-operators/test-operator-controller-manager-69797bbcbd-b9bq2" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.865964 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-nck8q" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.889519 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-85cd9769bb-srnbc" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.914732 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-78fdd796fd-mr8bn"] Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.926961 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0855131d-976e-4cb5-83bb-9e47417d78f5-metrics-certs\") pod \"openstack-operator-controller-manager-cdc5d4c7b-hk8dd\" (UID: \"0855131d-976e-4cb5-83bb-9e47417d78f5\") " pod="openstack-operators/openstack-operator-controller-manager-cdc5d4c7b-hk8dd" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.927086 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0855131d-976e-4cb5-83bb-9e47417d78f5-webhook-certs\") pod \"openstack-operator-controller-manager-cdc5d4c7b-hk8dd\" (UID: \"0855131d-976e-4cb5-83bb-9e47417d78f5\") " pod="openstack-operators/openstack-operator-controller-manager-cdc5d4c7b-hk8dd" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.927184 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k427s\" (UniqueName: \"kubernetes.io/projected/36be36d1-45a3-4e18-ba83-e4ae61363409-kube-api-access-k427s\") pod \"rabbitmq-cluster-operator-manager-668c99d594-jzmrn\" (UID: \"36be36d1-45a3-4e18-ba83-e4ae61363409\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jzmrn" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.927270 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpxhf\" (UniqueName: \"kubernetes.io/projected/0855131d-976e-4cb5-83bb-9e47417d78f5-kube-api-access-zpxhf\") pod \"openstack-operator-controller-manager-cdc5d4c7b-hk8dd\" (UID: \"0855131d-976e-4cb5-83bb-9e47417d78f5\") " pod="openstack-operators/openstack-operator-controller-manager-cdc5d4c7b-hk8dd" Jan 22 14:00:50 crc kubenswrapper[4743]: E0122 14:00:50.927994 4743 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 22 14:00:50 crc kubenswrapper[4743]: E0122 14:00:50.929195 4743 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 22 14:00:50 crc kubenswrapper[4743]: E0122 14:00:50.941039 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0855131d-976e-4cb5-83bb-9e47417d78f5-metrics-certs podName:0855131d-976e-4cb5-83bb-9e47417d78f5 nodeName:}" failed. No retries permitted until 2026-01-22 14:00:51.441008778 +0000 UTC m=+887.996051941 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0855131d-976e-4cb5-83bb-9e47417d78f5-metrics-certs") pod "openstack-operator-controller-manager-cdc5d4c7b-hk8dd" (UID: "0855131d-976e-4cb5-83bb-9e47417d78f5") : secret "metrics-server-cert" not found Jan 22 14:00:50 crc kubenswrapper[4743]: E0122 14:00:50.941097 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0855131d-976e-4cb5-83bb-9e47417d78f5-webhook-certs podName:0855131d-976e-4cb5-83bb-9e47417d78f5 nodeName:}" failed. No retries permitted until 2026-01-22 14:00:51.441067019 +0000 UTC m=+887.996110182 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/0855131d-976e-4cb5-83bb-9e47417d78f5-webhook-certs") pod "openstack-operator-controller-manager-cdc5d4c7b-hk8dd" (UID: "0855131d-976e-4cb5-83bb-9e47417d78f5") : secret "webhook-server-cert" not found Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.949251 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpxhf\" (UniqueName: \"kubernetes.io/projected/0855131d-976e-4cb5-83bb-9e47417d78f5-kube-api-access-zpxhf\") pod \"openstack-operator-controller-manager-cdc5d4c7b-hk8dd\" (UID: \"0855131d-976e-4cb5-83bb-9e47417d78f5\") " pod="openstack-operators/openstack-operator-controller-manager-cdc5d4c7b-hk8dd" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.952465 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k427s\" (UniqueName: \"kubernetes.io/projected/36be36d1-45a3-4e18-ba83-e4ae61363409-kube-api-access-k427s\") pod \"rabbitmq-cluster-operator-manager-668c99d594-jzmrn\" (UID: \"36be36d1-45a3-4e18-ba83-e4ae61363409\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jzmrn" Jan 22 14:00:50 crc kubenswrapper[4743]: I0122 14:00:50.978248 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-69797bbcbd-b9bq2" Jan 22 14:00:51 crc kubenswrapper[4743]: I0122 14:00:51.045222 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5ffb9c6597-w4cch" Jan 22 14:00:51 crc kubenswrapper[4743]: I0122 14:00:51.071752 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-69cf5d4557-dn2mv"] Jan 22 14:00:51 crc kubenswrapper[4743]: W0122 14:00:51.082952 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6b9f418_b721_4fce_881e_791eceb6b0ef.slice/crio-3a8433178948dd87a9204925ed5470a2dc60330f4c307cf5adddabf161b5829b WatchSource:0}: Error finding container 3a8433178948dd87a9204925ed5470a2dc60330f4c307cf5adddabf161b5829b: Status 404 returned error can't find the container with id 3a8433178948dd87a9204925ed5470a2dc60330f4c307cf5adddabf161b5829b Jan 22 14:00:51 crc kubenswrapper[4743]: I0122 14:00:51.104859 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jzmrn" Jan 22 14:00:51 crc kubenswrapper[4743]: I0122 14:00:51.126923 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59dd8b7cbf-6kvwx"] Jan 22 14:00:51 crc kubenswrapper[4743]: I0122 14:00:51.145636 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-b45d7bf98-w8s2s"] Jan 22 14:00:51 crc kubenswrapper[4743]: I0122 14:00:51.145934 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4a00e91a-fbd8-496e-96e0-4fb25d7841fe-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854q5znv\" (UID: \"4a00e91a-fbd8-496e-96e0-4fb25d7841fe\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854q5znv" Jan 22 14:00:51 crc kubenswrapper[4743]: E0122 14:00:51.146053 4743 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 22 14:00:51 crc kubenswrapper[4743]: E0122 14:00:51.146097 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a00e91a-fbd8-496e-96e0-4fb25d7841fe-cert podName:4a00e91a-fbd8-496e-96e0-4fb25d7841fe nodeName:}" failed. No retries permitted until 2026-01-22 14:00:52.146083345 +0000 UTC m=+888.701126508 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4a00e91a-fbd8-496e-96e0-4fb25d7841fe-cert") pod "openstack-baremetal-operator-controller-manager-6b68b8b854q5znv" (UID: "4a00e91a-fbd8-496e-96e0-4fb25d7841fe") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 22 14:00:51 crc kubenswrapper[4743]: I0122 14:00:51.258635 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b8b6d4659-24n5k"] Jan 22 14:00:51 crc kubenswrapper[4743]: I0122 14:00:51.269734 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-594c8c9d5d-mxdkr"] Jan 22 14:00:51 crc kubenswrapper[4743]: I0122 14:00:51.278882 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-77d5c5b54f-f7dhp"] Jan 22 14:00:51 crc kubenswrapper[4743]: W0122 14:00:51.282708 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e262e2d_6d13_4c04_9826_14ed89dde8ea.slice/crio-d00cd392a7abb02b2b7ab07355737d3b106e8768b66f4f2d08d8f18468f29629 WatchSource:0}: Error finding container d00cd392a7abb02b2b7ab07355737d3b106e8768b66f4f2d08d8f18468f29629: Status 404 returned error can't find the container with id d00cd392a7abb02b2b7ab07355737d3b106e8768b66f4f2d08d8f18468f29629 Jan 22 14:00:51 crc kubenswrapper[4743]: I0122 14:00:51.285700 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-69d6c9f5b8-nqr46"] Jan 22 14:00:51 crc kubenswrapper[4743]: I0122 14:00:51.369273 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-b45d7bf98-w8s2s" event={"ID":"aff36600-9c00-4a26-b311-a3d743333b0e","Type":"ContainerStarted","Data":"c2c0d7843179a0bb6dffbb227a203b0346fd2d6c1d54aacd3411bc431a50036e"} Jan 22 
14:00:51 crc kubenswrapper[4743]: I0122 14:00:51.375166 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-mxdkr" event={"ID":"bad16498-5eda-4791-8577-6cf6ef07ca2a","Type":"ContainerStarted","Data":"3c814fd6d8b94beebf8327ba2e35eddb3031433d7c430be194be711275a08fc0"} Jan 22 14:00:51 crc kubenswrapper[4743]: I0122 14:00:51.376271 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-78fdd796fd-mr8bn" event={"ID":"3454a999-851a-47d1-ba12-64f77de4bd6a","Type":"ContainerStarted","Data":"25b45509d53199d6f4d374f02ebaae2bd08492fcb0f6c06352a3032714b8865b"} Jan 22 14:00:51 crc kubenswrapper[4743]: I0122 14:00:51.377378 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-69d6c9f5b8-nqr46" event={"ID":"6221bb17-765b-4d72-8a74-70cdbc3447d9","Type":"ContainerStarted","Data":"1d4fc1176c04e19d3ab75ea00a309460dab3786d74c07ccbf02aa60a7262886e"} Jan 22 14:00:51 crc kubenswrapper[4743]: I0122 14:00:51.378666 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59dd8b7cbf-6kvwx" event={"ID":"4eb53c43-8c71-4c15-862a-134fa6eb85d6","Type":"ContainerStarted","Data":"8157447014f9713be4f46941aeeb5b1460e749ef7ebb65a61a8dcc4429d649d8"} Jan 22 14:00:51 crc kubenswrapper[4743]: I0122 14:00:51.379691 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b8b6d4659-24n5k" event={"ID":"3e262e2d-6d13-4c04-9826-14ed89dde8ea","Type":"ContainerStarted","Data":"d00cd392a7abb02b2b7ab07355737d3b106e8768b66f4f2d08d8f18468f29629"} Jan 22 14:00:51 crc kubenswrapper[4743]: I0122 14:00:51.381162 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-69cf5d4557-dn2mv" event={"ID":"f6b9f418-b721-4fce-881e-791eceb6b0ef","Type":"ContainerStarted","Data":"3a8433178948dd87a9204925ed5470a2dc60330f4c307cf5adddabf161b5829b"} Jan 22 14:00:51 crc kubenswrapper[4743]: I0122 14:00:51.382099 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-f7dhp" event={"ID":"1fcc87bc-de60-44e2-b8b9-88c97eb2aec4","Type":"ContainerStarted","Data":"f11063bc91dbf29c4a21c9979d06930313739924096c7450f10180876e6bd45f"} Jan 22 14:00:51 crc kubenswrapper[4743]: I0122 14:00:51.449217 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0855131d-976e-4cb5-83bb-9e47417d78f5-metrics-certs\") pod \"openstack-operator-controller-manager-cdc5d4c7b-hk8dd\" (UID: \"0855131d-976e-4cb5-83bb-9e47417d78f5\") " pod="openstack-operators/openstack-operator-controller-manager-cdc5d4c7b-hk8dd" Jan 22 14:00:51 crc kubenswrapper[4743]: I0122 14:00:51.449273 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0855131d-976e-4cb5-83bb-9e47417d78f5-webhook-certs\") pod \"openstack-operator-controller-manager-cdc5d4c7b-hk8dd\" (UID: \"0855131d-976e-4cb5-83bb-9e47417d78f5\") " pod="openstack-operators/openstack-operator-controller-manager-cdc5d4c7b-hk8dd" Jan 22 14:00:51 crc kubenswrapper[4743]: E0122 14:00:51.449447 4743 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 22 14:00:51 crc kubenswrapper[4743]: E0122 
14:00:51.449503 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0855131d-976e-4cb5-83bb-9e47417d78f5-webhook-certs podName:0855131d-976e-4cb5-83bb-9e47417d78f5 nodeName:}" failed. No retries permitted until 2026-01-22 14:00:52.449483033 +0000 UTC m=+889.004526196 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/0855131d-976e-4cb5-83bb-9e47417d78f5-webhook-certs") pod "openstack-operator-controller-manager-cdc5d4c7b-hk8dd" (UID: "0855131d-976e-4cb5-83bb-9e47417d78f5") : secret "webhook-server-cert" not found Jan 22 14:00:51 crc kubenswrapper[4743]: E0122 14:00:51.449891 4743 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 22 14:00:51 crc kubenswrapper[4743]: E0122 14:00:51.449929 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0855131d-976e-4cb5-83bb-9e47417d78f5-metrics-certs podName:0855131d-976e-4cb5-83bb-9e47417d78f5 nodeName:}" failed. No retries permitted until 2026-01-22 14:00:52.449919114 +0000 UTC m=+889.004962277 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0855131d-976e-4cb5-83bb-9e47417d78f5-metrics-certs") pod "openstack-operator-controller-manager-cdc5d4c7b-hk8dd" (UID: "0855131d-976e-4cb5-83bb-9e47417d78f5") : secret "metrics-server-cert" not found Jan 22 14:00:51 crc kubenswrapper[4743]: I0122 14:00:51.468480 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7bd9774b6-qfb2q"] Jan 22 14:00:51 crc kubenswrapper[4743]: W0122 14:00:51.493486 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96859e4c_bbb4_424b_bc02_2bd6e3b03484.slice/crio-3e0c66f5974048a4699a839dc193c3944b07c4e142198eed10ce1c6d74b6385b WatchSource:0}: Error finding container 3e0c66f5974048a4699a839dc193c3944b07c4e142198eed10ce1c6d74b6385b: Status 404 returned error can't find the container with id 3e0c66f5974048a4699a839dc193c3944b07c4e142198eed10ce1c6d74b6385b Jan 22 14:00:51 crc kubenswrapper[4743]: I0122 14:00:51.509816 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-c87fff755-jtwg6"] Jan 22 14:00:51 crc kubenswrapper[4743]: I0122 14:00:51.516172 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5d8f59fb49-mhq65"] Jan 22 14:00:51 crc kubenswrapper[4743]: I0122 14:00:51.539163 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-78c6999f6f-b77x5"] Jan 22 14:00:51 crc kubenswrapper[4743]: I0122 14:00:51.629875 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-55db956ddc-995fd"] Jan 22 14:00:51 crc kubenswrapper[4743]: I0122 14:00:51.636191 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-6b8bc8d87d-mppfg"] Jan 22 14:00:51 crc kubenswrapper[4743]: W0122 14:00:51.637994 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b4ae3c8_6f7f_4b76_91c0_3652f86422a6.slice/crio-7cd075a32608fce0e5fefb1b16275f42089540275e5ae19ec99bbaa782268cc8 WatchSource:0}: Error finding container 
7cd075a32608fce0e5fefb1b16275f42089540275e5ae19ec99bbaa782268cc8: Status 404 returned error can't find the container with id 7cd075a32608fce0e5fefb1b16275f42089540275e5ae19ec99bbaa782268cc8 Jan 22 14:00:51 crc kubenswrapper[4743]: E0122 14:00:51.652641 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:e02722d7581bfe1c5fc13e2fa6811d8665102ba86635c77547abf6b933cde127,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-s8wws,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-85cd9769bb-srnbc_openstack-operators(9d13aa32-eef2-427d-9398-507957b4c81c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 22 14:00:51 crc kubenswrapper[4743]: E0122 14:00:51.652851 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:65cfe5b9d5b0571aaf8ff9840b12cc56e90ca4cef162dd260c3a9fa2b52c6dd0,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: 
{{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-cwgcs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-5d646b7d76-wsnt4_openstack-operators(ee1a4864-faf3-49da-ac1a-ab864c677803): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 22 14:00:51 crc kubenswrapper[4743]: E0122 14:00:51.652984 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:8b3bfb9e86618b7ac69443939b0968fae28a22cd62ea1e429b599ff9f8a5f8cf,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-khb2m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-55db956ddc-995fd_openstack-operators(6b4ae3c8-6f7f-4b76-91c0-3652f86422a6): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 22 14:00:51 crc kubenswrapper[4743]: I0122 14:00:51.654052 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7df36228-9543-4bb1-a0a7-d2ca51ac35a5-cert\") pod \"infra-operator-controller-manager-54ccf4f85d-8xxtr\" (UID: \"7df36228-9543-4bb1-a0a7-d2ca51ac35a5\") " pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-8xxtr" Jan 22 14:00:51 crc kubenswrapper[4743]: E0122 14:00:51.655952 4743 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 22 14:00:51 crc kubenswrapper[4743]: E0122 14:00:51.656016 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-995fd" podUID="6b4ae3c8-6f7f-4b76-91c0-3652f86422a6" Jan 22 14:00:51 crc kubenswrapper[4743]: E0122 14:00:51.658524 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7df36228-9543-4bb1-a0a7-d2ca51ac35a5-cert podName:7df36228-9543-4bb1-a0a7-d2ca51ac35a5 nodeName:}" failed. No retries permitted until 2026-01-22 14:00:53.658491865 +0000 UTC m=+890.213535018 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7df36228-9543-4bb1-a0a7-d2ca51ac35a5-cert") pod "infra-operator-controller-manager-54ccf4f85d-8xxtr" (UID: "7df36228-9543-4bb1-a0a7-d2ca51ac35a5") : secret "infra-operator-webhook-server-cert" not found Jan 22 14:00:51 crc kubenswrapper[4743]: E0122 14:00:51.656050 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-5d646b7d76-wsnt4" podUID="ee1a4864-faf3-49da-ac1a-ab864c677803" Jan 22 14:00:51 crc kubenswrapper[4743]: E0122 14:00:51.656038 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-85cd9769bb-srnbc" podUID="9d13aa32-eef2-427d-9398-507957b4c81c" Jan 22 14:00:51 crc kubenswrapper[4743]: I0122 14:00:51.661219 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-85cd9769bb-srnbc"] Jan 22 14:00:51 crc kubenswrapper[4743]: I0122 14:00:51.668161 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5d646b7d76-wsnt4"] Jan 22 14:00:51 crc kubenswrapper[4743]: W0122 14:00:51.776140 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf7c633b_f013_4be6_a794_888a816a2ec2.slice/crio-e2fb8c2ee5b643d7b363765f34b311117bfec0d3c5878cb2a5e8a7f828c10fa8 WatchSource:0}: Error finding container e2fb8c2ee5b643d7b363765f34b311117bfec0d3c5878cb2a5e8a7f828c10fa8: Status 404 returned error can't find the container with id e2fb8c2ee5b643d7b363765f34b311117bfec0d3c5878cb2a5e8a7f828c10fa8 Jan 22 14:00:51 crc kubenswrapper[4743]: E0122 14:00:51.777765 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:445e951df2f21df6d33a466f75917e0f6103052ae751ae11887136e8ab165922,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-s4gxk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-547cbdb99f-nck8q_openstack-operators(cf7c633b-f013-4be6-a794-888a816a2ec2): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 22 14:00:51 crc kubenswrapper[4743]: E0122 14:00:51.778938 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-nck8q" podUID="cf7c633b-f013-4be6-a794-888a816a2ec2" Jan 22 14:00:51 crc kubenswrapper[4743]: E0122 14:00:51.779699 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:2d6d13b3c28e45c6bec980b8808dda8da4723ae87e66d04f53d52c3b3c51612b,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qrt9h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-5ffb9c6597-w4cch_openstack-operators(1cd779f7-75c7-4a5b-82f1-15a26703ed29): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 22 14:00:51 crc kubenswrapper[4743]: E0122 14:00:51.780844 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-5ffb9c6597-w4cch" podUID="1cd779f7-75c7-4a5b-82f1-15a26703ed29" Jan 22 14:00:51 crc kubenswrapper[4743]: E0122 14:00:51.784149 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:c8dde42dafd41026ed2e4cfc26efc0fff63c4ba9d31326ae7dc644ccceaafa9d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kbdjc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-69797bbcbd-b9bq2_openstack-operators(d70fab64-d6ec-42ab-93ef-e882fc4d3f84): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 22 14:00:51 crc kubenswrapper[4743]: E0122 14:00:51.785343 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-69797bbcbd-b9bq2" podUID="d70fab64-d6ec-42ab-93ef-e882fc4d3f84" Jan 22 14:00:51 crc kubenswrapper[4743]: I0122 14:00:51.787197 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-547cbdb99f-nck8q"] Jan 22 14:00:51 crc kubenswrapper[4743]: I0122 14:00:51.787447 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-69797bbcbd-b9bq2"] Jan 22 14:00:51 crc kubenswrapper[4743]: I0122 14:00:51.792222 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5ffb9c6597-w4cch"] Jan 22 14:00:51 crc kubenswrapper[4743]: I0122 14:00:51.822911 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jzmrn"] Jan 22 14:00:51 crc kubenswrapper[4743]: W0122 14:00:51.828340 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36be36d1_45a3_4e18_ba83_e4ae61363409.slice/crio-f766d7ea8e1d98ce56420f002fc2d7bbe826b0721b23297d429cf40398782217 WatchSource:0}: Error finding container f766d7ea8e1d98ce56420f002fc2d7bbe826b0721b23297d429cf40398782217: Status 404 returned error can't find the container with id f766d7ea8e1d98ce56420f002fc2d7bbe826b0721b23297d429cf40398782217 Jan 22 14:00:52 crc kubenswrapper[4743]: I0122 14:00:52.182726 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4a00e91a-fbd8-496e-96e0-4fb25d7841fe-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854q5znv\" (UID: \"4a00e91a-fbd8-496e-96e0-4fb25d7841fe\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854q5znv" Jan 22 14:00:52 crc kubenswrapper[4743]: E0122 14:00:52.182960 4743 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 22 14:00:52 crc kubenswrapper[4743]: E0122 14:00:52.183037 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a00e91a-fbd8-496e-96e0-4fb25d7841fe-cert podName:4a00e91a-fbd8-496e-96e0-4fb25d7841fe 
nodeName:}" failed. No retries permitted until 2026-01-22 14:00:54.183017609 +0000 UTC m=+890.738060772 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4a00e91a-fbd8-496e-96e0-4fb25d7841fe-cert") pod "openstack-baremetal-operator-controller-manager-6b68b8b854q5znv" (UID: "4a00e91a-fbd8-496e-96e0-4fb25d7841fe") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 22 14:00:52 crc kubenswrapper[4743]: I0122 14:00:52.392872 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5d8f59fb49-mhq65" event={"ID":"064ac5cb-7d15-4502-b174-54236cdd0d51","Type":"ContainerStarted","Data":"738d314a80e47460218856e6eab58504530886887bc345b166470e9ab419c340"} Jan 22 14:00:52 crc kubenswrapper[4743]: I0122 14:00:52.395569 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-69797bbcbd-b9bq2" event={"ID":"d70fab64-d6ec-42ab-93ef-e882fc4d3f84","Type":"ContainerStarted","Data":"a17838a9ebf1997b2d86734db9ef4b6c31b30fe8a64ff47e554e0fd08070b77f"} Jan 22 14:00:52 crc kubenswrapper[4743]: I0122 14:00:52.397933 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5d646b7d76-wsnt4" event={"ID":"ee1a4864-faf3-49da-ac1a-ab864c677803","Type":"ContainerStarted","Data":"88798e488c412718166661c61500367cdea334970c56ef9e20d7b7a99d2f1c6c"} Jan 22 14:00:52 crc kubenswrapper[4743]: E0122 14:00:52.399619 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:c8dde42dafd41026ed2e4cfc26efc0fff63c4ba9d31326ae7dc644ccceaafa9d\\\"\"" pod="openstack-operators/test-operator-controller-manager-69797bbcbd-b9bq2" podUID="d70fab64-d6ec-42ab-93ef-e882fc4d3f84" Jan 22 14:00:52 crc kubenswrapper[4743]: E0122 14:00:52.400210 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:65cfe5b9d5b0571aaf8ff9840b12cc56e90ca4cef162dd260c3a9fa2b52c6dd0\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5d646b7d76-wsnt4" podUID="ee1a4864-faf3-49da-ac1a-ab864c677803" Jan 22 14:00:52 crc kubenswrapper[4743]: I0122 14:00:52.400485 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jzmrn" event={"ID":"36be36d1-45a3-4e18-ba83-e4ae61363409","Type":"ContainerStarted","Data":"f766d7ea8e1d98ce56420f002fc2d7bbe826b0721b23297d429cf40398782217"} Jan 22 14:00:52 crc kubenswrapper[4743]: I0122 14:00:52.407719 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-6b8bc8d87d-mppfg" event={"ID":"d1e52325-1801-4650-86bd-c1eb8f076714","Type":"ContainerStarted","Data":"7ceeda04ede393622e3b425caa9bc5f3b6a804f51ccf635ee2baf000d7659042"} Jan 22 14:00:52 crc kubenswrapper[4743]: I0122 14:00:52.415141 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5ffb9c6597-w4cch" event={"ID":"1cd779f7-75c7-4a5b-82f1-15a26703ed29","Type":"ContainerStarted","Data":"b31227a0b04f89b23b69af10a96747c09310a21a0ff36fddd8f9c4f5589a5bbb"} Jan 22 14:00:52 crc kubenswrapper[4743]: E0122 14:00:52.416572 4743 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:2d6d13b3c28e45c6bec980b8808dda8da4723ae87e66d04f53d52c3b3c51612b\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-5ffb9c6597-w4cch" podUID="1cd779f7-75c7-4a5b-82f1-15a26703ed29" Jan 22 14:00:52 crc kubenswrapper[4743]: I0122 14:00:52.418696 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7bd9774b6-qfb2q" event={"ID":"3b59a905-2607-4445-abee-ba43a1bdf41c","Type":"ContainerStarted","Data":"efb4c280ca00dc2c74071ba21748d3fc24b5e8628946c8415646375aa1948bee"} Jan 22 14:00:52 crc kubenswrapper[4743]: I0122 14:00:52.420214 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-jtwg6" event={"ID":"924b89fa-b3de-46d6-b9c8-5be5e6d4795c","Type":"ContainerStarted","Data":"df5c22d0c9aae58dfeb69699f95e6334b8f8dca6051939a3690776498dfcb630"} Jan 22 14:00:52 crc kubenswrapper[4743]: I0122 14:00:52.447702 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-85cd9769bb-srnbc" event={"ID":"9d13aa32-eef2-427d-9398-507957b4c81c","Type":"ContainerStarted","Data":"a51b337d4aa1fbae45b8a5deb80edbe2d2cd96ca51bcf985ad6deb85cd4b5d1f"} Jan 22 14:00:52 crc kubenswrapper[4743]: E0122 14:00:52.449231 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:e02722d7581bfe1c5fc13e2fa6811d8665102ba86635c77547abf6b933cde127\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-85cd9769bb-srnbc" podUID="9d13aa32-eef2-427d-9398-507957b4c81c" Jan 22 14:00:52 crc kubenswrapper[4743]: I0122 14:00:52.450928 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-995fd" event={"ID":"6b4ae3c8-6f7f-4b76-91c0-3652f86422a6","Type":"ContainerStarted","Data":"7cd075a32608fce0e5fefb1b16275f42089540275e5ae19ec99bbaa782268cc8"} Jan 22 14:00:52 crc kubenswrapper[4743]: E0122 14:00:52.452236 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:8b3bfb9e86618b7ac69443939b0968fae28a22cd62ea1e429b599ff9f8a5f8cf\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-995fd" podUID="6b4ae3c8-6f7f-4b76-91c0-3652f86422a6" Jan 22 14:00:52 crc kubenswrapper[4743]: I0122 14:00:52.453925 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-nck8q" event={"ID":"cf7c633b-f013-4be6-a794-888a816a2ec2","Type":"ContainerStarted","Data":"e2fb8c2ee5b643d7b363765f34b311117bfec0d3c5878cb2a5e8a7f828c10fa8"} Jan 22 14:00:52 crc kubenswrapper[4743]: E0122 14:00:52.458156 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:445e951df2f21df6d33a466f75917e0f6103052ae751ae11887136e8ab165922\\\"\"" pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-nck8q" 
podUID="cf7c633b-f013-4be6-a794-888a816a2ec2" Jan 22 14:00:52 crc kubenswrapper[4743]: I0122 14:00:52.458832 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-78c6999f6f-b77x5" event={"ID":"96859e4c-bbb4-424b-bc02-2bd6e3b03484","Type":"ContainerStarted","Data":"3e0c66f5974048a4699a839dc193c3944b07c4e142198eed10ce1c6d74b6385b"} Jan 22 14:00:52 crc kubenswrapper[4743]: I0122 14:00:52.486911 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0855131d-976e-4cb5-83bb-9e47417d78f5-metrics-certs\") pod \"openstack-operator-controller-manager-cdc5d4c7b-hk8dd\" (UID: \"0855131d-976e-4cb5-83bb-9e47417d78f5\") " pod="openstack-operators/openstack-operator-controller-manager-cdc5d4c7b-hk8dd" Jan 22 14:00:52 crc kubenswrapper[4743]: I0122 14:00:52.487143 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0855131d-976e-4cb5-83bb-9e47417d78f5-webhook-certs\") pod \"openstack-operator-controller-manager-cdc5d4c7b-hk8dd\" (UID: \"0855131d-976e-4cb5-83bb-9e47417d78f5\") " pod="openstack-operators/openstack-operator-controller-manager-cdc5d4c7b-hk8dd" Jan 22 14:00:52 crc kubenswrapper[4743]: E0122 14:00:52.487357 4743 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 22 14:00:52 crc kubenswrapper[4743]: E0122 14:00:52.487442 4743 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 22 14:00:52 crc kubenswrapper[4743]: E0122 14:00:52.487561 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0855131d-976e-4cb5-83bb-9e47417d78f5-webhook-certs podName:0855131d-976e-4cb5-83bb-9e47417d78f5 nodeName:}" failed. No retries permitted until 2026-01-22 14:00:54.487467925 +0000 UTC m=+891.042511088 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/0855131d-976e-4cb5-83bb-9e47417d78f5-webhook-certs") pod "openstack-operator-controller-manager-cdc5d4c7b-hk8dd" (UID: "0855131d-976e-4cb5-83bb-9e47417d78f5") : secret "webhook-server-cert" not found Jan 22 14:00:52 crc kubenswrapper[4743]: E0122 14:00:52.487706 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0855131d-976e-4cb5-83bb-9e47417d78f5-metrics-certs podName:0855131d-976e-4cb5-83bb-9e47417d78f5 nodeName:}" failed. No retries permitted until 2026-01-22 14:00:54.487692801 +0000 UTC m=+891.042735974 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0855131d-976e-4cb5-83bb-9e47417d78f5-metrics-certs") pod "openstack-operator-controller-manager-cdc5d4c7b-hk8dd" (UID: "0855131d-976e-4cb5-83bb-9e47417d78f5") : secret "metrics-server-cert" not found Jan 22 14:00:53 crc kubenswrapper[4743]: E0122 14:00:53.471201 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:c8dde42dafd41026ed2e4cfc26efc0fff63c4ba9d31326ae7dc644ccceaafa9d\\\"\"" pod="openstack-operators/test-operator-controller-manager-69797bbcbd-b9bq2" podUID="d70fab64-d6ec-42ab-93ef-e882fc4d3f84" Jan 22 14:00:53 crc kubenswrapper[4743]: E0122 14:00:53.471612 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:2d6d13b3c28e45c6bec980b8808dda8da4723ae87e66d04f53d52c3b3c51612b\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-5ffb9c6597-w4cch" podUID="1cd779f7-75c7-4a5b-82f1-15a26703ed29" Jan 22 14:00:53 crc kubenswrapper[4743]: E0122 14:00:53.472485 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:65cfe5b9d5b0571aaf8ff9840b12cc56e90ca4cef162dd260c3a9fa2b52c6dd0\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5d646b7d76-wsnt4" podUID="ee1a4864-faf3-49da-ac1a-ab864c677803" Jan 22 14:00:53 crc kubenswrapper[4743]: E0122 14:00:53.473978 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:8b3bfb9e86618b7ac69443939b0968fae28a22cd62ea1e429b599ff9f8a5f8cf\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-995fd" podUID="6b4ae3c8-6f7f-4b76-91c0-3652f86422a6" Jan 22 14:00:53 crc kubenswrapper[4743]: E0122 14:00:53.474038 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:e02722d7581bfe1c5fc13e2fa6811d8665102ba86635c77547abf6b933cde127\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-85cd9769bb-srnbc" podUID="9d13aa32-eef2-427d-9398-507957b4c81c" Jan 22 14:00:53 crc kubenswrapper[4743]: E0122 14:00:53.476909 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:445e951df2f21df6d33a466f75917e0f6103052ae751ae11887136e8ab165922\\\"\"" pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-nck8q" podUID="cf7c633b-f013-4be6-a794-888a816a2ec2" Jan 22 14:00:53 crc kubenswrapper[4743]: I0122 14:00:53.703263 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7df36228-9543-4bb1-a0a7-d2ca51ac35a5-cert\") pod \"infra-operator-controller-manager-54ccf4f85d-8xxtr\" (UID: \"7df36228-9543-4bb1-a0a7-d2ca51ac35a5\") " pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-8xxtr" Jan 22 
14:00:53 crc kubenswrapper[4743]: E0122 14:00:53.703440 4743 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 22 14:00:53 crc kubenswrapper[4743]: E0122 14:00:53.703511 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7df36228-9543-4bb1-a0a7-d2ca51ac35a5-cert podName:7df36228-9543-4bb1-a0a7-d2ca51ac35a5 nodeName:}" failed. No retries permitted until 2026-01-22 14:00:57.70349199 +0000 UTC m=+894.258535153 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7df36228-9543-4bb1-a0a7-d2ca51ac35a5-cert") pod "infra-operator-controller-manager-54ccf4f85d-8xxtr" (UID: "7df36228-9543-4bb1-a0a7-d2ca51ac35a5") : secret "infra-operator-webhook-server-cert" not found Jan 22 14:00:54 crc kubenswrapper[4743]: I0122 14:00:54.222972 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4a00e91a-fbd8-496e-96e0-4fb25d7841fe-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854q5znv\" (UID: \"4a00e91a-fbd8-496e-96e0-4fb25d7841fe\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854q5znv" Jan 22 14:00:54 crc kubenswrapper[4743]: E0122 14:00:54.223165 4743 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 22 14:00:54 crc kubenswrapper[4743]: E0122 14:00:54.223401 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a00e91a-fbd8-496e-96e0-4fb25d7841fe-cert podName:4a00e91a-fbd8-496e-96e0-4fb25d7841fe nodeName:}" failed. No retries permitted until 2026-01-22 14:00:58.223384581 +0000 UTC m=+894.778427744 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4a00e91a-fbd8-496e-96e0-4fb25d7841fe-cert") pod "openstack-baremetal-operator-controller-manager-6b68b8b854q5znv" (UID: "4a00e91a-fbd8-496e-96e0-4fb25d7841fe") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 22 14:00:54 crc kubenswrapper[4743]: I0122 14:00:54.527449 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0855131d-976e-4cb5-83bb-9e47417d78f5-metrics-certs\") pod \"openstack-operator-controller-manager-cdc5d4c7b-hk8dd\" (UID: \"0855131d-976e-4cb5-83bb-9e47417d78f5\") " pod="openstack-operators/openstack-operator-controller-manager-cdc5d4c7b-hk8dd" Jan 22 14:00:54 crc kubenswrapper[4743]: I0122 14:00:54.527497 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0855131d-976e-4cb5-83bb-9e47417d78f5-webhook-certs\") pod \"openstack-operator-controller-manager-cdc5d4c7b-hk8dd\" (UID: \"0855131d-976e-4cb5-83bb-9e47417d78f5\") " pod="openstack-operators/openstack-operator-controller-manager-cdc5d4c7b-hk8dd" Jan 22 14:00:54 crc kubenswrapper[4743]: E0122 14:00:54.527598 4743 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 22 14:00:54 crc kubenswrapper[4743]: E0122 14:00:54.527645 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0855131d-976e-4cb5-83bb-9e47417d78f5-webhook-certs podName:0855131d-976e-4cb5-83bb-9e47417d78f5 nodeName:}" failed. No retries permitted until 2026-01-22 14:00:58.527631501 +0000 UTC m=+895.082674664 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/0855131d-976e-4cb5-83bb-9e47417d78f5-webhook-certs") pod "openstack-operator-controller-manager-cdc5d4c7b-hk8dd" (UID: "0855131d-976e-4cb5-83bb-9e47417d78f5") : secret "webhook-server-cert" not found Jan 22 14:00:54 crc kubenswrapper[4743]: E0122 14:00:54.527598 4743 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 22 14:00:54 crc kubenswrapper[4743]: E0122 14:00:54.527716 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0855131d-976e-4cb5-83bb-9e47417d78f5-metrics-certs podName:0855131d-976e-4cb5-83bb-9e47417d78f5 nodeName:}" failed. No retries permitted until 2026-01-22 14:00:58.527706283 +0000 UTC m=+895.082749446 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0855131d-976e-4cb5-83bb-9e47417d78f5-metrics-certs") pod "openstack-operator-controller-manager-cdc5d4c7b-hk8dd" (UID: "0855131d-976e-4cb5-83bb-9e47417d78f5") : secret "metrics-server-cert" not found Jan 22 14:00:57 crc kubenswrapper[4743]: I0122 14:00:57.788553 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7df36228-9543-4bb1-a0a7-d2ca51ac35a5-cert\") pod \"infra-operator-controller-manager-54ccf4f85d-8xxtr\" (UID: \"7df36228-9543-4bb1-a0a7-d2ca51ac35a5\") " pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-8xxtr" Jan 22 14:00:57 crc kubenswrapper[4743]: E0122 14:00:57.788731 4743 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 22 14:00:57 crc kubenswrapper[4743]: E0122 14:00:57.788989 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7df36228-9543-4bb1-a0a7-d2ca51ac35a5-cert podName:7df36228-9543-4bb1-a0a7-d2ca51ac35a5 nodeName:}" failed. No retries permitted until 2026-01-22 14:01:05.788969551 +0000 UTC m=+902.344012714 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7df36228-9543-4bb1-a0a7-d2ca51ac35a5-cert") pod "infra-operator-controller-manager-54ccf4f85d-8xxtr" (UID: "7df36228-9543-4bb1-a0a7-d2ca51ac35a5") : secret "infra-operator-webhook-server-cert" not found Jan 22 14:00:58 crc kubenswrapper[4743]: I0122 14:00:58.299107 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4a00e91a-fbd8-496e-96e0-4fb25d7841fe-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854q5znv\" (UID: \"4a00e91a-fbd8-496e-96e0-4fb25d7841fe\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854q5znv" Jan 22 14:00:58 crc kubenswrapper[4743]: E0122 14:00:58.299284 4743 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 22 14:00:58 crc kubenswrapper[4743]: E0122 14:00:58.299573 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a00e91a-fbd8-496e-96e0-4fb25d7841fe-cert podName:4a00e91a-fbd8-496e-96e0-4fb25d7841fe nodeName:}" failed. No retries permitted until 2026-01-22 14:01:06.299556063 +0000 UTC m=+902.854599226 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4a00e91a-fbd8-496e-96e0-4fb25d7841fe-cert") pod "openstack-baremetal-operator-controller-manager-6b68b8b854q5znv" (UID: "4a00e91a-fbd8-496e-96e0-4fb25d7841fe") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 22 14:00:58 crc kubenswrapper[4743]: I0122 14:00:58.604306 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0855131d-976e-4cb5-83bb-9e47417d78f5-metrics-certs\") pod \"openstack-operator-controller-manager-cdc5d4c7b-hk8dd\" (UID: \"0855131d-976e-4cb5-83bb-9e47417d78f5\") " pod="openstack-operators/openstack-operator-controller-manager-cdc5d4c7b-hk8dd" Jan 22 14:00:58 crc kubenswrapper[4743]: I0122 14:00:58.604381 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0855131d-976e-4cb5-83bb-9e47417d78f5-webhook-certs\") pod \"openstack-operator-controller-manager-cdc5d4c7b-hk8dd\" (UID: \"0855131d-976e-4cb5-83bb-9e47417d78f5\") " pod="openstack-operators/openstack-operator-controller-manager-cdc5d4c7b-hk8dd" Jan 22 14:00:58 crc kubenswrapper[4743]: E0122 14:00:58.604562 4743 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 22 14:00:58 crc kubenswrapper[4743]: E0122 14:00:58.604655 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0855131d-976e-4cb5-83bb-9e47417d78f5-metrics-certs podName:0855131d-976e-4cb5-83bb-9e47417d78f5 nodeName:}" failed. No retries permitted until 2026-01-22 14:01:06.604631495 +0000 UTC m=+903.159674658 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0855131d-976e-4cb5-83bb-9e47417d78f5-metrics-certs") pod "openstack-operator-controller-manager-cdc5d4c7b-hk8dd" (UID: "0855131d-976e-4cb5-83bb-9e47417d78f5") : secret "metrics-server-cert" not found Jan 22 14:00:58 crc kubenswrapper[4743]: E0122 14:00:58.604571 4743 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 22 14:00:58 crc kubenswrapper[4743]: E0122 14:00:58.604710 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0855131d-976e-4cb5-83bb-9e47417d78f5-webhook-certs podName:0855131d-976e-4cb5-83bb-9e47417d78f5 nodeName:}" failed. No retries permitted until 2026-01-22 14:01:06.604700907 +0000 UTC m=+903.159744280 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/0855131d-976e-4cb5-83bb-9e47417d78f5-webhook-certs") pod "openstack-operator-controller-manager-cdc5d4c7b-hk8dd" (UID: "0855131d-976e-4cb5-83bb-9e47417d78f5") : secret "webhook-server-cert" not found Jan 22 14:01:00 crc kubenswrapper[4743]: I0122 14:01:00.049082 4743 patch_prober.go:28] interesting pod/machine-config-daemon-hqgk7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 14:01:00 crc kubenswrapper[4743]: I0122 14:01:00.049134 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 14:01:03 crc kubenswrapper[4743]: E0122 14:01:03.891374 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/heat-operator@sha256:2f9a2f064448faebbae58f52d564dc0e8e39bed0fc12bd6b9fe925e42f1b5492" Jan 22 14:01:03 crc kubenswrapper[4743]: E0122 14:01:03.891826 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:2f9a2f064448faebbae58f52d564dc0e8e39bed0fc12bd6b9fe925e42f1b5492,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-p4v5l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-594c8c9d5d-mxdkr_openstack-operators(bad16498-5eda-4791-8577-6cf6ef07ca2a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 22 14:01:03 crc kubenswrapper[4743]: E0122 14:01:03.893132 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-mxdkr" podUID="bad16498-5eda-4791-8577-6cf6ef07ca2a" Jan 22 14:01:04 crc kubenswrapper[4743]: E0122 14:01:04.539103 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:a8fc8f9d445b1232f446119015b226008b07c6a259f5bebc1fcbb39ec310afe5" Jan 22 14:01:04 crc kubenswrapper[4743]: E0122 14:01:04.539321 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:a8fc8f9d445b1232f446119015b226008b07c6a259f5bebc1fcbb39ec310afe5,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6ctrp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-7bd9774b6-qfb2q_openstack-operators(3b59a905-2607-4445-abee-ba43a1bdf41c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 22 14:01:04 crc kubenswrapper[4743]: E0122 14:01:04.541361 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-7bd9774b6-qfb2q" podUID="3b59a905-2607-4445-abee-ba43a1bdf41c" Jan 22 14:01:04 crc kubenswrapper[4743]: E0122 14:01:04.548008 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/heat-operator@sha256:2f9a2f064448faebbae58f52d564dc0e8e39bed0fc12bd6b9fe925e42f1b5492\\\"\"" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-mxdkr" podUID="bad16498-5eda-4791-8577-6cf6ef07ca2a" Jan 22 14:01:05 crc kubenswrapper[4743]: E0122 14:01:05.556289 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:a8fc8f9d445b1232f446119015b226008b07c6a259f5bebc1fcbb39ec310afe5\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-7bd9774b6-qfb2q" podUID="3b59a905-2607-4445-abee-ba43a1bdf41c" Jan 22 14:01:05 crc kubenswrapper[4743]: I0122 14:01:05.808463 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7df36228-9543-4bb1-a0a7-d2ca51ac35a5-cert\") pod \"infra-operator-controller-manager-54ccf4f85d-8xxtr\" (UID: \"7df36228-9543-4bb1-a0a7-d2ca51ac35a5\") " pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-8xxtr" Jan 22 14:01:05 crc kubenswrapper[4743]: I0122 14:01:05.815296 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7df36228-9543-4bb1-a0a7-d2ca51ac35a5-cert\") pod \"infra-operator-controller-manager-54ccf4f85d-8xxtr\" (UID: \"7df36228-9543-4bb1-a0a7-d2ca51ac35a5\") " pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-8xxtr" Jan 22 14:01:05 crc kubenswrapper[4743]: I0122 14:01:05.849305 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-7q6zn" Jan 22 14:01:05 crc kubenswrapper[4743]: I0122 14:01:05.857564 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-8xxtr" Jan 22 14:01:06 crc kubenswrapper[4743]: E0122 14:01:06.110425 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:4e995cfa360a9d595a01b9c0541ab934692f2374203cb5738127dd784f793831" Jan 22 14:01:06 crc kubenswrapper[4743]: E0122 14:01:06.110608 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:4e995cfa360a9d595a01b9c0541ab934692f2374203cb5738127dd784f793831,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-w5s98,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-6b8bc8d87d-mppfg_openstack-operators(d1e52325-1801-4650-86bd-c1eb8f076714): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 22 14:01:06 crc kubenswrapper[4743]: E0122 14:01:06.112879 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-6b8bc8d87d-mppfg" podUID="d1e52325-1801-4650-86bd-c1eb8f076714" Jan 22 14:01:06 crc kubenswrapper[4743]: I0122 14:01:06.317843 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/4a00e91a-fbd8-496e-96e0-4fb25d7841fe-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854q5znv\" (UID: \"4a00e91a-fbd8-496e-96e0-4fb25d7841fe\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854q5znv" Jan 22 14:01:06 crc kubenswrapper[4743]: I0122 14:01:06.343228 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4a00e91a-fbd8-496e-96e0-4fb25d7841fe-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854q5znv\" (UID: \"4a00e91a-fbd8-496e-96e0-4fb25d7841fe\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854q5znv" Jan 22 14:01:06 crc kubenswrapper[4743]: I0122 14:01:06.402852 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-f2qf4" Jan 22 14:01:06 crc kubenswrapper[4743]: I0122 14:01:06.409627 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854q5znv" Jan 22 14:01:06 crc kubenswrapper[4743]: E0122 14:01:06.564036 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:4e995cfa360a9d595a01b9c0541ab934692f2374203cb5738127dd784f793831\\\"\"" pod="openstack-operators/nova-operator-controller-manager-6b8bc8d87d-mppfg" podUID="d1e52325-1801-4650-86bd-c1eb8f076714" Jan 22 14:01:06 crc kubenswrapper[4743]: I0122 14:01:06.622765 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0855131d-976e-4cb5-83bb-9e47417d78f5-metrics-certs\") pod \"openstack-operator-controller-manager-cdc5d4c7b-hk8dd\" (UID: \"0855131d-976e-4cb5-83bb-9e47417d78f5\") " pod="openstack-operators/openstack-operator-controller-manager-cdc5d4c7b-hk8dd" Jan 22 14:01:06 crc kubenswrapper[4743]: I0122 14:01:06.622849 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0855131d-976e-4cb5-83bb-9e47417d78f5-webhook-certs\") pod \"openstack-operator-controller-manager-cdc5d4c7b-hk8dd\" (UID: \"0855131d-976e-4cb5-83bb-9e47417d78f5\") " pod="openstack-operators/openstack-operator-controller-manager-cdc5d4c7b-hk8dd" Jan 22 14:01:06 crc kubenswrapper[4743]: E0122 14:01:06.622995 4743 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 22 14:01:06 crc kubenswrapper[4743]: E0122 14:01:06.623098 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0855131d-976e-4cb5-83bb-9e47417d78f5-metrics-certs podName:0855131d-976e-4cb5-83bb-9e47417d78f5 nodeName:}" failed. No retries permitted until 2026-01-22 14:01:22.623063976 +0000 UTC m=+919.178107139 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0855131d-976e-4cb5-83bb-9e47417d78f5-metrics-certs") pod "openstack-operator-controller-manager-cdc5d4c7b-hk8dd" (UID: "0855131d-976e-4cb5-83bb-9e47417d78f5") : secret "metrics-server-cert" not found Jan 22 14:01:06 crc kubenswrapper[4743]: E0122 14:01:06.623185 4743 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 22 14:01:06 crc kubenswrapper[4743]: E0122 14:01:06.623282 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0855131d-976e-4cb5-83bb-9e47417d78f5-webhook-certs podName:0855131d-976e-4cb5-83bb-9e47417d78f5 nodeName:}" failed. No retries permitted until 2026-01-22 14:01:22.623260831 +0000 UTC m=+919.178304184 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/0855131d-976e-4cb5-83bb-9e47417d78f5-webhook-certs") pod "openstack-operator-controller-manager-cdc5d4c7b-hk8dd" (UID: "0855131d-976e-4cb5-83bb-9e47417d78f5") : secret "webhook-server-cert" not found Jan 22 14:01:06 crc kubenswrapper[4743]: E0122 14:01:06.867170 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:8e340ff11922b38e811261de96982e1aff5f4eb8f225d1d9f5973025a4fe8349" Jan 22 14:01:06 crc kubenswrapper[4743]: E0122 14:01:06.867514 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:8e340ff11922b38e811261de96982e1aff5f4eb8f225d1d9f5973025a4fe8349,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-w8lvf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-b8b6d4659-24n5k_openstack-operators(3e262e2d-6d13-4c04-9826-14ed89dde8ea): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 22 14:01:06 crc kubenswrapper[4743]: E0122 14:01:06.868709 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-b8b6d4659-24n5k" podUID="3e262e2d-6d13-4c04-9826-14ed89dde8ea" Jan 22 14:01:07 crc kubenswrapper[4743]: E0122 14:01:07.571767 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:8e340ff11922b38e811261de96982e1aff5f4eb8f225d1d9f5973025a4fe8349\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-b8b6d4659-24n5k" podUID="3e262e2d-6d13-4c04-9826-14ed89dde8ea" Jan 22 14:01:14 crc kubenswrapper[4743]: E0122 14:01:14.514432 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Jan 22 14:01:14 crc kubenswrapper[4743]: E0122 14:01:14.515693 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-k427s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-jzmrn_openstack-operators(36be36d1-45a3-4e18-ba83-e4ae61363409): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 22 14:01:14 crc kubenswrapper[4743]: E0122 14:01:14.517943 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jzmrn" podUID="36be36d1-45a3-4e18-ba83-e4ae61363409" Jan 22 14:01:14 crc kubenswrapper[4743]: E0122 14:01:14.619711 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jzmrn" podUID="36be36d1-45a3-4e18-ba83-e4ae61363409" Jan 22 14:01:16 crc kubenswrapper[4743]: I0122 14:01:16.547665 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-54ccf4f85d-8xxtr"] Jan 22 14:01:17 crc kubenswrapper[4743]: W0122 14:01:17.278651 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7df36228_9543_4bb1_a0a7_d2ca51ac35a5.slice/crio-2a9dbfadc6b6c68ad0e338573316ce32837018d0e501a46b86a0d17385a4cbba WatchSource:0}: Error finding container 2a9dbfadc6b6c68ad0e338573316ce32837018d0e501a46b86a0d17385a4cbba: Status 404 returned error can't find the container with id 2a9dbfadc6b6c68ad0e338573316ce32837018d0e501a46b86a0d17385a4cbba Jan 22 14:01:17 crc kubenswrapper[4743]: I0122 14:01:17.303944 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854q5znv"] Jan 22 14:01:17 crc kubenswrapper[4743]: I0122 14:01:17.634386 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-8xxtr" event={"ID":"7df36228-9543-4bb1-a0a7-d2ca51ac35a5","Type":"ContainerStarted","Data":"2a9dbfadc6b6c68ad0e338573316ce32837018d0e501a46b86a0d17385a4cbba"} Jan 22 14:01:18 crc kubenswrapper[4743]: I0122 14:01:18.645176 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854q5znv" 
event={"ID":"4a00e91a-fbd8-496e-96e0-4fb25d7841fe","Type":"ContainerStarted","Data":"a222358af09310fd3f5829ee0d05d386970de5efdc99c8adc8a8976f57302cf2"} Jan 22 14:01:19 crc kubenswrapper[4743]: I0122 14:01:19.652605 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-85cd9769bb-srnbc" event={"ID":"9d13aa32-eef2-427d-9398-507957b4c81c","Type":"ContainerStarted","Data":"6f61e004a6f758bb39aaaab10026990041bddf3f40cb52f08bec5a02d0ddac5e"} Jan 22 14:01:19 crc kubenswrapper[4743]: I0122 14:01:19.653340 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-85cd9769bb-srnbc" Jan 22 14:01:19 crc kubenswrapper[4743]: I0122 14:01:19.656024 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-f7dhp" event={"ID":"1fcc87bc-de60-44e2-b8b9-88c97eb2aec4","Type":"ContainerStarted","Data":"a06258b119c7c1e4a0c45bad7520019f3a0b613217c37b6996189d7e52cc01cc"} Jan 22 14:01:19 crc kubenswrapper[4743]: I0122 14:01:19.656203 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-f7dhp" Jan 22 14:01:19 crc kubenswrapper[4743]: I0122 14:01:19.658514 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-b45d7bf98-w8s2s" event={"ID":"aff36600-9c00-4a26-b311-a3d743333b0e","Type":"ContainerStarted","Data":"b456837d39fa1253f355ad04b70643f6464a15dd5efc77ea4a4298c1453696ef"} Jan 22 14:01:19 crc kubenswrapper[4743]: I0122 14:01:19.658999 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-b45d7bf98-w8s2s" Jan 22 14:01:19 crc kubenswrapper[4743]: I0122 14:01:19.663286 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-69797bbcbd-b9bq2" event={"ID":"d70fab64-d6ec-42ab-93ef-e882fc4d3f84","Type":"ContainerStarted","Data":"057016ba8f968a882c5906ded4b3f225737a37f7f8a6dcc7a64e9313d8cb38a8"} Jan 22 14:01:19 crc kubenswrapper[4743]: I0122 14:01:19.663712 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-69797bbcbd-b9bq2" Jan 22 14:01:19 crc kubenswrapper[4743]: I0122 14:01:19.667280 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59dd8b7cbf-6kvwx" event={"ID":"4eb53c43-8c71-4c15-862a-134fa6eb85d6","Type":"ContainerStarted","Data":"492d620b623ac8c71d1142d9affecd8227b6aaf7c7d0557d8c9d98fd4d932e9e"} Jan 22 14:01:19 crc kubenswrapper[4743]: I0122 14:01:19.667773 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-59dd8b7cbf-6kvwx" Jan 22 14:01:19 crc kubenswrapper[4743]: I0122 14:01:19.676931 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-995fd" event={"ID":"6b4ae3c8-6f7f-4b76-91c0-3652f86422a6","Type":"ContainerStarted","Data":"30bb060e458070279fa6962218d8073d0b9be2f5268d1224ddb1ec6941e495fa"} Jan 22 14:01:19 crc kubenswrapper[4743]: I0122 14:01:19.677312 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-995fd" Jan 22 14:01:19 crc kubenswrapper[4743]: I0122 
14:01:19.679297 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-69d6c9f5b8-nqr46" event={"ID":"6221bb17-765b-4d72-8a74-70cdbc3447d9","Type":"ContainerStarted","Data":"d3ab2354a8e061e21948529752fe8fa20da299ac001e6aab0471519b82f09eba"} Jan 22 14:01:19 crc kubenswrapper[4743]: I0122 14:01:19.679618 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-69d6c9f5b8-nqr46" Jan 22 14:01:19 crc kubenswrapper[4743]: I0122 14:01:19.688622 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5d8f59fb49-mhq65" event={"ID":"064ac5cb-7d15-4502-b174-54236cdd0d51","Type":"ContainerStarted","Data":"9be7a2ef3cb9d0f44862362bf41d0a0432ed9b8d1e2fe7bafa704a6f5d17335c"} Jan 22 14:01:19 crc kubenswrapper[4743]: I0122 14:01:19.688759 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-5d8f59fb49-mhq65" Jan 22 14:01:19 crc kubenswrapper[4743]: I0122 14:01:19.701213 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-jtwg6" event={"ID":"924b89fa-b3de-46d6-b9c8-5be5e6d4795c","Type":"ContainerStarted","Data":"e2f21b7d8bee15adcc77721e3286e8771093d17ad4ac75d2d8280ee0329dbf45"} Jan 22 14:01:19 crc kubenswrapper[4743]: I0122 14:01:19.701576 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-jtwg6" Jan 22 14:01:19 crc kubenswrapper[4743]: I0122 14:01:19.714696 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-69cf5d4557-dn2mv" event={"ID":"f6b9f418-b721-4fce-881e-791eceb6b0ef","Type":"ContainerStarted","Data":"23117549bf2b63ba6dac031e91a8a5d178269edbc56cc4b7c9e19d127d9df835"} Jan 22 14:01:19 crc kubenswrapper[4743]: I0122 14:01:19.714914 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-69cf5d4557-dn2mv" Jan 22 14:01:19 crc kubenswrapper[4743]: I0122 14:01:19.717474 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5ffb9c6597-w4cch" event={"ID":"1cd779f7-75c7-4a5b-82f1-15a26703ed29","Type":"ContainerStarted","Data":"93a7d7e603e310c836e77dbfd781b0b674e187b6558309abbe7e5f0d6d80a11b"} Jan 22 14:01:19 crc kubenswrapper[4743]: I0122 14:01:19.717869 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-5ffb9c6597-w4cch" Jan 22 14:01:19 crc kubenswrapper[4743]: I0122 14:01:19.719216 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5d646b7d76-wsnt4" event={"ID":"ee1a4864-faf3-49da-ac1a-ab864c677803","Type":"ContainerStarted","Data":"6fb9402293380e463ffede5b77d19a7b23a7d876f0172b1a29c4568003d563e4"} Jan 22 14:01:19 crc kubenswrapper[4743]: I0122 14:01:19.719545 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5d646b7d76-wsnt4" Jan 22 14:01:19 crc kubenswrapper[4743]: I0122 14:01:19.720693 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-78fdd796fd-mr8bn" 
event={"ID":"3454a999-851a-47d1-ba12-64f77de4bd6a","Type":"ContainerStarted","Data":"ad88862079d0730157205acccdf62ae1384e2fc53ea1331bf2cac579b00844f3"} Jan 22 14:01:19 crc kubenswrapper[4743]: I0122 14:01:19.721064 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-78fdd796fd-mr8bn" Jan 22 14:01:19 crc kubenswrapper[4743]: I0122 14:01:19.723532 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-nck8q" event={"ID":"cf7c633b-f013-4be6-a794-888a816a2ec2","Type":"ContainerStarted","Data":"d10f85a08135e2e307dd646fe7af2993033d6e2e9ed650ee9d8ba0bc9807b9ba"} Jan 22 14:01:19 crc kubenswrapper[4743]: I0122 14:01:19.723930 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-nck8q" Jan 22 14:01:19 crc kubenswrapper[4743]: I0122 14:01:19.725184 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-78c6999f6f-b77x5" event={"ID":"96859e4c-bbb4-424b-bc02-2bd6e3b03484","Type":"ContainerStarted","Data":"dff54c6f8a467d9695ec474c4f95f6103b6b1b8db80324d4be96811eb7a71434"} Jan 22 14:01:19 crc kubenswrapper[4743]: I0122 14:01:19.725522 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-78c6999f6f-b77x5" Jan 22 14:01:19 crc kubenswrapper[4743]: I0122 14:01:19.726718 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-mxdkr" event={"ID":"bad16498-5eda-4791-8577-6cf6ef07ca2a","Type":"ContainerStarted","Data":"e1a5710893ed1fb13f9546ead5ad3d472ebb452f37b97010329952f16b05cca7"} Jan 22 14:01:19 crc kubenswrapper[4743]: I0122 14:01:19.733263 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-85cd9769bb-srnbc" podStartSLOduration=2.536373684 podStartE2EDuration="29.733249261s" podCreationTimestamp="2026-01-22 14:00:50 +0000 UTC" firstStartedPulling="2026-01-22 14:00:51.652435133 +0000 UTC m=+888.207478306" lastFinishedPulling="2026-01-22 14:01:18.84931072 +0000 UTC m=+915.404353883" observedRunningTime="2026-01-22 14:01:19.686804458 +0000 UTC m=+916.241847631" watchObservedRunningTime="2026-01-22 14:01:19.733249261 +0000 UTC m=+916.288292424" Jan 22 14:01:19 crc kubenswrapper[4743]: I0122 14:01:19.733518 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-59dd8b7cbf-6kvwx" podStartSLOduration=6.823466192 podStartE2EDuration="30.733513588s" podCreationTimestamp="2026-01-22 14:00:49 +0000 UTC" firstStartedPulling="2026-01-22 14:00:51.171106764 +0000 UTC m=+887.726149927" lastFinishedPulling="2026-01-22 14:01:15.08115412 +0000 UTC m=+911.636197323" observedRunningTime="2026-01-22 14:01:19.724661711 +0000 UTC m=+916.279704864" watchObservedRunningTime="2026-01-22 14:01:19.733513588 +0000 UTC m=+916.288556751" Jan 22 14:01:19 crc kubenswrapper[4743]: I0122 14:01:19.734482 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-mxdkr" Jan 22 14:01:19 crc kubenswrapper[4743]: I0122 14:01:19.830866 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/ironic-operator-controller-manager-69d6c9f5b8-nqr46" podStartSLOduration=7.046636533 podStartE2EDuration="30.830844152s" podCreationTimestamp="2026-01-22 14:00:49 +0000 UTC" firstStartedPulling="2026-01-22 14:00:51.295931264 +0000 UTC m=+887.850974427" lastFinishedPulling="2026-01-22 14:01:15.080138883 +0000 UTC m=+911.635182046" observedRunningTime="2026-01-22 14:01:19.784076131 +0000 UTC m=+916.339119294" watchObservedRunningTime="2026-01-22 14:01:19.830844152 +0000 UTC m=+916.385887315" Jan 22 14:01:19 crc kubenswrapper[4743]: I0122 14:01:19.832103 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-b45d7bf98-w8s2s" podStartSLOduration=6.935937292 podStartE2EDuration="30.832095656s" podCreationTimestamp="2026-01-22 14:00:49 +0000 UTC" firstStartedPulling="2026-01-22 14:00:51.184945325 +0000 UTC m=+887.739988488" lastFinishedPulling="2026-01-22 14:01:15.081103699 +0000 UTC m=+911.636146852" observedRunningTime="2026-01-22 14:01:19.829127646 +0000 UTC m=+916.384170809" watchObservedRunningTime="2026-01-22 14:01:19.832095656 +0000 UTC m=+916.387138829" Jan 22 14:01:19 crc kubenswrapper[4743]: I0122 14:01:19.860593 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-f7dhp" podStartSLOduration=15.271108207 podStartE2EDuration="30.860577978s" podCreationTimestamp="2026-01-22 14:00:49 +0000 UTC" firstStartedPulling="2026-01-22 14:00:51.296611302 +0000 UTC m=+887.851654465" lastFinishedPulling="2026-01-22 14:01:06.886081073 +0000 UTC m=+903.441124236" observedRunningTime="2026-01-22 14:01:19.859092488 +0000 UTC m=+916.414135681" watchObservedRunningTime="2026-01-22 14:01:19.860577978 +0000 UTC m=+916.415621141" Jan 22 14:01:19 crc kubenswrapper[4743]: I0122 14:01:19.900691 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-69797bbcbd-b9bq2" podStartSLOduration=2.8843199139999998 podStartE2EDuration="29.900676111s" podCreationTimestamp="2026-01-22 14:00:50 +0000 UTC" firstStartedPulling="2026-01-22 14:00:51.784015214 +0000 UTC m=+888.339058377" lastFinishedPulling="2026-01-22 14:01:18.800371411 +0000 UTC m=+915.355414574" observedRunningTime="2026-01-22 14:01:19.899090178 +0000 UTC m=+916.454133341" watchObservedRunningTime="2026-01-22 14:01:19.900676111 +0000 UTC m=+916.455719274" Jan 22 14:01:19 crc kubenswrapper[4743]: I0122 14:01:19.933267 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-995fd" podStartSLOduration=2.751619903 podStartE2EDuration="29.933251792s" podCreationTimestamp="2026-01-22 14:00:50 +0000 UTC" firstStartedPulling="2026-01-22 14:00:51.652911496 +0000 UTC m=+888.207954679" lastFinishedPulling="2026-01-22 14:01:18.834543405 +0000 UTC m=+915.389586568" observedRunningTime="2026-01-22 14:01:19.925001762 +0000 UTC m=+916.480044925" watchObservedRunningTime="2026-01-22 14:01:19.933251792 +0000 UTC m=+916.488294955" Jan 22 14:01:19 crc kubenswrapper[4743]: I0122 14:01:19.960129 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-nck8q" podStartSLOduration=2.906065886 podStartE2EDuration="29.9600561s" podCreationTimestamp="2026-01-22 14:00:50 +0000 UTC" firstStartedPulling="2026-01-22 14:00:51.777644763 +0000 UTC 
m=+888.332687926" lastFinishedPulling="2026-01-22 14:01:18.831634977 +0000 UTC m=+915.386678140" observedRunningTime="2026-01-22 14:01:19.948148291 +0000 UTC m=+916.503191474" watchObservedRunningTime="2026-01-22 14:01:19.9600561 +0000 UTC m=+916.515099273" Jan 22 14:01:20 crc kubenswrapper[4743]: I0122 14:01:20.018810 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-69cf5d4557-dn2mv" podStartSLOduration=7.024820051 podStartE2EDuration="31.018773151s" podCreationTimestamp="2026-01-22 14:00:49 +0000 UTC" firstStartedPulling="2026-01-22 14:00:51.086210903 +0000 UTC m=+887.641254106" lastFinishedPulling="2026-01-22 14:01:15.080164043 +0000 UTC m=+911.635207206" observedRunningTime="2026-01-22 14:01:20.00456212 +0000 UTC m=+916.559605283" watchObservedRunningTime="2026-01-22 14:01:20.018773151 +0000 UTC m=+916.573816314" Jan 22 14:01:20 crc kubenswrapper[4743]: I0122 14:01:20.075776 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-5d8f59fb49-mhq65" podStartSLOduration=6.481660523 podStartE2EDuration="30.075761105s" podCreationTimestamp="2026-01-22 14:00:50 +0000 UTC" firstStartedPulling="2026-01-22 14:00:51.486987306 +0000 UTC m=+888.042030469" lastFinishedPulling="2026-01-22 14:01:15.081087888 +0000 UTC m=+911.636131051" observedRunningTime="2026-01-22 14:01:20.038447737 +0000 UTC m=+916.593490910" watchObservedRunningTime="2026-01-22 14:01:20.075761105 +0000 UTC m=+916.630804268" Jan 22 14:01:20 crc kubenswrapper[4743]: I0122 14:01:20.077823 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-78fdd796fd-mr8bn" podStartSLOduration=15.216417874 podStartE2EDuration="31.07781642s" podCreationTimestamp="2026-01-22 14:00:49 +0000 UTC" firstStartedPulling="2026-01-22 14:00:51.024310217 +0000 UTC m=+887.579353380" lastFinishedPulling="2026-01-22 14:01:06.885708763 +0000 UTC m=+903.440751926" observedRunningTime="2026-01-22 14:01:20.076176176 +0000 UTC m=+916.631219339" watchObservedRunningTime="2026-01-22 14:01:20.07781642 +0000 UTC m=+916.632859583" Jan 22 14:01:20 crc kubenswrapper[4743]: I0122 14:01:20.105140 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5d646b7d76-wsnt4" podStartSLOduration=2.957174283 podStartE2EDuration="30.105122871s" podCreationTimestamp="2026-01-22 14:00:50 +0000 UTC" firstStartedPulling="2026-01-22 14:00:51.652721751 +0000 UTC m=+888.207764934" lastFinishedPulling="2026-01-22 14:01:18.800670359 +0000 UTC m=+915.355713522" observedRunningTime="2026-01-22 14:01:20.104045592 +0000 UTC m=+916.659088755" watchObservedRunningTime="2026-01-22 14:01:20.105122871 +0000 UTC m=+916.660166034" Jan 22 14:01:20 crc kubenswrapper[4743]: I0122 14:01:20.130129 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-mxdkr" podStartSLOduration=3.1056908500000002 podStartE2EDuration="31.130109479s" podCreationTimestamp="2026-01-22 14:00:49 +0000 UTC" firstStartedPulling="2026-01-22 14:00:51.298528754 +0000 UTC m=+887.853571917" lastFinishedPulling="2026-01-22 14:01:19.322947383 +0000 UTC m=+915.877990546" observedRunningTime="2026-01-22 14:01:20.127001296 +0000 UTC m=+916.682044459" watchObservedRunningTime="2026-01-22 14:01:20.130109479 +0000 UTC m=+916.685152642" Jan 22 
14:01:20 crc kubenswrapper[4743]: I0122 14:01:20.146117 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-78c6999f6f-b77x5" podStartSLOduration=7.593981059 podStartE2EDuration="31.146101317s" podCreationTimestamp="2026-01-22 14:00:49 +0000 UTC" firstStartedPulling="2026-01-22 14:00:51.506480508 +0000 UTC m=+888.061523681" lastFinishedPulling="2026-01-22 14:01:15.058600776 +0000 UTC m=+911.613643939" observedRunningTime="2026-01-22 14:01:20.142440159 +0000 UTC m=+916.697483322" watchObservedRunningTime="2026-01-22 14:01:20.146101317 +0000 UTC m=+916.701144470" Jan 22 14:01:20 crc kubenswrapper[4743]: I0122 14:01:20.192727 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-jtwg6" podStartSLOduration=7.618860385 podStartE2EDuration="31.192711165s" podCreationTimestamp="2026-01-22 14:00:49 +0000 UTC" firstStartedPulling="2026-01-22 14:00:51.484748596 +0000 UTC m=+888.039791759" lastFinishedPulling="2026-01-22 14:01:15.058599366 +0000 UTC m=+911.613642539" observedRunningTime="2026-01-22 14:01:20.183084127 +0000 UTC m=+916.738127290" watchObservedRunningTime="2026-01-22 14:01:20.192711165 +0000 UTC m=+916.747754328" Jan 22 14:01:20 crc kubenswrapper[4743]: I0122 14:01:20.213171 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-5ffb9c6597-w4cch" podStartSLOduration=4.731921316 podStartE2EDuration="30.213153101s" podCreationTimestamp="2026-01-22 14:00:50 +0000 UTC" firstStartedPulling="2026-01-22 14:00:51.779590495 +0000 UTC m=+888.334633658" lastFinishedPulling="2026-01-22 14:01:17.26082228 +0000 UTC m=+913.815865443" observedRunningTime="2026-01-22 14:01:20.212810662 +0000 UTC m=+916.767853825" watchObservedRunningTime="2026-01-22 14:01:20.213153101 +0000 UTC m=+916.768196264" Jan 22 14:01:20 crc kubenswrapper[4743]: I0122 14:01:20.738247 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7bd9774b6-qfb2q" event={"ID":"3b59a905-2607-4445-abee-ba43a1bdf41c","Type":"ContainerStarted","Data":"4e29ca8d666118616401a24a6426b5134f015dc01f74479e38546dfe59861c8b"} Jan 22 14:01:20 crc kubenswrapper[4743]: I0122 14:01:20.743006 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-7bd9774b6-qfb2q" Jan 22 14:01:20 crc kubenswrapper[4743]: I0122 14:01:20.760236 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-7bd9774b6-qfb2q" podStartSLOduration=2.849144082 podStartE2EDuration="30.760217729s" podCreationTimestamp="2026-01-22 14:00:50 +0000 UTC" firstStartedPulling="2026-01-22 14:00:51.498507724 +0000 UTC m=+888.053550887" lastFinishedPulling="2026-01-22 14:01:19.409581371 +0000 UTC m=+915.964624534" observedRunningTime="2026-01-22 14:01:20.757196508 +0000 UTC m=+917.312239671" watchObservedRunningTime="2026-01-22 14:01:20.760217729 +0000 UTC m=+917.315260892" Jan 22 14:01:22 crc kubenswrapper[4743]: I0122 14:01:22.634990 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0855131d-976e-4cb5-83bb-9e47417d78f5-metrics-certs\") pod \"openstack-operator-controller-manager-cdc5d4c7b-hk8dd\" (UID: \"0855131d-976e-4cb5-83bb-9e47417d78f5\") " 
pod="openstack-operators/openstack-operator-controller-manager-cdc5d4c7b-hk8dd" Jan 22 14:01:22 crc kubenswrapper[4743]: I0122 14:01:22.635999 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0855131d-976e-4cb5-83bb-9e47417d78f5-webhook-certs\") pod \"openstack-operator-controller-manager-cdc5d4c7b-hk8dd\" (UID: \"0855131d-976e-4cb5-83bb-9e47417d78f5\") " pod="openstack-operators/openstack-operator-controller-manager-cdc5d4c7b-hk8dd" Jan 22 14:01:22 crc kubenswrapper[4743]: I0122 14:01:22.642746 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0855131d-976e-4cb5-83bb-9e47417d78f5-webhook-certs\") pod \"openstack-operator-controller-manager-cdc5d4c7b-hk8dd\" (UID: \"0855131d-976e-4cb5-83bb-9e47417d78f5\") " pod="openstack-operators/openstack-operator-controller-manager-cdc5d4c7b-hk8dd" Jan 22 14:01:22 crc kubenswrapper[4743]: I0122 14:01:22.660159 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0855131d-976e-4cb5-83bb-9e47417d78f5-metrics-certs\") pod \"openstack-operator-controller-manager-cdc5d4c7b-hk8dd\" (UID: \"0855131d-976e-4cb5-83bb-9e47417d78f5\") " pod="openstack-operators/openstack-operator-controller-manager-cdc5d4c7b-hk8dd" Jan 22 14:01:22 crc kubenswrapper[4743]: I0122 14:01:22.756289 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b8b6d4659-24n5k" event={"ID":"3e262e2d-6d13-4c04-9826-14ed89dde8ea","Type":"ContainerStarted","Data":"0d8aa22944f5a06ba53314e5a353f0479111dad2f97e5d9f558e3ccc47743614"} Jan 22 14:01:22 crc kubenswrapper[4743]: I0122 14:01:22.756481 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-b8b6d4659-24n5k" Jan 22 14:01:22 crc kubenswrapper[4743]: I0122 14:01:22.772425 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-b8b6d4659-24n5k" podStartSLOduration=3.877202262 podStartE2EDuration="33.772378955s" podCreationTimestamp="2026-01-22 14:00:49 +0000 UTC" firstStartedPulling="2026-01-22 14:00:51.292954035 +0000 UTC m=+887.847997198" lastFinishedPulling="2026-01-22 14:01:21.188130728 +0000 UTC m=+917.743173891" observedRunningTime="2026-01-22 14:01:22.76955224 +0000 UTC m=+919.324595403" watchObservedRunningTime="2026-01-22 14:01:22.772378955 +0000 UTC m=+919.327422138" Jan 22 14:01:22 crc kubenswrapper[4743]: I0122 14:01:22.895528 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-hvsxp" Jan 22 14:01:22 crc kubenswrapper[4743]: I0122 14:01:22.904686 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-cdc5d4c7b-hk8dd" Jan 22 14:01:23 crc kubenswrapper[4743]: I0122 14:01:23.500548 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-cdc5d4c7b-hk8dd"] Jan 22 14:01:23 crc kubenswrapper[4743]: I0122 14:01:23.765528 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-cdc5d4c7b-hk8dd" event={"ID":"0855131d-976e-4cb5-83bb-9e47417d78f5","Type":"ContainerStarted","Data":"2af9269162ea30ded3267faba9e9c2263095336049f712faa8b167e50e7b7d2d"} Jan 22 14:01:23 crc kubenswrapper[4743]: I0122 14:01:23.765891 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-cdc5d4c7b-hk8dd" event={"ID":"0855131d-976e-4cb5-83bb-9e47417d78f5","Type":"ContainerStarted","Data":"511d620b1c8e566bd702aab75855c03953df4460d09d464293e71c444c024d90"} Jan 22 14:01:23 crc kubenswrapper[4743]: I0122 14:01:23.768201 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854q5znv" event={"ID":"4a00e91a-fbd8-496e-96e0-4fb25d7841fe","Type":"ContainerStarted","Data":"84eb49eca19801ae0c451991461d599f4b7f10bc0e3e9e9d015caf2cb0c902db"} Jan 22 14:01:23 crc kubenswrapper[4743]: I0122 14:01:23.768685 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854q5znv" Jan 22 14:01:23 crc kubenswrapper[4743]: I0122 14:01:23.774240 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-6b8bc8d87d-mppfg" event={"ID":"d1e52325-1801-4650-86bd-c1eb8f076714","Type":"ContainerStarted","Data":"7a47fd3added454d4edc94a0a9e0c44df7bb7baff3751de8e630921ed30fce9c"} Jan 22 14:01:23 crc kubenswrapper[4743]: I0122 14:01:23.774624 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-6b8bc8d87d-mppfg" Jan 22 14:01:23 crc kubenswrapper[4743]: I0122 14:01:23.777212 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-8xxtr" event={"ID":"7df36228-9543-4bb1-a0a7-d2ca51ac35a5","Type":"ContainerStarted","Data":"1459a2915e42c1ecf8bac0114d7245b1c502cc6d110389a354919c4880c80c05"} Jan 22 14:01:23 crc kubenswrapper[4743]: I0122 14:01:23.892349 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-cdc5d4c7b-hk8dd" podStartSLOduration=33.892335451 podStartE2EDuration="33.892335451s" podCreationTimestamp="2026-01-22 14:00:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:01:23.889491175 +0000 UTC m=+920.444534338" watchObservedRunningTime="2026-01-22 14:01:23.892335451 +0000 UTC m=+920.447378614" Jan 22 14:01:23 crc kubenswrapper[4743]: I0122 14:01:23.917065 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-6b8bc8d87d-mppfg" podStartSLOduration=2.579097847 podStartE2EDuration="33.917046802s" podCreationTimestamp="2026-01-22 14:00:50 +0000 UTC" firstStartedPulling="2026-01-22 14:00:51.647986784 +0000 UTC m=+888.203029947" lastFinishedPulling="2026-01-22 14:01:22.985935749 
+0000 UTC m=+919.540978902" observedRunningTime="2026-01-22 14:01:23.912782308 +0000 UTC m=+920.467825461" watchObservedRunningTime="2026-01-22 14:01:23.917046802 +0000 UTC m=+920.472089965" Jan 22 14:01:23 crc kubenswrapper[4743]: I0122 14:01:23.948643 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854q5znv" podStartSLOduration=29.69287631 podStartE2EDuration="33.948621777s" podCreationTimestamp="2026-01-22 14:00:50 +0000 UTC" firstStartedPulling="2026-01-22 14:01:18.727549602 +0000 UTC m=+915.282592795" lastFinishedPulling="2026-01-22 14:01:22.983295099 +0000 UTC m=+919.538338262" observedRunningTime="2026-01-22 14:01:23.94162151 +0000 UTC m=+920.496664683" watchObservedRunningTime="2026-01-22 14:01:23.948621777 +0000 UTC m=+920.503664940" Jan 22 14:01:23 crc kubenswrapper[4743]: I0122 14:01:23.965623 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-8xxtr" podStartSLOduration=29.28378656 podStartE2EDuration="34.965599831s" podCreationTimestamp="2026-01-22 14:00:49 +0000 UTC" firstStartedPulling="2026-01-22 14:01:17.28549074 +0000 UTC m=+913.840533903" lastFinishedPulling="2026-01-22 14:01:22.967304001 +0000 UTC m=+919.522347174" observedRunningTime="2026-01-22 14:01:23.958688266 +0000 UTC m=+920.513731429" watchObservedRunningTime="2026-01-22 14:01:23.965599831 +0000 UTC m=+920.520642994" Jan 22 14:01:24 crc kubenswrapper[4743]: I0122 14:01:24.783368 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-cdc5d4c7b-hk8dd" Jan 22 14:01:24 crc kubenswrapper[4743]: I0122 14:01:24.783412 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-8xxtr" Jan 22 14:01:28 crc kubenswrapper[4743]: I0122 14:01:28.814523 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jzmrn" event={"ID":"36be36d1-45a3-4e18-ba83-e4ae61363409","Type":"ContainerStarted","Data":"4b036ce6a167bf8df2dc10e5b8fc079db9038e646f48d3d3c96b56e63927d0b3"} Jan 22 14:01:28 crc kubenswrapper[4743]: I0122 14:01:28.833127 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jzmrn" podStartSLOduration=2.2667487299999998 podStartE2EDuration="38.833104386s" podCreationTimestamp="2026-01-22 14:00:50 +0000 UTC" firstStartedPulling="2026-01-22 14:00:51.831105564 +0000 UTC m=+888.386148727" lastFinishedPulling="2026-01-22 14:01:28.39746122 +0000 UTC m=+924.952504383" observedRunningTime="2026-01-22 14:01:28.828262526 +0000 UTC m=+925.383305689" watchObservedRunningTime="2026-01-22 14:01:28.833104386 +0000 UTC m=+925.388147549" Jan 22 14:01:30 crc kubenswrapper[4743]: I0122 14:01:30.049042 4743 patch_prober.go:28] interesting pod/machine-config-daemon-hqgk7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 14:01:30 crc kubenswrapper[4743]: I0122 14:01:30.049176 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 14:01:30 crc kubenswrapper[4743]: I0122 14:01:30.049266 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" Jan 22 14:01:30 crc kubenswrapper[4743]: I0122 14:01:30.050671 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"81047d739858b8f95f8165563bcec3db2c5fc125137b4bc67b44c536e91297dc"} pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 22 14:01:30 crc kubenswrapper[4743]: I0122 14:01:30.050810 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerName="machine-config-daemon" containerID="cri-o://81047d739858b8f95f8165563bcec3db2c5fc125137b4bc67b44c536e91297dc" gracePeriod=600 Jan 22 14:01:30 crc kubenswrapper[4743]: I0122 14:01:30.143899 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-69cf5d4557-dn2mv" Jan 22 14:01:30 crc kubenswrapper[4743]: I0122 14:01:30.152295 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-59dd8b7cbf-6kvwx" Jan 22 14:01:30 crc kubenswrapper[4743]: I0122 14:01:30.180242 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-b45d7bf98-w8s2s" Jan 22 14:01:30 crc kubenswrapper[4743]: I0122 14:01:30.191045 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-78fdd796fd-mr8bn" Jan 22 14:01:30 crc kubenswrapper[4743]: I0122 14:01:30.214081 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-mxdkr" Jan 22 14:01:30 crc kubenswrapper[4743]: I0122 14:01:30.248744 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-f7dhp" Jan 22 14:01:30 crc kubenswrapper[4743]: I0122 14:01:30.281550 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-69d6c9f5b8-nqr46" Jan 22 14:01:30 crc kubenswrapper[4743]: I0122 14:01:30.326269 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-b8b6d4659-24n5k" Jan 22 14:01:30 crc kubenswrapper[4743]: I0122 14:01:30.566697 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-jtwg6" Jan 22 14:01:30 crc kubenswrapper[4743]: I0122 14:01:30.642556 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-78c6999f6f-b77x5" Jan 22 14:01:30 crc kubenswrapper[4743]: I0122 14:01:30.644645 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/neutron-operator-controller-manager-5d8f59fb49-mhq65" Jan 22 14:01:30 crc kubenswrapper[4743]: I0122 14:01:30.688861 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-6b8bc8d87d-mppfg" Jan 22 14:01:30 crc kubenswrapper[4743]: I0122 14:01:30.689693 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-7bd9774b6-qfb2q" Jan 22 14:01:30 crc kubenswrapper[4743]: I0122 14:01:30.791598 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-995fd" Jan 22 14:01:30 crc kubenswrapper[4743]: I0122 14:01:30.817443 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5d646b7d76-wsnt4" Jan 22 14:01:30 crc kubenswrapper[4743]: I0122 14:01:30.839815 4743 generic.go:334] "Generic (PLEG): container finished" podID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerID="81047d739858b8f95f8165563bcec3db2c5fc125137b4bc67b44c536e91297dc" exitCode=0 Jan 22 14:01:30 crc kubenswrapper[4743]: I0122 14:01:30.839886 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" event={"ID":"3aeba9ba-3a5a-4885-8540-d295aadb311b","Type":"ContainerDied","Data":"81047d739858b8f95f8165563bcec3db2c5fc125137b4bc67b44c536e91297dc"} Jan 22 14:01:30 crc kubenswrapper[4743]: I0122 14:01:30.839917 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" event={"ID":"3aeba9ba-3a5a-4885-8540-d295aadb311b","Type":"ContainerStarted","Data":"2da3d4972818f6459ed6dbf589006b8dd9ab9ee647f4c241b04d0ac146476324"} Jan 22 14:01:30 crc kubenswrapper[4743]: I0122 14:01:30.839935 4743 scope.go:117] "RemoveContainer" containerID="58ca9bbd26d5eab47a0ae4b9a18e996aaf71b3e08e86fac81e851949e21bd947" Jan 22 14:01:30 crc kubenswrapper[4743]: I0122 14:01:30.872481 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-nck8q" Jan 22 14:01:30 crc kubenswrapper[4743]: I0122 14:01:30.897474 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-85cd9769bb-srnbc" Jan 22 14:01:30 crc kubenswrapper[4743]: I0122 14:01:30.981528 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-69797bbcbd-b9bq2" Jan 22 14:01:31 crc kubenswrapper[4743]: I0122 14:01:31.049403 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-5ffb9c6597-w4cch" Jan 22 14:01:32 crc kubenswrapper[4743]: I0122 14:01:32.912649 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-cdc5d4c7b-hk8dd" Jan 22 14:01:35 crc kubenswrapper[4743]: I0122 14:01:35.866187 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-8xxtr" Jan 22 14:01:36 crc kubenswrapper[4743]: I0122 14:01:36.415677 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854q5znv" Jan 22 14:01:54 crc kubenswrapper[4743]: I0122 14:01:54.101230 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-mxxkp"] Jan 22 14:01:54 crc kubenswrapper[4743]: I0122 14:01:54.102698 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-mxxkp" Jan 22 14:01:54 crc kubenswrapper[4743]: I0122 14:01:54.105238 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Jan 22 14:01:54 crc kubenswrapper[4743]: I0122 14:01:54.105389 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Jan 22 14:01:54 crc kubenswrapper[4743]: I0122 14:01:54.105682 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-m7tp9" Jan 22 14:01:54 crc kubenswrapper[4743]: I0122 14:01:54.106328 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Jan 22 14:01:54 crc kubenswrapper[4743]: I0122 14:01:54.119692 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-mxxkp"] Jan 22 14:01:54 crc kubenswrapper[4743]: I0122 14:01:54.160819 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-2gcnr"] Jan 22 14:01:54 crc kubenswrapper[4743]: I0122 14:01:54.161872 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-2gcnr" Jan 22 14:01:54 crc kubenswrapper[4743]: I0122 14:01:54.164212 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Jan 22 14:01:54 crc kubenswrapper[4743]: I0122 14:01:54.182324 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-2gcnr"] Jan 22 14:01:54 crc kubenswrapper[4743]: I0122 14:01:54.226755 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48994a23-1df2-4f5a-bcbe-3c72f174bdb4-config\") pod \"dnsmasq-dns-675f4bcbfc-mxxkp\" (UID: \"48994a23-1df2-4f5a-bcbe-3c72f174bdb4\") " pod="openstack/dnsmasq-dns-675f4bcbfc-mxxkp" Jan 22 14:01:54 crc kubenswrapper[4743]: I0122 14:01:54.226852 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8tz2\" (UniqueName: \"kubernetes.io/projected/48994a23-1df2-4f5a-bcbe-3c72f174bdb4-kube-api-access-j8tz2\") pod \"dnsmasq-dns-675f4bcbfc-mxxkp\" (UID: \"48994a23-1df2-4f5a-bcbe-3c72f174bdb4\") " pod="openstack/dnsmasq-dns-675f4bcbfc-mxxkp" Jan 22 14:01:54 crc kubenswrapper[4743]: I0122 14:01:54.328325 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d930d0b8-d30d-40ce-8dfa-677b40600dca-config\") pod \"dnsmasq-dns-78dd6ddcc-2gcnr\" (UID: \"d930d0b8-d30d-40ce-8dfa-677b40600dca\") " pod="openstack/dnsmasq-dns-78dd6ddcc-2gcnr" Jan 22 14:01:54 crc kubenswrapper[4743]: I0122 14:01:54.328397 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d930d0b8-d30d-40ce-8dfa-677b40600dca-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-2gcnr\" (UID: \"d930d0b8-d30d-40ce-8dfa-677b40600dca\") " pod="openstack/dnsmasq-dns-78dd6ddcc-2gcnr" Jan 22 14:01:54 crc 
kubenswrapper[4743]: I0122 14:01:54.328441 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8tz2\" (UniqueName: \"kubernetes.io/projected/48994a23-1df2-4f5a-bcbe-3c72f174bdb4-kube-api-access-j8tz2\") pod \"dnsmasq-dns-675f4bcbfc-mxxkp\" (UID: \"48994a23-1df2-4f5a-bcbe-3c72f174bdb4\") " pod="openstack/dnsmasq-dns-675f4bcbfc-mxxkp" Jan 22 14:01:54 crc kubenswrapper[4743]: I0122 14:01:54.328498 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwgmc\" (UniqueName: \"kubernetes.io/projected/d930d0b8-d30d-40ce-8dfa-677b40600dca-kube-api-access-pwgmc\") pod \"dnsmasq-dns-78dd6ddcc-2gcnr\" (UID: \"d930d0b8-d30d-40ce-8dfa-677b40600dca\") " pod="openstack/dnsmasq-dns-78dd6ddcc-2gcnr" Jan 22 14:01:54 crc kubenswrapper[4743]: I0122 14:01:54.328581 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48994a23-1df2-4f5a-bcbe-3c72f174bdb4-config\") pod \"dnsmasq-dns-675f4bcbfc-mxxkp\" (UID: \"48994a23-1df2-4f5a-bcbe-3c72f174bdb4\") " pod="openstack/dnsmasq-dns-675f4bcbfc-mxxkp" Jan 22 14:01:54 crc kubenswrapper[4743]: I0122 14:01:54.330130 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48994a23-1df2-4f5a-bcbe-3c72f174bdb4-config\") pod \"dnsmasq-dns-675f4bcbfc-mxxkp\" (UID: \"48994a23-1df2-4f5a-bcbe-3c72f174bdb4\") " pod="openstack/dnsmasq-dns-675f4bcbfc-mxxkp" Jan 22 14:01:54 crc kubenswrapper[4743]: I0122 14:01:54.349703 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8tz2\" (UniqueName: \"kubernetes.io/projected/48994a23-1df2-4f5a-bcbe-3c72f174bdb4-kube-api-access-j8tz2\") pod \"dnsmasq-dns-675f4bcbfc-mxxkp\" (UID: \"48994a23-1df2-4f5a-bcbe-3c72f174bdb4\") " pod="openstack/dnsmasq-dns-675f4bcbfc-mxxkp" Jan 22 14:01:54 crc kubenswrapper[4743]: I0122 14:01:54.418573 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-mxxkp" Jan 22 14:01:54 crc kubenswrapper[4743]: I0122 14:01:54.434999 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d930d0b8-d30d-40ce-8dfa-677b40600dca-config\") pod \"dnsmasq-dns-78dd6ddcc-2gcnr\" (UID: \"d930d0b8-d30d-40ce-8dfa-677b40600dca\") " pod="openstack/dnsmasq-dns-78dd6ddcc-2gcnr" Jan 22 14:01:54 crc kubenswrapper[4743]: I0122 14:01:54.435093 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d930d0b8-d30d-40ce-8dfa-677b40600dca-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-2gcnr\" (UID: \"d930d0b8-d30d-40ce-8dfa-677b40600dca\") " pod="openstack/dnsmasq-dns-78dd6ddcc-2gcnr" Jan 22 14:01:54 crc kubenswrapper[4743]: I0122 14:01:54.435172 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwgmc\" (UniqueName: \"kubernetes.io/projected/d930d0b8-d30d-40ce-8dfa-677b40600dca-kube-api-access-pwgmc\") pod \"dnsmasq-dns-78dd6ddcc-2gcnr\" (UID: \"d930d0b8-d30d-40ce-8dfa-677b40600dca\") " pod="openstack/dnsmasq-dns-78dd6ddcc-2gcnr" Jan 22 14:01:54 crc kubenswrapper[4743]: I0122 14:01:54.436602 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d930d0b8-d30d-40ce-8dfa-677b40600dca-config\") pod \"dnsmasq-dns-78dd6ddcc-2gcnr\" (UID: \"d930d0b8-d30d-40ce-8dfa-677b40600dca\") " pod="openstack/dnsmasq-dns-78dd6ddcc-2gcnr" Jan 22 14:01:54 crc kubenswrapper[4743]: I0122 14:01:54.437513 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d930d0b8-d30d-40ce-8dfa-677b40600dca-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-2gcnr\" (UID: \"d930d0b8-d30d-40ce-8dfa-677b40600dca\") " pod="openstack/dnsmasq-dns-78dd6ddcc-2gcnr" Jan 22 14:01:54 crc kubenswrapper[4743]: I0122 14:01:54.464812 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwgmc\" (UniqueName: \"kubernetes.io/projected/d930d0b8-d30d-40ce-8dfa-677b40600dca-kube-api-access-pwgmc\") pod \"dnsmasq-dns-78dd6ddcc-2gcnr\" (UID: \"d930d0b8-d30d-40ce-8dfa-677b40600dca\") " pod="openstack/dnsmasq-dns-78dd6ddcc-2gcnr" Jan 22 14:01:54 crc kubenswrapper[4743]: I0122 14:01:54.478648 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-2gcnr" Jan 22 14:01:54 crc kubenswrapper[4743]: I0122 14:01:54.682969 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-mxxkp"] Jan 22 14:01:54 crc kubenswrapper[4743]: W0122 14:01:54.692761 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48994a23_1df2_4f5a_bcbe_3c72f174bdb4.slice/crio-43762812355cafb4e03f550dfeb0b6643ea0a8aed2b9c8722dad9eb10918c31a WatchSource:0}: Error finding container 43762812355cafb4e03f550dfeb0b6643ea0a8aed2b9c8722dad9eb10918c31a: Status 404 returned error can't find the container with id 43762812355cafb4e03f550dfeb0b6643ea0a8aed2b9c8722dad9eb10918c31a Jan 22 14:01:54 crc kubenswrapper[4743]: I0122 14:01:54.697312 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 22 14:01:54 crc kubenswrapper[4743]: I0122 14:01:54.753177 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-2gcnr"] Jan 22 14:01:54 crc kubenswrapper[4743]: W0122 14:01:54.764503 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd930d0b8_d30d_40ce_8dfa_677b40600dca.slice/crio-1294af9ac1f7d5cf726a359acab38027f904fd5b384760effa7e91b4dc867a73 WatchSource:0}: Error finding container 1294af9ac1f7d5cf726a359acab38027f904fd5b384760effa7e91b4dc867a73: Status 404 returned error can't find the container with id 1294af9ac1f7d5cf726a359acab38027f904fd5b384760effa7e91b4dc867a73 Jan 22 14:01:55 crc kubenswrapper[4743]: I0122 14:01:55.042405 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-mxxkp" event={"ID":"48994a23-1df2-4f5a-bcbe-3c72f174bdb4","Type":"ContainerStarted","Data":"43762812355cafb4e03f550dfeb0b6643ea0a8aed2b9c8722dad9eb10918c31a"} Jan 22 14:01:55 crc kubenswrapper[4743]: I0122 14:01:55.044152 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-2gcnr" event={"ID":"d930d0b8-d30d-40ce-8dfa-677b40600dca","Type":"ContainerStarted","Data":"1294af9ac1f7d5cf726a359acab38027f904fd5b384760effa7e91b4dc867a73"} Jan 22 14:01:57 crc kubenswrapper[4743]: I0122 14:01:57.064226 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-mxxkp"] Jan 22 14:01:57 crc kubenswrapper[4743]: I0122 14:01:57.090212 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-2n75z"] Jan 22 14:01:57 crc kubenswrapper[4743]: I0122 14:01:57.100537 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-2n75z" Jan 22 14:01:57 crc kubenswrapper[4743]: I0122 14:01:57.104544 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-2n75z"] Jan 22 14:01:57 crc kubenswrapper[4743]: I0122 14:01:57.196401 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d653249-72a8-413a-b835-091258593f30-config\") pod \"dnsmasq-dns-666b6646f7-2n75z\" (UID: \"6d653249-72a8-413a-b835-091258593f30\") " pod="openstack/dnsmasq-dns-666b6646f7-2n75z" Jan 22 14:01:57 crc kubenswrapper[4743]: I0122 14:01:57.196475 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjtj2\" (UniqueName: \"kubernetes.io/projected/6d653249-72a8-413a-b835-091258593f30-kube-api-access-cjtj2\") pod \"dnsmasq-dns-666b6646f7-2n75z\" (UID: \"6d653249-72a8-413a-b835-091258593f30\") " pod="openstack/dnsmasq-dns-666b6646f7-2n75z" Jan 22 14:01:57 crc kubenswrapper[4743]: I0122 14:01:57.196727 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d653249-72a8-413a-b835-091258593f30-dns-svc\") pod \"dnsmasq-dns-666b6646f7-2n75z\" (UID: \"6d653249-72a8-413a-b835-091258593f30\") " pod="openstack/dnsmasq-dns-666b6646f7-2n75z" Jan 22 14:01:57 crc kubenswrapper[4743]: I0122 14:01:57.298822 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d653249-72a8-413a-b835-091258593f30-config\") pod \"dnsmasq-dns-666b6646f7-2n75z\" (UID: \"6d653249-72a8-413a-b835-091258593f30\") " pod="openstack/dnsmasq-dns-666b6646f7-2n75z" Jan 22 14:01:57 crc kubenswrapper[4743]: I0122 14:01:57.298897 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjtj2\" (UniqueName: \"kubernetes.io/projected/6d653249-72a8-413a-b835-091258593f30-kube-api-access-cjtj2\") pod \"dnsmasq-dns-666b6646f7-2n75z\" (UID: \"6d653249-72a8-413a-b835-091258593f30\") " pod="openstack/dnsmasq-dns-666b6646f7-2n75z" Jan 22 14:01:57 crc kubenswrapper[4743]: I0122 14:01:57.298960 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d653249-72a8-413a-b835-091258593f30-dns-svc\") pod \"dnsmasq-dns-666b6646f7-2n75z\" (UID: \"6d653249-72a8-413a-b835-091258593f30\") " pod="openstack/dnsmasq-dns-666b6646f7-2n75z" Jan 22 14:01:57 crc kubenswrapper[4743]: I0122 14:01:57.300064 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d653249-72a8-413a-b835-091258593f30-dns-svc\") pod \"dnsmasq-dns-666b6646f7-2n75z\" (UID: \"6d653249-72a8-413a-b835-091258593f30\") " pod="openstack/dnsmasq-dns-666b6646f7-2n75z" Jan 22 14:01:57 crc kubenswrapper[4743]: I0122 14:01:57.300696 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d653249-72a8-413a-b835-091258593f30-config\") pod \"dnsmasq-dns-666b6646f7-2n75z\" (UID: \"6d653249-72a8-413a-b835-091258593f30\") " pod="openstack/dnsmasq-dns-666b6646f7-2n75z" Jan 22 14:01:57 crc kubenswrapper[4743]: I0122 14:01:57.329365 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjtj2\" (UniqueName: 
\"kubernetes.io/projected/6d653249-72a8-413a-b835-091258593f30-kube-api-access-cjtj2\") pod \"dnsmasq-dns-666b6646f7-2n75z\" (UID: \"6d653249-72a8-413a-b835-091258593f30\") " pod="openstack/dnsmasq-dns-666b6646f7-2n75z" Jan 22 14:01:57 crc kubenswrapper[4743]: I0122 14:01:57.384587 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-2gcnr"] Jan 22 14:01:57 crc kubenswrapper[4743]: I0122 14:01:57.402696 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-9z96h"] Jan 22 14:01:57 crc kubenswrapper[4743]: I0122 14:01:57.404460 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-9z96h" Jan 22 14:01:57 crc kubenswrapper[4743]: I0122 14:01:57.435615 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-9z96h"] Jan 22 14:01:57 crc kubenswrapper[4743]: I0122 14:01:57.453288 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-2n75z" Jan 22 14:01:57 crc kubenswrapper[4743]: I0122 14:01:57.503087 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a598ff96-d072-4440-9fe9-ee99366ccc81-config\") pod \"dnsmasq-dns-57d769cc4f-9z96h\" (UID: \"a598ff96-d072-4440-9fe9-ee99366ccc81\") " pod="openstack/dnsmasq-dns-57d769cc4f-9z96h" Jan 22 14:01:57 crc kubenswrapper[4743]: I0122 14:01:57.503504 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a598ff96-d072-4440-9fe9-ee99366ccc81-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-9z96h\" (UID: \"a598ff96-d072-4440-9fe9-ee99366ccc81\") " pod="openstack/dnsmasq-dns-57d769cc4f-9z96h" Jan 22 14:01:57 crc kubenswrapper[4743]: I0122 14:01:57.503556 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86qxs\" (UniqueName: \"kubernetes.io/projected/a598ff96-d072-4440-9fe9-ee99366ccc81-kube-api-access-86qxs\") pod \"dnsmasq-dns-57d769cc4f-9z96h\" (UID: \"a598ff96-d072-4440-9fe9-ee99366ccc81\") " pod="openstack/dnsmasq-dns-57d769cc4f-9z96h" Jan 22 14:01:57 crc kubenswrapper[4743]: I0122 14:01:57.617190 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a598ff96-d072-4440-9fe9-ee99366ccc81-config\") pod \"dnsmasq-dns-57d769cc4f-9z96h\" (UID: \"a598ff96-d072-4440-9fe9-ee99366ccc81\") " pod="openstack/dnsmasq-dns-57d769cc4f-9z96h" Jan 22 14:01:57 crc kubenswrapper[4743]: I0122 14:01:57.617301 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a598ff96-d072-4440-9fe9-ee99366ccc81-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-9z96h\" (UID: \"a598ff96-d072-4440-9fe9-ee99366ccc81\") " pod="openstack/dnsmasq-dns-57d769cc4f-9z96h" Jan 22 14:01:57 crc kubenswrapper[4743]: I0122 14:01:57.617348 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86qxs\" (UniqueName: \"kubernetes.io/projected/a598ff96-d072-4440-9fe9-ee99366ccc81-kube-api-access-86qxs\") pod \"dnsmasq-dns-57d769cc4f-9z96h\" (UID: \"a598ff96-d072-4440-9fe9-ee99366ccc81\") " pod="openstack/dnsmasq-dns-57d769cc4f-9z96h" Jan 22 14:01:57 crc kubenswrapper[4743]: I0122 14:01:57.618149 4743 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a598ff96-d072-4440-9fe9-ee99366ccc81-config\") pod \"dnsmasq-dns-57d769cc4f-9z96h\" (UID: \"a598ff96-d072-4440-9fe9-ee99366ccc81\") " pod="openstack/dnsmasq-dns-57d769cc4f-9z96h" Jan 22 14:01:57 crc kubenswrapper[4743]: I0122 14:01:57.618713 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a598ff96-d072-4440-9fe9-ee99366ccc81-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-9z96h\" (UID: \"a598ff96-d072-4440-9fe9-ee99366ccc81\") " pod="openstack/dnsmasq-dns-57d769cc4f-9z96h" Jan 22 14:01:57 crc kubenswrapper[4743]: I0122 14:01:57.646145 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86qxs\" (UniqueName: \"kubernetes.io/projected/a598ff96-d072-4440-9fe9-ee99366ccc81-kube-api-access-86qxs\") pod \"dnsmasq-dns-57d769cc4f-9z96h\" (UID: \"a598ff96-d072-4440-9fe9-ee99366ccc81\") " pod="openstack/dnsmasq-dns-57d769cc4f-9z96h" Jan 22 14:01:57 crc kubenswrapper[4743]: I0122 14:01:57.730950 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-9z96h" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.009752 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-2n75z"] Jan 22 14:01:58 crc kubenswrapper[4743]: W0122 14:01:58.023697 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d653249_72a8_413a_b835_091258593f30.slice/crio-eabf0e5bb93369f83fcd1fb9ceb480560bad608ad7fb157639f91e4c97183ef1 WatchSource:0}: Error finding container eabf0e5bb93369f83fcd1fb9ceb480560bad608ad7fb157639f91e4c97183ef1: Status 404 returned error can't find the container with id eabf0e5bb93369f83fcd1fb9ceb480560bad608ad7fb157639f91e4c97183ef1 Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.088940 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-2n75z" event={"ID":"6d653249-72a8-413a-b835-091258593f30","Type":"ContainerStarted","Data":"eabf0e5bb93369f83fcd1fb9ceb480560bad608ad7fb157639f91e4c97183ef1"} Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.240978 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-9z96h"] Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.250698 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.252595 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.255139 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.261047 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.261552 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.261630 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.261828 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.261911 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.262038 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-g6sn2" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.262639 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Jan 22 14:01:58 crc kubenswrapper[4743]: W0122 14:01:58.266002 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda598ff96_d072_4440_9fe9_ee99366ccc81.slice/crio-b8592171ba464f3b14911abbbc96798d7d078e77db43a92eec883f0ce11314aa WatchSource:0}: Error finding container b8592171ba464f3b14911abbbc96798d7d078e77db43a92eec883f0ce11314aa: Status 404 returned error can't find the container with id b8592171ba464f3b14911abbbc96798d7d078e77db43a92eec883f0ce11314aa Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.434665 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7926697b-86b3-4f82-97e1-3c0d7ae9f867-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"7926697b-86b3-4f82-97e1-3c0d7ae9f867\") " pod="openstack/rabbitmq-server-0" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.434706 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7926697b-86b3-4f82-97e1-3c0d7ae9f867-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"7926697b-86b3-4f82-97e1-3c0d7ae9f867\") " pod="openstack/rabbitmq-server-0" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.434736 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7926697b-86b3-4f82-97e1-3c0d7ae9f867-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"7926697b-86b3-4f82-97e1-3c0d7ae9f867\") " pod="openstack/rabbitmq-server-0" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.434760 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7926697b-86b3-4f82-97e1-3c0d7ae9f867-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"7926697b-86b3-4f82-97e1-3c0d7ae9f867\") " pod="openstack/rabbitmq-server-0" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 
14:01:58.434831 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7926697b-86b3-4f82-97e1-3c0d7ae9f867-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"7926697b-86b3-4f82-97e1-3c0d7ae9f867\") " pod="openstack/rabbitmq-server-0" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.434859 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7926697b-86b3-4f82-97e1-3c0d7ae9f867-config-data\") pod \"rabbitmq-server-0\" (UID: \"7926697b-86b3-4f82-97e1-3c0d7ae9f867\") " pod="openstack/rabbitmq-server-0" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.434886 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7926697b-86b3-4f82-97e1-3c0d7ae9f867-server-conf\") pod \"rabbitmq-server-0\" (UID: \"7926697b-86b3-4f82-97e1-3c0d7ae9f867\") " pod="openstack/rabbitmq-server-0" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.434900 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r88fv\" (UniqueName: \"kubernetes.io/projected/7926697b-86b3-4f82-97e1-3c0d7ae9f867-kube-api-access-r88fv\") pod \"rabbitmq-server-0\" (UID: \"7926697b-86b3-4f82-97e1-3c0d7ae9f867\") " pod="openstack/rabbitmq-server-0" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.434916 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7926697b-86b3-4f82-97e1-3c0d7ae9f867-pod-info\") pod \"rabbitmq-server-0\" (UID: \"7926697b-86b3-4f82-97e1-3c0d7ae9f867\") " pod="openstack/rabbitmq-server-0" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.434932 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7926697b-86b3-4f82-97e1-3c0d7ae9f867-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"7926697b-86b3-4f82-97e1-3c0d7ae9f867\") " pod="openstack/rabbitmq-server-0" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.434954 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"7926697b-86b3-4f82-97e1-3c0d7ae9f867\") " pod="openstack/rabbitmq-server-0" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.536834 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"7926697b-86b3-4f82-97e1-3c0d7ae9f867\") " pod="openstack/rabbitmq-server-0" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.536944 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7926697b-86b3-4f82-97e1-3c0d7ae9f867-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"7926697b-86b3-4f82-97e1-3c0d7ae9f867\") " pod="openstack/rabbitmq-server-0" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.536982 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/7926697b-86b3-4f82-97e1-3c0d7ae9f867-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"7926697b-86b3-4f82-97e1-3c0d7ae9f867\") " pod="openstack/rabbitmq-server-0" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.537012 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7926697b-86b3-4f82-97e1-3c0d7ae9f867-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"7926697b-86b3-4f82-97e1-3c0d7ae9f867\") " pod="openstack/rabbitmq-server-0" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.537054 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7926697b-86b3-4f82-97e1-3c0d7ae9f867-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"7926697b-86b3-4f82-97e1-3c0d7ae9f867\") " pod="openstack/rabbitmq-server-0" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.537081 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7926697b-86b3-4f82-97e1-3c0d7ae9f867-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"7926697b-86b3-4f82-97e1-3c0d7ae9f867\") " pod="openstack/rabbitmq-server-0" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.537133 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7926697b-86b3-4f82-97e1-3c0d7ae9f867-config-data\") pod \"rabbitmq-server-0\" (UID: \"7926697b-86b3-4f82-97e1-3c0d7ae9f867\") " pod="openstack/rabbitmq-server-0" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.537313 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7926697b-86b3-4f82-97e1-3c0d7ae9f867-server-conf\") pod \"rabbitmq-server-0\" (UID: \"7926697b-86b3-4f82-97e1-3c0d7ae9f867\") " pod="openstack/rabbitmq-server-0" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.537335 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r88fv\" (UniqueName: \"kubernetes.io/projected/7926697b-86b3-4f82-97e1-3c0d7ae9f867-kube-api-access-r88fv\") pod \"rabbitmq-server-0\" (UID: \"7926697b-86b3-4f82-97e1-3c0d7ae9f867\") " pod="openstack/rabbitmq-server-0" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.537308 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"7926697b-86b3-4f82-97e1-3c0d7ae9f867\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/rabbitmq-server-0" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.537594 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7926697b-86b3-4f82-97e1-3c0d7ae9f867-pod-info\") pod \"rabbitmq-server-0\" (UID: \"7926697b-86b3-4f82-97e1-3c0d7ae9f867\") " pod="openstack/rabbitmq-server-0" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.537680 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7926697b-86b3-4f82-97e1-3c0d7ae9f867-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"7926697b-86b3-4f82-97e1-3c0d7ae9f867\") " pod="openstack/rabbitmq-server-0" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 
14:01:58.538025 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7926697b-86b3-4f82-97e1-3c0d7ae9f867-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"7926697b-86b3-4f82-97e1-3c0d7ae9f867\") " pod="openstack/rabbitmq-server-0" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.538147 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7926697b-86b3-4f82-97e1-3c0d7ae9f867-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"7926697b-86b3-4f82-97e1-3c0d7ae9f867\") " pod="openstack/rabbitmq-server-0" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.538185 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7926697b-86b3-4f82-97e1-3c0d7ae9f867-config-data\") pod \"rabbitmq-server-0\" (UID: \"7926697b-86b3-4f82-97e1-3c0d7ae9f867\") " pod="openstack/rabbitmq-server-0" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.538627 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7926697b-86b3-4f82-97e1-3c0d7ae9f867-server-conf\") pod \"rabbitmq-server-0\" (UID: \"7926697b-86b3-4f82-97e1-3c0d7ae9f867\") " pod="openstack/rabbitmq-server-0" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.538633 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7926697b-86b3-4f82-97e1-3c0d7ae9f867-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"7926697b-86b3-4f82-97e1-3c0d7ae9f867\") " pod="openstack/rabbitmq-server-0" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.544448 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7926697b-86b3-4f82-97e1-3c0d7ae9f867-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"7926697b-86b3-4f82-97e1-3c0d7ae9f867\") " pod="openstack/rabbitmq-server-0" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.548514 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7926697b-86b3-4f82-97e1-3c0d7ae9f867-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"7926697b-86b3-4f82-97e1-3c0d7ae9f867\") " pod="openstack/rabbitmq-server-0" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.560906 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7926697b-86b3-4f82-97e1-3c0d7ae9f867-pod-info\") pod \"rabbitmq-server-0\" (UID: \"7926697b-86b3-4f82-97e1-3c0d7ae9f867\") " pod="openstack/rabbitmq-server-0" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.564756 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7926697b-86b3-4f82-97e1-3c0d7ae9f867-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"7926697b-86b3-4f82-97e1-3c0d7ae9f867\") " pod="openstack/rabbitmq-server-0" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.617037 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.623526 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r88fv\" (UniqueName: 
\"kubernetes.io/projected/7926697b-86b3-4f82-97e1-3c0d7ae9f867-kube-api-access-r88fv\") pod \"rabbitmq-server-0\" (UID: \"7926697b-86b3-4f82-97e1-3c0d7ae9f867\") " pod="openstack/rabbitmq-server-0" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.629848 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.633503 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.633683 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.633748 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.633546 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.635397 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.635664 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-cjr2m" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.635433 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.653522 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.662326 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"7926697b-86b3-4f82-97e1-3c0d7ae9f867\") " pod="openstack/rabbitmq-server-0" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.741613 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a474b98d-9569-40f4-a3d2-f4017988678b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.741670 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a474b98d-9569-40f4-a3d2-f4017988678b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a474b98d-9569-40f4-a3d2-f4017988678b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.741691 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a474b98d-9569-40f4-a3d2-f4017988678b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a474b98d-9569-40f4-a3d2-f4017988678b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.741711 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a474b98d-9569-40f4-a3d2-f4017988678b-rabbitmq-erlang-cookie\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"a474b98d-9569-40f4-a3d2-f4017988678b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.741741 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a474b98d-9569-40f4-a3d2-f4017988678b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a474b98d-9569-40f4-a3d2-f4017988678b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.741760 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a474b98d-9569-40f4-a3d2-f4017988678b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a474b98d-9569-40f4-a3d2-f4017988678b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.741976 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4ncr\" (UniqueName: \"kubernetes.io/projected/a474b98d-9569-40f4-a3d2-f4017988678b-kube-api-access-j4ncr\") pod \"rabbitmq-cell1-server-0\" (UID: \"a474b98d-9569-40f4-a3d2-f4017988678b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.742086 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a474b98d-9569-40f4-a3d2-f4017988678b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a474b98d-9569-40f4-a3d2-f4017988678b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.742222 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a474b98d-9569-40f4-a3d2-f4017988678b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a474b98d-9569-40f4-a3d2-f4017988678b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.742263 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a474b98d-9569-40f4-a3d2-f4017988678b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a474b98d-9569-40f4-a3d2-f4017988678b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.742299 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a474b98d-9569-40f4-a3d2-f4017988678b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a474b98d-9569-40f4-a3d2-f4017988678b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.843683 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4ncr\" (UniqueName: \"kubernetes.io/projected/a474b98d-9569-40f4-a3d2-f4017988678b-kube-api-access-j4ncr\") pod \"rabbitmq-cell1-server-0\" (UID: \"a474b98d-9569-40f4-a3d2-f4017988678b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.843750 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/a474b98d-9569-40f4-a3d2-f4017988678b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a474b98d-9569-40f4-a3d2-f4017988678b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.843863 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a474b98d-9569-40f4-a3d2-f4017988678b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a474b98d-9569-40f4-a3d2-f4017988678b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.843884 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a474b98d-9569-40f4-a3d2-f4017988678b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a474b98d-9569-40f4-a3d2-f4017988678b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.843916 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a474b98d-9569-40f4-a3d2-f4017988678b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a474b98d-9569-40f4-a3d2-f4017988678b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.843972 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a474b98d-9569-40f4-a3d2-f4017988678b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.843989 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a474b98d-9569-40f4-a3d2-f4017988678b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a474b98d-9569-40f4-a3d2-f4017988678b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.844007 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a474b98d-9569-40f4-a3d2-f4017988678b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a474b98d-9569-40f4-a3d2-f4017988678b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.844020 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a474b98d-9569-40f4-a3d2-f4017988678b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a474b98d-9569-40f4-a3d2-f4017988678b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.844044 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a474b98d-9569-40f4-a3d2-f4017988678b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a474b98d-9569-40f4-a3d2-f4017988678b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.844063 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a474b98d-9569-40f4-a3d2-f4017988678b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a474b98d-9569-40f4-a3d2-f4017988678b\") " 
pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.844562 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a474b98d-9569-40f4-a3d2-f4017988678b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a474b98d-9569-40f4-a3d2-f4017988678b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.846269 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a474b98d-9569-40f4-a3d2-f4017988678b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a474b98d-9569-40f4-a3d2-f4017988678b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.846438 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a474b98d-9569-40f4-a3d2-f4017988678b\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.846770 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a474b98d-9569-40f4-a3d2-f4017988678b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a474b98d-9569-40f4-a3d2-f4017988678b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.847446 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a474b98d-9569-40f4-a3d2-f4017988678b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a474b98d-9569-40f4-a3d2-f4017988678b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.847636 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a474b98d-9569-40f4-a3d2-f4017988678b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a474b98d-9569-40f4-a3d2-f4017988678b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.847660 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a474b98d-9569-40f4-a3d2-f4017988678b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a474b98d-9569-40f4-a3d2-f4017988678b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.849284 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a474b98d-9569-40f4-a3d2-f4017988678b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a474b98d-9569-40f4-a3d2-f4017988678b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.852286 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a474b98d-9569-40f4-a3d2-f4017988678b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a474b98d-9569-40f4-a3d2-f4017988678b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.852410 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a474b98d-9569-40f4-a3d2-f4017988678b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a474b98d-9569-40f4-a3d2-f4017988678b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.862216 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4ncr\" (UniqueName: \"kubernetes.io/projected/a474b98d-9569-40f4-a3d2-f4017988678b-kube-api-access-j4ncr\") pod \"rabbitmq-cell1-server-0\" (UID: \"a474b98d-9569-40f4-a3d2-f4017988678b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.870651 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a474b98d-9569-40f4-a3d2-f4017988678b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.891599 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 22 14:01:58 crc kubenswrapper[4743]: I0122 14:01:58.961843 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:01:59 crc kubenswrapper[4743]: I0122 14:01:59.104423 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-9z96h" event={"ID":"a598ff96-d072-4440-9fe9-ee99366ccc81","Type":"ContainerStarted","Data":"b8592171ba464f3b14911abbbc96798d7d078e77db43a92eec883f0ce11314aa"} Jan 22 14:01:59 crc kubenswrapper[4743]: I0122 14:01:59.398845 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 22 14:01:59 crc kubenswrapper[4743]: I0122 14:01:59.473048 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 22 14:01:59 crc kubenswrapper[4743]: I0122 14:01:59.714170 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Jan 22 14:01:59 crc kubenswrapper[4743]: I0122 14:01:59.718203 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Jan 22 14:01:59 crc kubenswrapper[4743]: I0122 14:01:59.720308 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Jan 22 14:01:59 crc kubenswrapper[4743]: I0122 14:01:59.720494 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Jan 22 14:01:59 crc kubenswrapper[4743]: I0122 14:01:59.721145 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-s9c4g" Jan 22 14:01:59 crc kubenswrapper[4743]: I0122 14:01:59.723470 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Jan 22 14:01:59 crc kubenswrapper[4743]: I0122 14:01:59.727932 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Jan 22 14:01:59 crc kubenswrapper[4743]: I0122 14:01:59.739103 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Jan 22 14:01:59 crc kubenswrapper[4743]: I0122 14:01:59.867097 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ee0fd7f6-d02b-4139-9306-8f9c9a1a8dd1-config-data-generated\") pod \"openstack-galera-0\" (UID: \"ee0fd7f6-d02b-4139-9306-8f9c9a1a8dd1\") " pod="openstack/openstack-galera-0" Jan 22 14:01:59 crc kubenswrapper[4743]: I0122 14:01:59.867165 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee0fd7f6-d02b-4139-9306-8f9c9a1a8dd1-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"ee0fd7f6-d02b-4139-9306-8f9c9a1a8dd1\") " pod="openstack/openstack-galera-0" Jan 22 14:01:59 crc kubenswrapper[4743]: I0122 14:01:59.867278 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ee0fd7f6-d02b-4139-9306-8f9c9a1a8dd1-config-data-default\") pod \"openstack-galera-0\" (UID: \"ee0fd7f6-d02b-4139-9306-8f9c9a1a8dd1\") " pod="openstack/openstack-galera-0" Jan 22 14:01:59 crc kubenswrapper[4743]: I0122 14:01:59.867394 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee0fd7f6-d02b-4139-9306-8f9c9a1a8dd1-operator-scripts\") pod \"openstack-galera-0\" (UID: \"ee0fd7f6-d02b-4139-9306-8f9c9a1a8dd1\") " pod="openstack/openstack-galera-0" Jan 22 14:01:59 crc kubenswrapper[4743]: I0122 14:01:59.867494 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"ee0fd7f6-d02b-4139-9306-8f9c9a1a8dd1\") " pod="openstack/openstack-galera-0" Jan 22 14:01:59 crc kubenswrapper[4743]: I0122 14:01:59.867508 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee0fd7f6-d02b-4139-9306-8f9c9a1a8dd1-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"ee0fd7f6-d02b-4139-9306-8f9c9a1a8dd1\") " pod="openstack/openstack-galera-0" Jan 22 14:01:59 crc kubenswrapper[4743]: I0122 14:01:59.867551 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-gtlgp\" (UniqueName: \"kubernetes.io/projected/ee0fd7f6-d02b-4139-9306-8f9c9a1a8dd1-kube-api-access-gtlgp\") pod \"openstack-galera-0\" (UID: \"ee0fd7f6-d02b-4139-9306-8f9c9a1a8dd1\") " pod="openstack/openstack-galera-0" Jan 22 14:01:59 crc kubenswrapper[4743]: I0122 14:01:59.867613 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ee0fd7f6-d02b-4139-9306-8f9c9a1a8dd1-kolla-config\") pod \"openstack-galera-0\" (UID: \"ee0fd7f6-d02b-4139-9306-8f9c9a1a8dd1\") " pod="openstack/openstack-galera-0" Jan 22 14:01:59 crc kubenswrapper[4743]: I0122 14:01:59.968615 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"ee0fd7f6-d02b-4139-9306-8f9c9a1a8dd1\") " pod="openstack/openstack-galera-0" Jan 22 14:01:59 crc kubenswrapper[4743]: I0122 14:01:59.968669 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee0fd7f6-d02b-4139-9306-8f9c9a1a8dd1-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"ee0fd7f6-d02b-4139-9306-8f9c9a1a8dd1\") " pod="openstack/openstack-galera-0" Jan 22 14:01:59 crc kubenswrapper[4743]: I0122 14:01:59.968711 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtlgp\" (UniqueName: \"kubernetes.io/projected/ee0fd7f6-d02b-4139-9306-8f9c9a1a8dd1-kube-api-access-gtlgp\") pod \"openstack-galera-0\" (UID: \"ee0fd7f6-d02b-4139-9306-8f9c9a1a8dd1\") " pod="openstack/openstack-galera-0" Jan 22 14:01:59 crc kubenswrapper[4743]: I0122 14:01:59.968757 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ee0fd7f6-d02b-4139-9306-8f9c9a1a8dd1-kolla-config\") pod \"openstack-galera-0\" (UID: \"ee0fd7f6-d02b-4139-9306-8f9c9a1a8dd1\") " pod="openstack/openstack-galera-0" Jan 22 14:01:59 crc kubenswrapper[4743]: I0122 14:01:59.968809 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ee0fd7f6-d02b-4139-9306-8f9c9a1a8dd1-config-data-generated\") pod \"openstack-galera-0\" (UID: \"ee0fd7f6-d02b-4139-9306-8f9c9a1a8dd1\") " pod="openstack/openstack-galera-0" Jan 22 14:01:59 crc kubenswrapper[4743]: I0122 14:01:59.968834 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee0fd7f6-d02b-4139-9306-8f9c9a1a8dd1-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"ee0fd7f6-d02b-4139-9306-8f9c9a1a8dd1\") " pod="openstack/openstack-galera-0" Jan 22 14:01:59 crc kubenswrapper[4743]: I0122 14:01:59.968881 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ee0fd7f6-d02b-4139-9306-8f9c9a1a8dd1-config-data-default\") pod \"openstack-galera-0\" (UID: \"ee0fd7f6-d02b-4139-9306-8f9c9a1a8dd1\") " pod="openstack/openstack-galera-0" Jan 22 14:01:59 crc kubenswrapper[4743]: I0122 14:01:59.968931 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee0fd7f6-d02b-4139-9306-8f9c9a1a8dd1-operator-scripts\") pod \"openstack-galera-0\" (UID: 
\"ee0fd7f6-d02b-4139-9306-8f9c9a1a8dd1\") " pod="openstack/openstack-galera-0" Jan 22 14:01:59 crc kubenswrapper[4743]: I0122 14:01:59.971817 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ee0fd7f6-d02b-4139-9306-8f9c9a1a8dd1-config-data-generated\") pod \"openstack-galera-0\" (UID: \"ee0fd7f6-d02b-4139-9306-8f9c9a1a8dd1\") " pod="openstack/openstack-galera-0" Jan 22 14:01:59 crc kubenswrapper[4743]: I0122 14:01:59.971987 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ee0fd7f6-d02b-4139-9306-8f9c9a1a8dd1-kolla-config\") pod \"openstack-galera-0\" (UID: \"ee0fd7f6-d02b-4139-9306-8f9c9a1a8dd1\") " pod="openstack/openstack-galera-0" Jan 22 14:01:59 crc kubenswrapper[4743]: I0122 14:01:59.971572 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"ee0fd7f6-d02b-4139-9306-8f9c9a1a8dd1\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/openstack-galera-0" Jan 22 14:01:59 crc kubenswrapper[4743]: I0122 14:01:59.972375 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ee0fd7f6-d02b-4139-9306-8f9c9a1a8dd1-config-data-default\") pod \"openstack-galera-0\" (UID: \"ee0fd7f6-d02b-4139-9306-8f9c9a1a8dd1\") " pod="openstack/openstack-galera-0" Jan 22 14:01:59 crc kubenswrapper[4743]: I0122 14:01:59.974700 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee0fd7f6-d02b-4139-9306-8f9c9a1a8dd1-operator-scripts\") pod \"openstack-galera-0\" (UID: \"ee0fd7f6-d02b-4139-9306-8f9c9a1a8dd1\") " pod="openstack/openstack-galera-0" Jan 22 14:01:59 crc kubenswrapper[4743]: I0122 14:01:59.980372 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee0fd7f6-d02b-4139-9306-8f9c9a1a8dd1-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"ee0fd7f6-d02b-4139-9306-8f9c9a1a8dd1\") " pod="openstack/openstack-galera-0" Jan 22 14:01:59 crc kubenswrapper[4743]: I0122 14:01:59.980430 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee0fd7f6-d02b-4139-9306-8f9c9a1a8dd1-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"ee0fd7f6-d02b-4139-9306-8f9c9a1a8dd1\") " pod="openstack/openstack-galera-0" Jan 22 14:01:59 crc kubenswrapper[4743]: I0122 14:01:59.993631 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtlgp\" (UniqueName: \"kubernetes.io/projected/ee0fd7f6-d02b-4139-9306-8f9c9a1a8dd1-kube-api-access-gtlgp\") pod \"openstack-galera-0\" (UID: \"ee0fd7f6-d02b-4139-9306-8f9c9a1a8dd1\") " pod="openstack/openstack-galera-0" Jan 22 14:02:00 crc kubenswrapper[4743]: I0122 14:02:00.011327 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"ee0fd7f6-d02b-4139-9306-8f9c9a1a8dd1\") " pod="openstack/openstack-galera-0" Jan 22 14:02:00 crc kubenswrapper[4743]: I0122 14:02:00.040179 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Jan 22 14:02:00 crc kubenswrapper[4743]: I0122 14:02:00.139229 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a474b98d-9569-40f4-a3d2-f4017988678b","Type":"ContainerStarted","Data":"f867ff5ebaceb1b356ef718724fea1a3e2ccd7875f3c6678acac9a500fe600d3"} Jan 22 14:02:00 crc kubenswrapper[4743]: I0122 14:02:00.147311 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7926697b-86b3-4f82-97e1-3c0d7ae9f867","Type":"ContainerStarted","Data":"1e2a93d64730a9a5f66c5f9204aa7992a43d89db9b3cbc128490070dd3487ec5"} Jan 22 14:02:00 crc kubenswrapper[4743]: I0122 14:02:00.715279 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Jan 22 14:02:01 crc kubenswrapper[4743]: I0122 14:02:01.034241 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 22 14:02:01 crc kubenswrapper[4743]: I0122 14:02:01.037511 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Jan 22 14:02:01 crc kubenswrapper[4743]: I0122 14:02:01.045029 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Jan 22 14:02:01 crc kubenswrapper[4743]: I0122 14:02:01.045374 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Jan 22 14:02:01 crc kubenswrapper[4743]: I0122 14:02:01.045647 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-xc64q" Jan 22 14:02:01 crc kubenswrapper[4743]: I0122 14:02:01.053518 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Jan 22 14:02:01 crc kubenswrapper[4743]: I0122 14:02:01.093731 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 22 14:02:01 crc kubenswrapper[4743]: I0122 14:02:01.095390 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2644f1c9-b50c-4666-a099-ddb8912a53ff-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"2644f1c9-b50c-4666-a099-ddb8912a53ff\") " pod="openstack/openstack-cell1-galera-0" Jan 22 14:02:01 crc kubenswrapper[4743]: I0122 14:02:01.095422 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2644f1c9-b50c-4666-a099-ddb8912a53ff-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"2644f1c9-b50c-4666-a099-ddb8912a53ff\") " pod="openstack/openstack-cell1-galera-0" Jan 22 14:02:01 crc kubenswrapper[4743]: I0122 14:02:01.095445 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2644f1c9-b50c-4666-a099-ddb8912a53ff-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"2644f1c9-b50c-4666-a099-ddb8912a53ff\") " pod="openstack/openstack-cell1-galera-0" Jan 22 14:02:01 crc kubenswrapper[4743]: I0122 14:02:01.095479 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2644f1c9-b50c-4666-a099-ddb8912a53ff-kolla-config\") pod \"openstack-cell1-galera-0\" 
(UID: \"2644f1c9-b50c-4666-a099-ddb8912a53ff\") " pod="openstack/openstack-cell1-galera-0" Jan 22 14:02:01 crc kubenswrapper[4743]: I0122 14:02:01.095506 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgkp9\" (UniqueName: \"kubernetes.io/projected/2644f1c9-b50c-4666-a099-ddb8912a53ff-kube-api-access-mgkp9\") pod \"openstack-cell1-galera-0\" (UID: \"2644f1c9-b50c-4666-a099-ddb8912a53ff\") " pod="openstack/openstack-cell1-galera-0" Jan 22 14:02:01 crc kubenswrapper[4743]: I0122 14:02:01.095543 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2644f1c9-b50c-4666-a099-ddb8912a53ff-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"2644f1c9-b50c-4666-a099-ddb8912a53ff\") " pod="openstack/openstack-cell1-galera-0" Jan 22 14:02:01 crc kubenswrapper[4743]: I0122 14:02:01.095565 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2644f1c9-b50c-4666-a099-ddb8912a53ff-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"2644f1c9-b50c-4666-a099-ddb8912a53ff\") " pod="openstack/openstack-cell1-galera-0" Jan 22 14:02:01 crc kubenswrapper[4743]: I0122 14:02:01.095590 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"2644f1c9-b50c-4666-a099-ddb8912a53ff\") " pod="openstack/openstack-cell1-galera-0" Jan 22 14:02:01 crc kubenswrapper[4743]: I0122 14:02:01.157285 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Jan 22 14:02:01 crc kubenswrapper[4743]: I0122 14:02:01.161593 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Jan 22 14:02:01 crc kubenswrapper[4743]: I0122 14:02:01.166217 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Jan 22 14:02:01 crc kubenswrapper[4743]: I0122 14:02:01.170467 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Jan 22 14:02:01 crc kubenswrapper[4743]: I0122 14:02:01.170956 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-4hlpn" Jan 22 14:02:01 crc kubenswrapper[4743]: I0122 14:02:01.206811 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2644f1c9-b50c-4666-a099-ddb8912a53ff-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"2644f1c9-b50c-4666-a099-ddb8912a53ff\") " pod="openstack/openstack-cell1-galera-0" Jan 22 14:02:01 crc kubenswrapper[4743]: I0122 14:02:01.207399 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2644f1c9-b50c-4666-a099-ddb8912a53ff-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"2644f1c9-b50c-4666-a099-ddb8912a53ff\") " pod="openstack/openstack-cell1-galera-0" Jan 22 14:02:01 crc kubenswrapper[4743]: I0122 14:02:01.207486 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/63d64b7b-89b2-468c-86e2-fe9de4338c0c-config-data\") pod \"memcached-0\" (UID: \"63d64b7b-89b2-468c-86e2-fe9de4338c0c\") " pod="openstack/memcached-0" Jan 22 14:02:01 crc kubenswrapper[4743]: I0122 14:02:01.207564 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"2644f1c9-b50c-4666-a099-ddb8912a53ff\") " pod="openstack/openstack-cell1-galera-0" Jan 22 14:02:01 crc kubenswrapper[4743]: I0122 14:02:01.207213 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Jan 22 14:02:01 crc kubenswrapper[4743]: I0122 14:02:01.207782 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2644f1c9-b50c-4666-a099-ddb8912a53ff-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"2644f1c9-b50c-4666-a099-ddb8912a53ff\") " pod="openstack/openstack-cell1-galera-0" Jan 22 14:02:01 crc kubenswrapper[4743]: I0122 14:02:01.207917 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2644f1c9-b50c-4666-a099-ddb8912a53ff-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"2644f1c9-b50c-4666-a099-ddb8912a53ff\") " pod="openstack/openstack-cell1-galera-0" Jan 22 14:02:01 crc kubenswrapper[4743]: I0122 14:02:01.207998 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63d64b7b-89b2-468c-86e2-fe9de4338c0c-combined-ca-bundle\") pod \"memcached-0\" (UID: \"63d64b7b-89b2-468c-86e2-fe9de4338c0c\") " pod="openstack/memcached-0" Jan 22 14:02:01 crc kubenswrapper[4743]: I0122 14:02:01.208035 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/2644f1c9-b50c-4666-a099-ddb8912a53ff-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"2644f1c9-b50c-4666-a099-ddb8912a53ff\") " pod="openstack/openstack-cell1-galera-0" Jan 22 14:02:01 crc kubenswrapper[4743]: I0122 14:02:01.208173 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kb6nt\" (UniqueName: \"kubernetes.io/projected/63d64b7b-89b2-468c-86e2-fe9de4338c0c-kube-api-access-kb6nt\") pod \"memcached-0\" (UID: \"63d64b7b-89b2-468c-86e2-fe9de4338c0c\") " pod="openstack/memcached-0" Jan 22 14:02:01 crc kubenswrapper[4743]: I0122 14:02:01.208779 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2644f1c9-b50c-4666-a099-ddb8912a53ff-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"2644f1c9-b50c-4666-a099-ddb8912a53ff\") " pod="openstack/openstack-cell1-galera-0" Jan 22 14:02:01 crc kubenswrapper[4743]: I0122 14:02:01.208266 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/63d64b7b-89b2-468c-86e2-fe9de4338c0c-kolla-config\") pod \"memcached-0\" (UID: \"63d64b7b-89b2-468c-86e2-fe9de4338c0c\") " pod="openstack/memcached-0" Jan 22 14:02:01 crc kubenswrapper[4743]: I0122 14:02:01.208887 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2644f1c9-b50c-4666-a099-ddb8912a53ff-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"2644f1c9-b50c-4666-a099-ddb8912a53ff\") " pod="openstack/openstack-cell1-galera-0" Jan 22 14:02:01 crc kubenswrapper[4743]: I0122 14:02:01.208972 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgkp9\" (UniqueName: \"kubernetes.io/projected/2644f1c9-b50c-4666-a099-ddb8912a53ff-kube-api-access-mgkp9\") pod \"openstack-cell1-galera-0\" (UID: \"2644f1c9-b50c-4666-a099-ddb8912a53ff\") " pod="openstack/openstack-cell1-galera-0" Jan 22 14:02:01 crc kubenswrapper[4743]: I0122 14:02:01.209007 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/63d64b7b-89b2-468c-86e2-fe9de4338c0c-memcached-tls-certs\") pod \"memcached-0\" (UID: \"63d64b7b-89b2-468c-86e2-fe9de4338c0c\") " pod="openstack/memcached-0" Jan 22 14:02:01 crc kubenswrapper[4743]: I0122 14:02:01.209838 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2644f1c9-b50c-4666-a099-ddb8912a53ff-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"2644f1c9-b50c-4666-a099-ddb8912a53ff\") " pod="openstack/openstack-cell1-galera-0" Jan 22 14:02:01 crc kubenswrapper[4743]: I0122 14:02:01.210065 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2644f1c9-b50c-4666-a099-ddb8912a53ff-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"2644f1c9-b50c-4666-a099-ddb8912a53ff\") " pod="openstack/openstack-cell1-galera-0" Jan 22 14:02:01 crc kubenswrapper[4743]: I0122 14:02:01.211636 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: 
\"2644f1c9-b50c-4666-a099-ddb8912a53ff\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/openstack-cell1-galera-0" Jan 22 14:02:01 crc kubenswrapper[4743]: I0122 14:02:01.226391 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2644f1c9-b50c-4666-a099-ddb8912a53ff-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"2644f1c9-b50c-4666-a099-ddb8912a53ff\") " pod="openstack/openstack-cell1-galera-0" Jan 22 14:02:01 crc kubenswrapper[4743]: I0122 14:02:01.240578 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2644f1c9-b50c-4666-a099-ddb8912a53ff-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"2644f1c9-b50c-4666-a099-ddb8912a53ff\") " pod="openstack/openstack-cell1-galera-0" Jan 22 14:02:01 crc kubenswrapper[4743]: I0122 14:02:01.241173 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2644f1c9-b50c-4666-a099-ddb8912a53ff-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"2644f1c9-b50c-4666-a099-ddb8912a53ff\") " pod="openstack/openstack-cell1-galera-0" Jan 22 14:02:01 crc kubenswrapper[4743]: I0122 14:02:01.243839 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgkp9\" (UniqueName: \"kubernetes.io/projected/2644f1c9-b50c-4666-a099-ddb8912a53ff-kube-api-access-mgkp9\") pod \"openstack-cell1-galera-0\" (UID: \"2644f1c9-b50c-4666-a099-ddb8912a53ff\") " pod="openstack/openstack-cell1-galera-0" Jan 22 14:02:01 crc kubenswrapper[4743]: I0122 14:02:01.260612 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"2644f1c9-b50c-4666-a099-ddb8912a53ff\") " pod="openstack/openstack-cell1-galera-0" Jan 22 14:02:01 crc kubenswrapper[4743]: I0122 14:02:01.312726 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63d64b7b-89b2-468c-86e2-fe9de4338c0c-combined-ca-bundle\") pod \"memcached-0\" (UID: \"63d64b7b-89b2-468c-86e2-fe9de4338c0c\") " pod="openstack/memcached-0" Jan 22 14:02:01 crc kubenswrapper[4743]: I0122 14:02:01.312810 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kb6nt\" (UniqueName: \"kubernetes.io/projected/63d64b7b-89b2-468c-86e2-fe9de4338c0c-kube-api-access-kb6nt\") pod \"memcached-0\" (UID: \"63d64b7b-89b2-468c-86e2-fe9de4338c0c\") " pod="openstack/memcached-0" Jan 22 14:02:01 crc kubenswrapper[4743]: I0122 14:02:01.312855 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/63d64b7b-89b2-468c-86e2-fe9de4338c0c-kolla-config\") pod \"memcached-0\" (UID: \"63d64b7b-89b2-468c-86e2-fe9de4338c0c\") " pod="openstack/memcached-0" Jan 22 14:02:01 crc kubenswrapper[4743]: I0122 14:02:01.312911 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/63d64b7b-89b2-468c-86e2-fe9de4338c0c-memcached-tls-certs\") pod \"memcached-0\" (UID: \"63d64b7b-89b2-468c-86e2-fe9de4338c0c\") " pod="openstack/memcached-0" Jan 22 14:02:01 crc kubenswrapper[4743]: I0122 14:02:01.312997 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/63d64b7b-89b2-468c-86e2-fe9de4338c0c-config-data\") pod \"memcached-0\" (UID: \"63d64b7b-89b2-468c-86e2-fe9de4338c0c\") " pod="openstack/memcached-0" Jan 22 14:02:01 crc kubenswrapper[4743]: I0122 14:02:01.319654 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/63d64b7b-89b2-468c-86e2-fe9de4338c0c-kolla-config\") pod \"memcached-0\" (UID: \"63d64b7b-89b2-468c-86e2-fe9de4338c0c\") " pod="openstack/memcached-0" Jan 22 14:02:01 crc kubenswrapper[4743]: I0122 14:02:01.320684 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/63d64b7b-89b2-468c-86e2-fe9de4338c0c-config-data\") pod \"memcached-0\" (UID: \"63d64b7b-89b2-468c-86e2-fe9de4338c0c\") " pod="openstack/memcached-0" Jan 22 14:02:01 crc kubenswrapper[4743]: I0122 14:02:01.335861 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63d64b7b-89b2-468c-86e2-fe9de4338c0c-combined-ca-bundle\") pod \"memcached-0\" (UID: \"63d64b7b-89b2-468c-86e2-fe9de4338c0c\") " pod="openstack/memcached-0" Jan 22 14:02:01 crc kubenswrapper[4743]: I0122 14:02:01.343691 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/63d64b7b-89b2-468c-86e2-fe9de4338c0c-memcached-tls-certs\") pod \"memcached-0\" (UID: \"63d64b7b-89b2-468c-86e2-fe9de4338c0c\") " pod="openstack/memcached-0" Jan 22 14:02:01 crc kubenswrapper[4743]: I0122 14:02:01.352156 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kb6nt\" (UniqueName: \"kubernetes.io/projected/63d64b7b-89b2-468c-86e2-fe9de4338c0c-kube-api-access-kb6nt\") pod \"memcached-0\" (UID: \"63d64b7b-89b2-468c-86e2-fe9de4338c0c\") " pod="openstack/memcached-0" Jan 22 14:02:01 crc kubenswrapper[4743]: I0122 14:02:01.380760 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Jan 22 14:02:01 crc kubenswrapper[4743]: I0122 14:02:01.481252 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Jan 22 14:02:03 crc kubenswrapper[4743]: I0122 14:02:03.584584 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Jan 22 14:02:03 crc kubenswrapper[4743]: I0122 14:02:03.585886 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 22 14:02:03 crc kubenswrapper[4743]: I0122 14:02:03.587876 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-vtn4p" Jan 22 14:02:03 crc kubenswrapper[4743]: I0122 14:02:03.608134 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 22 14:02:03 crc kubenswrapper[4743]: I0122 14:02:03.670243 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xrkp\" (UniqueName: \"kubernetes.io/projected/4bc6739c-92bc-4cee-b3ae-5e178073cf0f-kube-api-access-2xrkp\") pod \"kube-state-metrics-0\" (UID: \"4bc6739c-92bc-4cee-b3ae-5e178073cf0f\") " pod="openstack/kube-state-metrics-0" Jan 22 14:02:03 crc kubenswrapper[4743]: I0122 14:02:03.778854 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xrkp\" (UniqueName: \"kubernetes.io/projected/4bc6739c-92bc-4cee-b3ae-5e178073cf0f-kube-api-access-2xrkp\") pod \"kube-state-metrics-0\" (UID: \"4bc6739c-92bc-4cee-b3ae-5e178073cf0f\") " pod="openstack/kube-state-metrics-0" Jan 22 14:02:03 crc kubenswrapper[4743]: I0122 14:02:03.807411 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xrkp\" (UniqueName: \"kubernetes.io/projected/4bc6739c-92bc-4cee-b3ae-5e178073cf0f-kube-api-access-2xrkp\") pod \"kube-state-metrics-0\" (UID: \"4bc6739c-92bc-4cee-b3ae-5e178073cf0f\") " pod="openstack/kube-state-metrics-0" Jan 22 14:02:03 crc kubenswrapper[4743]: I0122 14:02:03.912775 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-vtn4p" Jan 22 14:02:03 crc kubenswrapper[4743]: I0122 14:02:03.921660 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 22 14:02:05 crc kubenswrapper[4743]: W0122 14:02:05.525748 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee0fd7f6_d02b_4139_9306_8f9c9a1a8dd1.slice/crio-2aeeb7be0ba405bd5da02d5eb0771f9277e4add6209204443a2651b7d941b383 WatchSource:0}: Error finding container 2aeeb7be0ba405bd5da02d5eb0771f9277e4add6209204443a2651b7d941b383: Status 404 returned error can't find the container with id 2aeeb7be0ba405bd5da02d5eb0771f9277e4add6209204443a2651b7d941b383 Jan 22 14:02:06 crc kubenswrapper[4743]: I0122 14:02:06.219639 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"ee0fd7f6-d02b-4139-9306-8f9c9a1a8dd1","Type":"ContainerStarted","Data":"2aeeb7be0ba405bd5da02d5eb0771f9277e4add6209204443a2651b7d941b383"} Jan 22 14:02:06 crc kubenswrapper[4743]: I0122 14:02:06.689953 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-m22h5"] Jan 22 14:02:06 crc kubenswrapper[4743]: I0122 14:02:06.690864 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-m22h5" Jan 22 14:02:06 crc kubenswrapper[4743]: I0122 14:02:06.693370 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Jan 22 14:02:06 crc kubenswrapper[4743]: I0122 14:02:06.693965 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Jan 22 14:02:06 crc kubenswrapper[4743]: I0122 14:02:06.694115 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-nd4vt" Jan 22 14:02:06 crc kubenswrapper[4743]: I0122 14:02:06.710714 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-m22h5"] Jan 22 14:02:06 crc kubenswrapper[4743]: I0122 14:02:06.791384 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-rmfgh"] Jan 22 14:02:06 crc kubenswrapper[4743]: I0122 14:02:06.812963 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-rmfgh" Jan 22 14:02:06 crc kubenswrapper[4743]: I0122 14:02:06.821884 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-rmfgh"] Jan 22 14:02:06 crc kubenswrapper[4743]: I0122 14:02:06.830782 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3551792-b862-492e-8c36-e0a63cd4468f-combined-ca-bundle\") pod \"ovn-controller-m22h5\" (UID: \"f3551792-b862-492e-8c36-e0a63cd4468f\") " pod="openstack/ovn-controller-m22h5" Jan 22 14:02:06 crc kubenswrapper[4743]: I0122 14:02:06.830879 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f3551792-b862-492e-8c36-e0a63cd4468f-scripts\") pod \"ovn-controller-m22h5\" (UID: \"f3551792-b862-492e-8c36-e0a63cd4468f\") " pod="openstack/ovn-controller-m22h5" Jan 22 14:02:06 crc kubenswrapper[4743]: I0122 14:02:06.831025 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2g4x\" (UniqueName: \"kubernetes.io/projected/f3551792-b862-492e-8c36-e0a63cd4468f-kube-api-access-k2g4x\") pod \"ovn-controller-m22h5\" (UID: \"f3551792-b862-492e-8c36-e0a63cd4468f\") " pod="openstack/ovn-controller-m22h5" Jan 22 14:02:06 crc kubenswrapper[4743]: I0122 14:02:06.831060 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f3551792-b862-492e-8c36-e0a63cd4468f-var-log-ovn\") pod \"ovn-controller-m22h5\" (UID: \"f3551792-b862-492e-8c36-e0a63cd4468f\") " pod="openstack/ovn-controller-m22h5" Jan 22 14:02:06 crc kubenswrapper[4743]: I0122 14:02:06.831091 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3551792-b862-492e-8c36-e0a63cd4468f-ovn-controller-tls-certs\") pod \"ovn-controller-m22h5\" (UID: \"f3551792-b862-492e-8c36-e0a63cd4468f\") " pod="openstack/ovn-controller-m22h5" Jan 22 14:02:06 crc kubenswrapper[4743]: I0122 14:02:06.831119 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f3551792-b862-492e-8c36-e0a63cd4468f-var-run-ovn\") pod \"ovn-controller-m22h5\" (UID: 
\"f3551792-b862-492e-8c36-e0a63cd4468f\") " pod="openstack/ovn-controller-m22h5" Jan 22 14:02:06 crc kubenswrapper[4743]: I0122 14:02:06.831164 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f3551792-b862-492e-8c36-e0a63cd4468f-var-run\") pod \"ovn-controller-m22h5\" (UID: \"f3551792-b862-492e-8c36-e0a63cd4468f\") " pod="openstack/ovn-controller-m22h5" Jan 22 14:02:06 crc kubenswrapper[4743]: I0122 14:02:06.932307 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/60598cb3-9d09-4b83-9b5c-893f5ebf44eb-scripts\") pod \"ovn-controller-ovs-rmfgh\" (UID: \"60598cb3-9d09-4b83-9b5c-893f5ebf44eb\") " pod="openstack/ovn-controller-ovs-rmfgh" Jan 22 14:02:06 crc kubenswrapper[4743]: I0122 14:02:06.932370 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/60598cb3-9d09-4b83-9b5c-893f5ebf44eb-var-run\") pod \"ovn-controller-ovs-rmfgh\" (UID: \"60598cb3-9d09-4b83-9b5c-893f5ebf44eb\") " pod="openstack/ovn-controller-ovs-rmfgh" Jan 22 14:02:06 crc kubenswrapper[4743]: I0122 14:02:06.932402 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f3551792-b862-492e-8c36-e0a63cd4468f-scripts\") pod \"ovn-controller-m22h5\" (UID: \"f3551792-b862-492e-8c36-e0a63cd4468f\") " pod="openstack/ovn-controller-m22h5" Jan 22 14:02:06 crc kubenswrapper[4743]: I0122 14:02:06.932425 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/60598cb3-9d09-4b83-9b5c-893f5ebf44eb-var-lib\") pod \"ovn-controller-ovs-rmfgh\" (UID: \"60598cb3-9d09-4b83-9b5c-893f5ebf44eb\") " pod="openstack/ovn-controller-ovs-rmfgh" Jan 22 14:02:06 crc kubenswrapper[4743]: I0122 14:02:06.932456 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/60598cb3-9d09-4b83-9b5c-893f5ebf44eb-etc-ovs\") pod \"ovn-controller-ovs-rmfgh\" (UID: \"60598cb3-9d09-4b83-9b5c-893f5ebf44eb\") " pod="openstack/ovn-controller-ovs-rmfgh" Jan 22 14:02:06 crc kubenswrapper[4743]: I0122 14:02:06.932479 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2g4x\" (UniqueName: \"kubernetes.io/projected/f3551792-b862-492e-8c36-e0a63cd4468f-kube-api-access-k2g4x\") pod \"ovn-controller-m22h5\" (UID: \"f3551792-b862-492e-8c36-e0a63cd4468f\") " pod="openstack/ovn-controller-m22h5" Jan 22 14:02:06 crc kubenswrapper[4743]: I0122 14:02:06.932501 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f3551792-b862-492e-8c36-e0a63cd4468f-var-log-ovn\") pod \"ovn-controller-m22h5\" (UID: \"f3551792-b862-492e-8c36-e0a63cd4468f\") " pod="openstack/ovn-controller-m22h5" Jan 22 14:02:06 crc kubenswrapper[4743]: I0122 14:02:06.932531 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3551792-b862-492e-8c36-e0a63cd4468f-ovn-controller-tls-certs\") pod \"ovn-controller-m22h5\" (UID: \"f3551792-b862-492e-8c36-e0a63cd4468f\") " pod="openstack/ovn-controller-m22h5" Jan 22 14:02:06 crc kubenswrapper[4743]: I0122 
14:02:06.932561 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f3551792-b862-492e-8c36-e0a63cd4468f-var-run-ovn\") pod \"ovn-controller-m22h5\" (UID: \"f3551792-b862-492e-8c36-e0a63cd4468f\") " pod="openstack/ovn-controller-m22h5" Jan 22 14:02:06 crc kubenswrapper[4743]: I0122 14:02:06.932600 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f3551792-b862-492e-8c36-e0a63cd4468f-var-run\") pod \"ovn-controller-m22h5\" (UID: \"f3551792-b862-492e-8c36-e0a63cd4468f\") " pod="openstack/ovn-controller-m22h5" Jan 22 14:02:06 crc kubenswrapper[4743]: I0122 14:02:06.932632 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vs8c4\" (UniqueName: \"kubernetes.io/projected/60598cb3-9d09-4b83-9b5c-893f5ebf44eb-kube-api-access-vs8c4\") pod \"ovn-controller-ovs-rmfgh\" (UID: \"60598cb3-9d09-4b83-9b5c-893f5ebf44eb\") " pod="openstack/ovn-controller-ovs-rmfgh" Jan 22 14:02:06 crc kubenswrapper[4743]: I0122 14:02:06.932678 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/60598cb3-9d09-4b83-9b5c-893f5ebf44eb-var-log\") pod \"ovn-controller-ovs-rmfgh\" (UID: \"60598cb3-9d09-4b83-9b5c-893f5ebf44eb\") " pod="openstack/ovn-controller-ovs-rmfgh" Jan 22 14:02:06 crc kubenswrapper[4743]: I0122 14:02:06.932706 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3551792-b862-492e-8c36-e0a63cd4468f-combined-ca-bundle\") pod \"ovn-controller-m22h5\" (UID: \"f3551792-b862-492e-8c36-e0a63cd4468f\") " pod="openstack/ovn-controller-m22h5" Jan 22 14:02:06 crc kubenswrapper[4743]: I0122 14:02:06.934225 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f3551792-b862-492e-8c36-e0a63cd4468f-var-run\") pod \"ovn-controller-m22h5\" (UID: \"f3551792-b862-492e-8c36-e0a63cd4468f\") " pod="openstack/ovn-controller-m22h5" Jan 22 14:02:06 crc kubenswrapper[4743]: I0122 14:02:06.934319 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f3551792-b862-492e-8c36-e0a63cd4468f-var-run-ovn\") pod \"ovn-controller-m22h5\" (UID: \"f3551792-b862-492e-8c36-e0a63cd4468f\") " pod="openstack/ovn-controller-m22h5" Jan 22 14:02:06 crc kubenswrapper[4743]: I0122 14:02:06.934385 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f3551792-b862-492e-8c36-e0a63cd4468f-var-log-ovn\") pod \"ovn-controller-m22h5\" (UID: \"f3551792-b862-492e-8c36-e0a63cd4468f\") " pod="openstack/ovn-controller-m22h5" Jan 22 14:02:06 crc kubenswrapper[4743]: I0122 14:02:06.936708 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f3551792-b862-492e-8c36-e0a63cd4468f-scripts\") pod \"ovn-controller-m22h5\" (UID: \"f3551792-b862-492e-8c36-e0a63cd4468f\") " pod="openstack/ovn-controller-m22h5" Jan 22 14:02:06 crc kubenswrapper[4743]: I0122 14:02:06.945612 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3551792-b862-492e-8c36-e0a63cd4468f-ovn-controller-tls-certs\") pod 
\"ovn-controller-m22h5\" (UID: \"f3551792-b862-492e-8c36-e0a63cd4468f\") " pod="openstack/ovn-controller-m22h5" Jan 22 14:02:06 crc kubenswrapper[4743]: I0122 14:02:06.948449 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3551792-b862-492e-8c36-e0a63cd4468f-combined-ca-bundle\") pod \"ovn-controller-m22h5\" (UID: \"f3551792-b862-492e-8c36-e0a63cd4468f\") " pod="openstack/ovn-controller-m22h5" Jan 22 14:02:06 crc kubenswrapper[4743]: I0122 14:02:06.956960 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2g4x\" (UniqueName: \"kubernetes.io/projected/f3551792-b862-492e-8c36-e0a63cd4468f-kube-api-access-k2g4x\") pod \"ovn-controller-m22h5\" (UID: \"f3551792-b862-492e-8c36-e0a63cd4468f\") " pod="openstack/ovn-controller-m22h5" Jan 22 14:02:07 crc kubenswrapper[4743]: I0122 14:02:07.012525 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-m22h5" Jan 22 14:02:07 crc kubenswrapper[4743]: I0122 14:02:07.034496 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/60598cb3-9d09-4b83-9b5c-893f5ebf44eb-var-log\") pod \"ovn-controller-ovs-rmfgh\" (UID: \"60598cb3-9d09-4b83-9b5c-893f5ebf44eb\") " pod="openstack/ovn-controller-ovs-rmfgh" Jan 22 14:02:07 crc kubenswrapper[4743]: I0122 14:02:07.034555 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/60598cb3-9d09-4b83-9b5c-893f5ebf44eb-scripts\") pod \"ovn-controller-ovs-rmfgh\" (UID: \"60598cb3-9d09-4b83-9b5c-893f5ebf44eb\") " pod="openstack/ovn-controller-ovs-rmfgh" Jan 22 14:02:07 crc kubenswrapper[4743]: I0122 14:02:07.034582 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/60598cb3-9d09-4b83-9b5c-893f5ebf44eb-var-run\") pod \"ovn-controller-ovs-rmfgh\" (UID: \"60598cb3-9d09-4b83-9b5c-893f5ebf44eb\") " pod="openstack/ovn-controller-ovs-rmfgh" Jan 22 14:02:07 crc kubenswrapper[4743]: I0122 14:02:07.034605 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/60598cb3-9d09-4b83-9b5c-893f5ebf44eb-var-lib\") pod \"ovn-controller-ovs-rmfgh\" (UID: \"60598cb3-9d09-4b83-9b5c-893f5ebf44eb\") " pod="openstack/ovn-controller-ovs-rmfgh" Jan 22 14:02:07 crc kubenswrapper[4743]: I0122 14:02:07.034836 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/60598cb3-9d09-4b83-9b5c-893f5ebf44eb-etc-ovs\") pod \"ovn-controller-ovs-rmfgh\" (UID: \"60598cb3-9d09-4b83-9b5c-893f5ebf44eb\") " pod="openstack/ovn-controller-ovs-rmfgh" Jan 22 14:02:07 crc kubenswrapper[4743]: I0122 14:02:07.034895 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vs8c4\" (UniqueName: \"kubernetes.io/projected/60598cb3-9d09-4b83-9b5c-893f5ebf44eb-kube-api-access-vs8c4\") pod \"ovn-controller-ovs-rmfgh\" (UID: \"60598cb3-9d09-4b83-9b5c-893f5ebf44eb\") " pod="openstack/ovn-controller-ovs-rmfgh" Jan 22 14:02:07 crc kubenswrapper[4743]: I0122 14:02:07.034983 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/60598cb3-9d09-4b83-9b5c-893f5ebf44eb-var-log\") pod \"ovn-controller-ovs-rmfgh\" (UID: 
\"60598cb3-9d09-4b83-9b5c-893f5ebf44eb\") " pod="openstack/ovn-controller-ovs-rmfgh" Jan 22 14:02:07 crc kubenswrapper[4743]: I0122 14:02:07.035061 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/60598cb3-9d09-4b83-9b5c-893f5ebf44eb-var-run\") pod \"ovn-controller-ovs-rmfgh\" (UID: \"60598cb3-9d09-4b83-9b5c-893f5ebf44eb\") " pod="openstack/ovn-controller-ovs-rmfgh" Jan 22 14:02:07 crc kubenswrapper[4743]: I0122 14:02:07.035346 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/60598cb3-9d09-4b83-9b5c-893f5ebf44eb-var-lib\") pod \"ovn-controller-ovs-rmfgh\" (UID: \"60598cb3-9d09-4b83-9b5c-893f5ebf44eb\") " pod="openstack/ovn-controller-ovs-rmfgh" Jan 22 14:02:07 crc kubenswrapper[4743]: I0122 14:02:07.035936 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/60598cb3-9d09-4b83-9b5c-893f5ebf44eb-etc-ovs\") pod \"ovn-controller-ovs-rmfgh\" (UID: \"60598cb3-9d09-4b83-9b5c-893f5ebf44eb\") " pod="openstack/ovn-controller-ovs-rmfgh" Jan 22 14:02:07 crc kubenswrapper[4743]: I0122 14:02:07.038395 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/60598cb3-9d09-4b83-9b5c-893f5ebf44eb-scripts\") pod \"ovn-controller-ovs-rmfgh\" (UID: \"60598cb3-9d09-4b83-9b5c-893f5ebf44eb\") " pod="openstack/ovn-controller-ovs-rmfgh" Jan 22 14:02:07 crc kubenswrapper[4743]: I0122 14:02:07.058627 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vs8c4\" (UniqueName: \"kubernetes.io/projected/60598cb3-9d09-4b83-9b5c-893f5ebf44eb-kube-api-access-vs8c4\") pod \"ovn-controller-ovs-rmfgh\" (UID: \"60598cb3-9d09-4b83-9b5c-893f5ebf44eb\") " pod="openstack/ovn-controller-ovs-rmfgh" Jan 22 14:02:07 crc kubenswrapper[4743]: I0122 14:02:07.131259 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-rmfgh" Jan 22 14:02:07 crc kubenswrapper[4743]: I0122 14:02:07.526168 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 22 14:02:07 crc kubenswrapper[4743]: I0122 14:02:07.528668 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Jan 22 14:02:07 crc kubenswrapper[4743]: I0122 14:02:07.532483 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Jan 22 14:02:07 crc kubenswrapper[4743]: I0122 14:02:07.532842 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Jan 22 14:02:07 crc kubenswrapper[4743]: I0122 14:02:07.533001 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Jan 22 14:02:07 crc kubenswrapper[4743]: I0122 14:02:07.533163 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-5bgtx" Jan 22 14:02:07 crc kubenswrapper[4743]: I0122 14:02:07.533565 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Jan 22 14:02:07 crc kubenswrapper[4743]: I0122 14:02:07.555608 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 22 14:02:07 crc kubenswrapper[4743]: I0122 14:02:07.644270 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3b07a577-785f-4720-919c-ef619448284a-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"3b07a577-785f-4720-919c-ef619448284a\") " pod="openstack/ovsdbserver-nb-0" Jan 22 14:02:07 crc kubenswrapper[4743]: I0122 14:02:07.644317 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b07a577-785f-4720-919c-ef619448284a-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3b07a577-785f-4720-919c-ef619448284a\") " pod="openstack/ovsdbserver-nb-0" Jan 22 14:02:07 crc kubenswrapper[4743]: I0122 14:02:07.644371 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b07a577-785f-4720-919c-ef619448284a-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"3b07a577-785f-4720-919c-ef619448284a\") " pod="openstack/ovsdbserver-nb-0" Jan 22 14:02:07 crc kubenswrapper[4743]: I0122 14:02:07.644399 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3b07a577-785f-4720-919c-ef619448284a-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"3b07a577-785f-4720-919c-ef619448284a\") " pod="openstack/ovsdbserver-nb-0" Jan 22 14:02:07 crc kubenswrapper[4743]: I0122 14:02:07.644444 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b07a577-785f-4720-919c-ef619448284a-config\") pod \"ovsdbserver-nb-0\" (UID: \"3b07a577-785f-4720-919c-ef619448284a\") " pod="openstack/ovsdbserver-nb-0" Jan 22 14:02:07 crc kubenswrapper[4743]: I0122 14:02:07.644558 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"3b07a577-785f-4720-919c-ef619448284a\") " pod="openstack/ovsdbserver-nb-0" Jan 22 14:02:07 crc kubenswrapper[4743]: I0122 14:02:07.644583 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/3b07a577-785f-4720-919c-ef619448284a-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3b07a577-785f-4720-919c-ef619448284a\") " pod="openstack/ovsdbserver-nb-0" Jan 22 14:02:07 crc kubenswrapper[4743]: I0122 14:02:07.644612 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wg9hp\" (UniqueName: \"kubernetes.io/projected/3b07a577-785f-4720-919c-ef619448284a-kube-api-access-wg9hp\") pod \"ovsdbserver-nb-0\" (UID: \"3b07a577-785f-4720-919c-ef619448284a\") " pod="openstack/ovsdbserver-nb-0" Jan 22 14:02:07 crc kubenswrapper[4743]: I0122 14:02:07.746629 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b07a577-785f-4720-919c-ef619448284a-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"3b07a577-785f-4720-919c-ef619448284a\") " pod="openstack/ovsdbserver-nb-0" Jan 22 14:02:07 crc kubenswrapper[4743]: I0122 14:02:07.746689 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3b07a577-785f-4720-919c-ef619448284a-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"3b07a577-785f-4720-919c-ef619448284a\") " pod="openstack/ovsdbserver-nb-0" Jan 22 14:02:07 crc kubenswrapper[4743]: I0122 14:02:07.746720 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b07a577-785f-4720-919c-ef619448284a-config\") pod \"ovsdbserver-nb-0\" (UID: \"3b07a577-785f-4720-919c-ef619448284a\") " pod="openstack/ovsdbserver-nb-0" Jan 22 14:02:07 crc kubenswrapper[4743]: I0122 14:02:07.746749 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"3b07a577-785f-4720-919c-ef619448284a\") " pod="openstack/ovsdbserver-nb-0" Jan 22 14:02:07 crc kubenswrapper[4743]: I0122 14:02:07.746768 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b07a577-785f-4720-919c-ef619448284a-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3b07a577-785f-4720-919c-ef619448284a\") " pod="openstack/ovsdbserver-nb-0" Jan 22 14:02:07 crc kubenswrapper[4743]: I0122 14:02:07.746852 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wg9hp\" (UniqueName: \"kubernetes.io/projected/3b07a577-785f-4720-919c-ef619448284a-kube-api-access-wg9hp\") pod \"ovsdbserver-nb-0\" (UID: \"3b07a577-785f-4720-919c-ef619448284a\") " pod="openstack/ovsdbserver-nb-0" Jan 22 14:02:07 crc kubenswrapper[4743]: I0122 14:02:07.746941 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3b07a577-785f-4720-919c-ef619448284a-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"3b07a577-785f-4720-919c-ef619448284a\") " pod="openstack/ovsdbserver-nb-0" Jan 22 14:02:07 crc kubenswrapper[4743]: I0122 14:02:07.746964 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b07a577-785f-4720-919c-ef619448284a-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3b07a577-785f-4720-919c-ef619448284a\") " pod="openstack/ovsdbserver-nb-0" Jan 22 14:02:07 crc kubenswrapper[4743]: I0122 
14:02:07.747207 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"3b07a577-785f-4720-919c-ef619448284a\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/ovsdbserver-nb-0" Jan 22 14:02:07 crc kubenswrapper[4743]: I0122 14:02:07.747543 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3b07a577-785f-4720-919c-ef619448284a-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"3b07a577-785f-4720-919c-ef619448284a\") " pod="openstack/ovsdbserver-nb-0" Jan 22 14:02:07 crc kubenswrapper[4743]: I0122 14:02:07.747556 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b07a577-785f-4720-919c-ef619448284a-config\") pod \"ovsdbserver-nb-0\" (UID: \"3b07a577-785f-4720-919c-ef619448284a\") " pod="openstack/ovsdbserver-nb-0" Jan 22 14:02:07 crc kubenswrapper[4743]: I0122 14:02:07.748519 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3b07a577-785f-4720-919c-ef619448284a-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"3b07a577-785f-4720-919c-ef619448284a\") " pod="openstack/ovsdbserver-nb-0" Jan 22 14:02:07 crc kubenswrapper[4743]: I0122 14:02:07.754040 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b07a577-785f-4720-919c-ef619448284a-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3b07a577-785f-4720-919c-ef619448284a\") " pod="openstack/ovsdbserver-nb-0" Jan 22 14:02:07 crc kubenswrapper[4743]: I0122 14:02:07.754665 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b07a577-785f-4720-919c-ef619448284a-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3b07a577-785f-4720-919c-ef619448284a\") " pod="openstack/ovsdbserver-nb-0" Jan 22 14:02:07 crc kubenswrapper[4743]: I0122 14:02:07.760995 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b07a577-785f-4720-919c-ef619448284a-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"3b07a577-785f-4720-919c-ef619448284a\") " pod="openstack/ovsdbserver-nb-0" Jan 22 14:02:07 crc kubenswrapper[4743]: I0122 14:02:07.767338 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wg9hp\" (UniqueName: \"kubernetes.io/projected/3b07a577-785f-4720-919c-ef619448284a-kube-api-access-wg9hp\") pod \"ovsdbserver-nb-0\" (UID: \"3b07a577-785f-4720-919c-ef619448284a\") " pod="openstack/ovsdbserver-nb-0" Jan 22 14:02:07 crc kubenswrapper[4743]: I0122 14:02:07.773136 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"3b07a577-785f-4720-919c-ef619448284a\") " pod="openstack/ovsdbserver-nb-0" Jan 22 14:02:07 crc kubenswrapper[4743]: I0122 14:02:07.858903 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Jan 22 14:02:09 crc kubenswrapper[4743]: I0122 14:02:09.836881 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Jan 22 14:02:10 crc kubenswrapper[4743]: I0122 14:02:10.949969 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 22 14:02:10 crc kubenswrapper[4743]: I0122 14:02:10.952199 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Jan 22 14:02:10 crc kubenswrapper[4743]: I0122 14:02:10.955147 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Jan 22 14:02:10 crc kubenswrapper[4743]: I0122 14:02:10.955482 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Jan 22 14:02:10 crc kubenswrapper[4743]: I0122 14:02:10.956026 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Jan 22 14:02:10 crc kubenswrapper[4743]: I0122 14:02:10.956105 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-fcmjf" Jan 22 14:02:10 crc kubenswrapper[4743]: I0122 14:02:10.960567 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 22 14:02:11 crc kubenswrapper[4743]: I0122 14:02:11.007747 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/98d7b7d3-f576-4b98-912f-6e7aab2d295a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"98d7b7d3-f576-4b98-912f-6e7aab2d295a\") " pod="openstack/ovsdbserver-sb-0" Jan 22 14:02:11 crc kubenswrapper[4743]: I0122 14:02:11.007873 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"98d7b7d3-f576-4b98-912f-6e7aab2d295a\") " pod="openstack/ovsdbserver-sb-0" Jan 22 14:02:11 crc kubenswrapper[4743]: I0122 14:02:11.007907 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98d7b7d3-f576-4b98-912f-6e7aab2d295a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"98d7b7d3-f576-4b98-912f-6e7aab2d295a\") " pod="openstack/ovsdbserver-sb-0" Jan 22 14:02:11 crc kubenswrapper[4743]: I0122 14:02:11.007943 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/98d7b7d3-f576-4b98-912f-6e7aab2d295a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"98d7b7d3-f576-4b98-912f-6e7aab2d295a\") " pod="openstack/ovsdbserver-sb-0" Jan 22 14:02:11 crc kubenswrapper[4743]: I0122 14:02:11.008133 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/98d7b7d3-f576-4b98-912f-6e7aab2d295a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"98d7b7d3-f576-4b98-912f-6e7aab2d295a\") " pod="openstack/ovsdbserver-sb-0" Jan 22 14:02:11 crc kubenswrapper[4743]: I0122 14:02:11.008265 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98d7b7d3-f576-4b98-912f-6e7aab2d295a-config\") pod 
\"ovsdbserver-sb-0\" (UID: \"98d7b7d3-f576-4b98-912f-6e7aab2d295a\") " pod="openstack/ovsdbserver-sb-0" Jan 22 14:02:11 crc kubenswrapper[4743]: I0122 14:02:11.008321 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngrqh\" (UniqueName: \"kubernetes.io/projected/98d7b7d3-f576-4b98-912f-6e7aab2d295a-kube-api-access-ngrqh\") pod \"ovsdbserver-sb-0\" (UID: \"98d7b7d3-f576-4b98-912f-6e7aab2d295a\") " pod="openstack/ovsdbserver-sb-0" Jan 22 14:02:11 crc kubenswrapper[4743]: I0122 14:02:11.008357 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/98d7b7d3-f576-4b98-912f-6e7aab2d295a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"98d7b7d3-f576-4b98-912f-6e7aab2d295a\") " pod="openstack/ovsdbserver-sb-0" Jan 22 14:02:11 crc kubenswrapper[4743]: I0122 14:02:11.110143 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98d7b7d3-f576-4b98-912f-6e7aab2d295a-config\") pod \"ovsdbserver-sb-0\" (UID: \"98d7b7d3-f576-4b98-912f-6e7aab2d295a\") " pod="openstack/ovsdbserver-sb-0" Jan 22 14:02:11 crc kubenswrapper[4743]: I0122 14:02:11.110230 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngrqh\" (UniqueName: \"kubernetes.io/projected/98d7b7d3-f576-4b98-912f-6e7aab2d295a-kube-api-access-ngrqh\") pod \"ovsdbserver-sb-0\" (UID: \"98d7b7d3-f576-4b98-912f-6e7aab2d295a\") " pod="openstack/ovsdbserver-sb-0" Jan 22 14:02:11 crc kubenswrapper[4743]: I0122 14:02:11.110266 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/98d7b7d3-f576-4b98-912f-6e7aab2d295a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"98d7b7d3-f576-4b98-912f-6e7aab2d295a\") " pod="openstack/ovsdbserver-sb-0" Jan 22 14:02:11 crc kubenswrapper[4743]: I0122 14:02:11.110338 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/98d7b7d3-f576-4b98-912f-6e7aab2d295a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"98d7b7d3-f576-4b98-912f-6e7aab2d295a\") " pod="openstack/ovsdbserver-sb-0" Jan 22 14:02:11 crc kubenswrapper[4743]: I0122 14:02:11.110368 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"98d7b7d3-f576-4b98-912f-6e7aab2d295a\") " pod="openstack/ovsdbserver-sb-0" Jan 22 14:02:11 crc kubenswrapper[4743]: I0122 14:02:11.110405 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98d7b7d3-f576-4b98-912f-6e7aab2d295a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"98d7b7d3-f576-4b98-912f-6e7aab2d295a\") " pod="openstack/ovsdbserver-sb-0" Jan 22 14:02:11 crc kubenswrapper[4743]: I0122 14:02:11.110451 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/98d7b7d3-f576-4b98-912f-6e7aab2d295a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"98d7b7d3-f576-4b98-912f-6e7aab2d295a\") " pod="openstack/ovsdbserver-sb-0" Jan 22 14:02:11 crc kubenswrapper[4743]: I0122 14:02:11.110525 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/98d7b7d3-f576-4b98-912f-6e7aab2d295a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"98d7b7d3-f576-4b98-912f-6e7aab2d295a\") " pod="openstack/ovsdbserver-sb-0" Jan 22 14:02:11 crc kubenswrapper[4743]: I0122 14:02:11.110703 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"98d7b7d3-f576-4b98-912f-6e7aab2d295a\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/ovsdbserver-sb-0" Jan 22 14:02:11 crc kubenswrapper[4743]: I0122 14:02:11.111395 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98d7b7d3-f576-4b98-912f-6e7aab2d295a-config\") pod \"ovsdbserver-sb-0\" (UID: \"98d7b7d3-f576-4b98-912f-6e7aab2d295a\") " pod="openstack/ovsdbserver-sb-0" Jan 22 14:02:11 crc kubenswrapper[4743]: I0122 14:02:11.111822 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/98d7b7d3-f576-4b98-912f-6e7aab2d295a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"98d7b7d3-f576-4b98-912f-6e7aab2d295a\") " pod="openstack/ovsdbserver-sb-0" Jan 22 14:02:11 crc kubenswrapper[4743]: I0122 14:02:11.112301 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/98d7b7d3-f576-4b98-912f-6e7aab2d295a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"98d7b7d3-f576-4b98-912f-6e7aab2d295a\") " pod="openstack/ovsdbserver-sb-0" Jan 22 14:02:11 crc kubenswrapper[4743]: I0122 14:02:11.116028 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98d7b7d3-f576-4b98-912f-6e7aab2d295a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"98d7b7d3-f576-4b98-912f-6e7aab2d295a\") " pod="openstack/ovsdbserver-sb-0" Jan 22 14:02:11 crc kubenswrapper[4743]: I0122 14:02:11.117077 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/98d7b7d3-f576-4b98-912f-6e7aab2d295a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"98d7b7d3-f576-4b98-912f-6e7aab2d295a\") " pod="openstack/ovsdbserver-sb-0" Jan 22 14:02:11 crc kubenswrapper[4743]: I0122 14:02:11.117185 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/98d7b7d3-f576-4b98-912f-6e7aab2d295a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"98d7b7d3-f576-4b98-912f-6e7aab2d295a\") " pod="openstack/ovsdbserver-sb-0" Jan 22 14:02:11 crc kubenswrapper[4743]: I0122 14:02:11.128839 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngrqh\" (UniqueName: \"kubernetes.io/projected/98d7b7d3-f576-4b98-912f-6e7aab2d295a-kube-api-access-ngrqh\") pod \"ovsdbserver-sb-0\" (UID: \"98d7b7d3-f576-4b98-912f-6e7aab2d295a\") " pod="openstack/ovsdbserver-sb-0" Jan 22 14:02:11 crc kubenswrapper[4743]: I0122 14:02:11.134374 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"98d7b7d3-f576-4b98-912f-6e7aab2d295a\") " pod="openstack/ovsdbserver-sb-0" Jan 22 14:02:11 crc kubenswrapper[4743]: I0122 
14:02:11.274881 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Jan 22 14:02:27 crc kubenswrapper[4743]: I0122 14:02:27.585218 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 22 14:02:29 crc kubenswrapper[4743]: E0122 14:02:29.161379 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Jan 22 14:02:29 crc kubenswrapper[4743]: E0122 14:02:29.161838 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash /var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gtlgp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-0_openstack(ee0fd7f6-d02b-4139-9306-8f9c9a1a8dd1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 22 14:02:29 crc kubenswrapper[4743]: E0122 14:02:29.163067 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-galera-0" podUID="ee0fd7f6-d02b-4139-9306-8f9c9a1a8dd1" Jan 22 14:02:29 crc kubenswrapper[4743]: I0122 14:02:29.399483 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" 
event={"ID":"63d64b7b-89b2-468c-86e2-fe9de4338c0c","Type":"ContainerStarted","Data":"46be03865551f1360f9300e64131b487840f28d00b7a7920d7cd6e184c3df7a9"} Jan 22 14:02:29 crc kubenswrapper[4743]: E0122 14:02:29.401774 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-galera-0" podUID="ee0fd7f6-d02b-4139-9306-8f9c9a1a8dd1" Jan 22 14:02:29 crc kubenswrapper[4743]: I0122 14:02:29.537760 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 22 14:02:30 crc kubenswrapper[4743]: W0122 14:02:30.076733 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4bc6739c_92bc_4cee_b3ae_5e178073cf0f.slice/crio-487f496ebf7e65f179617d35c1c2b6dc13e1ce12dc23046a7640e584cb297499 WatchSource:0}: Error finding container 487f496ebf7e65f179617d35c1c2b6dc13e1ce12dc23046a7640e584cb297499: Status 404 returned error can't find the container with id 487f496ebf7e65f179617d35c1c2b6dc13e1ce12dc23046a7640e584cb297499 Jan 22 14:02:30 crc kubenswrapper[4743]: W0122 14:02:30.088346 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2644f1c9_b50c_4666_a099_ddb8912a53ff.slice/crio-bdb508d2cc07493a25cf7a4aeba39a0959f5dc6ddb2520966ea9dde7532c9fbc WatchSource:0}: Error finding container bdb508d2cc07493a25cf7a4aeba39a0959f5dc6ddb2520966ea9dde7532c9fbc: Status 404 returned error can't find the container with id bdb508d2cc07493a25cf7a4aeba39a0959f5dc6ddb2520966ea9dde7532c9fbc Jan 22 14:02:30 crc kubenswrapper[4743]: E0122 14:02:30.258155 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Jan 22 14:02:30 crc kubenswrapper[4743]: E0122 14:02:30.258961 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-86qxs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-9z96h_openstack(a598ff96-d072-4440-9fe9-ee99366ccc81): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 22 14:02:30 crc kubenswrapper[4743]: E0122 14:02:30.261781 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-9z96h" podUID="a598ff96-d072-4440-9fe9-ee99366ccc81" Jan 22 14:02:30 crc kubenswrapper[4743]: E0122 14:02:30.316807 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Jan 22 14:02:30 crc kubenswrapper[4743]: E0122 14:02:30.316996 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pwgmc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-2gcnr_openstack(d930d0b8-d30d-40ce-8dfa-677b40600dca): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 22 14:02:30 crc kubenswrapper[4743]: E0122 14:02:30.319073 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-2gcnr" podUID="d930d0b8-d30d-40ce-8dfa-677b40600dca" Jan 22 14:02:30 crc kubenswrapper[4743]: I0122 14:02:30.400492 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-m22h5"] Jan 22 14:02:30 crc kubenswrapper[4743]: I0122 14:02:30.409937 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4bc6739c-92bc-4cee-b3ae-5e178073cf0f","Type":"ContainerStarted","Data":"487f496ebf7e65f179617d35c1c2b6dc13e1ce12dc23046a7640e584cb297499"} Jan 22 14:02:30 crc kubenswrapper[4743]: I0122 14:02:30.413639 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"2644f1c9-b50c-4666-a099-ddb8912a53ff","Type":"ContainerStarted","Data":"bdb508d2cc07493a25cf7a4aeba39a0959f5dc6ddb2520966ea9dde7532c9fbc"} Jan 22 14:02:30 crc kubenswrapper[4743]: E0122 14:02:30.424600 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-9z96h" podUID="a598ff96-d072-4440-9fe9-ee99366ccc81" Jan 22 14:02:30 crc kubenswrapper[4743]: E0122 14:02:30.521045 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = 
Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Jan 22 14:02:30 crc kubenswrapper[4743]: E0122 14:02:30.521240 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-j8tz2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-mxxkp_openstack(48994a23-1df2-4f5a-bcbe-3c72f174bdb4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 22 14:02:30 crc kubenswrapper[4743]: E0122 14:02:30.522475 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-mxxkp" podUID="48994a23-1df2-4f5a-bcbe-3c72f174bdb4" Jan 22 14:02:30 crc kubenswrapper[4743]: I0122 14:02:30.563168 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 22 14:02:30 crc kubenswrapper[4743]: W0122 14:02:30.566120 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b07a577_785f_4720_919c_ef619448284a.slice/crio-405afb49034a4bd266d856559b1a4aada223d0c7b8dbb34947338d8ce7d82307 WatchSource:0}: Error finding container 405afb49034a4bd266d856559b1a4aada223d0c7b8dbb34947338d8ce7d82307: Status 404 returned error can't find the container with id 405afb49034a4bd266d856559b1a4aada223d0c7b8dbb34947338d8ce7d82307 Jan 22 14:02:30 crc kubenswrapper[4743]: I0122 14:02:30.692244 4743 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-2gcnr" Jan 22 14:02:30 crc kubenswrapper[4743]: I0122 14:02:30.776150 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 22 14:02:30 crc kubenswrapper[4743]: I0122 14:02:30.788059 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwgmc\" (UniqueName: \"kubernetes.io/projected/d930d0b8-d30d-40ce-8dfa-677b40600dca-kube-api-access-pwgmc\") pod \"d930d0b8-d30d-40ce-8dfa-677b40600dca\" (UID: \"d930d0b8-d30d-40ce-8dfa-677b40600dca\") " Jan 22 14:02:30 crc kubenswrapper[4743]: I0122 14:02:30.788212 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d930d0b8-d30d-40ce-8dfa-677b40600dca-dns-svc\") pod \"d930d0b8-d30d-40ce-8dfa-677b40600dca\" (UID: \"d930d0b8-d30d-40ce-8dfa-677b40600dca\") " Jan 22 14:02:30 crc kubenswrapper[4743]: I0122 14:02:30.788262 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d930d0b8-d30d-40ce-8dfa-677b40600dca-config\") pod \"d930d0b8-d30d-40ce-8dfa-677b40600dca\" (UID: \"d930d0b8-d30d-40ce-8dfa-677b40600dca\") " Jan 22 14:02:30 crc kubenswrapper[4743]: I0122 14:02:30.788954 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d930d0b8-d30d-40ce-8dfa-677b40600dca-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d930d0b8-d30d-40ce-8dfa-677b40600dca" (UID: "d930d0b8-d30d-40ce-8dfa-677b40600dca"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:02:30 crc kubenswrapper[4743]: I0122 14:02:30.789042 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d930d0b8-d30d-40ce-8dfa-677b40600dca-config" (OuterVolumeSpecName: "config") pod "d930d0b8-d30d-40ce-8dfa-677b40600dca" (UID: "d930d0b8-d30d-40ce-8dfa-677b40600dca"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:02:30 crc kubenswrapper[4743]: I0122 14:02:30.871900 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d930d0b8-d30d-40ce-8dfa-677b40600dca-kube-api-access-pwgmc" (OuterVolumeSpecName: "kube-api-access-pwgmc") pod "d930d0b8-d30d-40ce-8dfa-677b40600dca" (UID: "d930d0b8-d30d-40ce-8dfa-677b40600dca"). InnerVolumeSpecName "kube-api-access-pwgmc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:02:30 crc kubenswrapper[4743]: W0122 14:02:30.882719 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98d7b7d3_f576_4b98_912f_6e7aab2d295a.slice/crio-565349753a2cffde20fa22155066fef50766716e4fd8731267fe9df0400a733b WatchSource:0}: Error finding container 565349753a2cffde20fa22155066fef50766716e4fd8731267fe9df0400a733b: Status 404 returned error can't find the container with id 565349753a2cffde20fa22155066fef50766716e4fd8731267fe9df0400a733b Jan 22 14:02:30 crc kubenswrapper[4743]: I0122 14:02:30.891542 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d930d0b8-d30d-40ce-8dfa-677b40600dca-config\") on node \"crc\" DevicePath \"\"" Jan 22 14:02:30 crc kubenswrapper[4743]: I0122 14:02:30.891578 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwgmc\" (UniqueName: \"kubernetes.io/projected/d930d0b8-d30d-40ce-8dfa-677b40600dca-kube-api-access-pwgmc\") on node \"crc\" DevicePath \"\"" Jan 22 14:02:30 crc kubenswrapper[4743]: I0122 14:02:30.891592 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d930d0b8-d30d-40ce-8dfa-677b40600dca-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 22 14:02:31 crc kubenswrapper[4743]: E0122 14:02:31.176198 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Jan 22 14:02:31 crc kubenswrapper[4743]: E0122 14:02:31.176614 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cjtj2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-2n75z_openstack(6d653249-72a8-413a-b835-091258593f30): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 22 14:02:31 crc kubenswrapper[4743]: E0122 14:02:31.178463 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-2n75z" podUID="6d653249-72a8-413a-b835-091258593f30" Jan 22 14:02:31 crc kubenswrapper[4743]: I0122 14:02:31.424052 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"98d7b7d3-f576-4b98-912f-6e7aab2d295a","Type":"ContainerStarted","Data":"565349753a2cffde20fa22155066fef50766716e4fd8731267fe9df0400a733b"} Jan 22 14:02:31 crc kubenswrapper[4743]: I0122 14:02:31.425895 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"3b07a577-785f-4720-919c-ef619448284a","Type":"ContainerStarted","Data":"405afb49034a4bd266d856559b1a4aada223d0c7b8dbb34947338d8ce7d82307"} Jan 22 14:02:31 crc kubenswrapper[4743]: I0122 14:02:31.427648 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-2gcnr" event={"ID":"d930d0b8-d30d-40ce-8dfa-677b40600dca","Type":"ContainerDied","Data":"1294af9ac1f7d5cf726a359acab38027f904fd5b384760effa7e91b4dc867a73"} Jan 22 14:02:31 crc kubenswrapper[4743]: I0122 14:02:31.427687 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-2gcnr" Jan 22 14:02:31 crc kubenswrapper[4743]: I0122 14:02:31.428647 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-m22h5" event={"ID":"f3551792-b862-492e-8c36-e0a63cd4468f","Type":"ContainerStarted","Data":"6b7e48e9ff2a5fa783dfb822d2e805b1436262355c9d8fe3c56634ef5a1a207a"} Jan 22 14:02:31 crc kubenswrapper[4743]: E0122 14:02:31.439272 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-2n75z" podUID="6d653249-72a8-413a-b835-091258593f30" Jan 22 14:02:31 crc kubenswrapper[4743]: I0122 14:02:31.591350 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-rmfgh"] Jan 22 14:02:31 crc kubenswrapper[4743]: I0122 14:02:31.672242 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-2gcnr"] Jan 22 14:02:31 crc kubenswrapper[4743]: I0122 14:02:31.679266 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-2gcnr"] Jan 22 14:02:31 crc kubenswrapper[4743]: I0122 14:02:31.757065 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d930d0b8-d30d-40ce-8dfa-677b40600dca" path="/var/lib/kubelet/pods/d930d0b8-d30d-40ce-8dfa-677b40600dca/volumes" Jan 22 14:02:32 crc kubenswrapper[4743]: I0122 14:02:32.451972 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a474b98d-9569-40f4-a3d2-f4017988678b","Type":"ContainerStarted","Data":"25dd80b2b3b0b22c35f8b49855a475925da682d5848b168ae1fcc4b0bdc3e10b"} Jan 22 14:02:32 crc kubenswrapper[4743]: I0122 14:02:32.456583 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"2644f1c9-b50c-4666-a099-ddb8912a53ff","Type":"ContainerStarted","Data":"0e89d445fc52ec24835bb2d186984fc61b3966f82b2fffbf064bc72884e5d5c1"} Jan 22 14:02:32 crc kubenswrapper[4743]: I0122 14:02:32.468906 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7926697b-86b3-4f82-97e1-3c0d7ae9f867","Type":"ContainerStarted","Data":"1394518f22ac074a6608a46a959c477744a1809c57329177f4b268b4e3de2be8"} Jan 22 14:02:32 crc kubenswrapper[4743]: I0122 14:02:32.470165 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-rmfgh" event={"ID":"60598cb3-9d09-4b83-9b5c-893f5ebf44eb","Type":"ContainerStarted","Data":"2a027307da75f5d73da33eab1d7db8fe9df44f7d50135d86ab004b2093275147"} Jan 22 14:02:33 crc kubenswrapper[4743]: I0122 14:02:33.365575 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-mxxkp" Jan 22 14:02:33 crc kubenswrapper[4743]: I0122 14:02:33.435604 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48994a23-1df2-4f5a-bcbe-3c72f174bdb4-config\") pod \"48994a23-1df2-4f5a-bcbe-3c72f174bdb4\" (UID: \"48994a23-1df2-4f5a-bcbe-3c72f174bdb4\") " Jan 22 14:02:33 crc kubenswrapper[4743]: I0122 14:02:33.435700 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8tz2\" (UniqueName: \"kubernetes.io/projected/48994a23-1df2-4f5a-bcbe-3c72f174bdb4-kube-api-access-j8tz2\") pod \"48994a23-1df2-4f5a-bcbe-3c72f174bdb4\" (UID: \"48994a23-1df2-4f5a-bcbe-3c72f174bdb4\") " Jan 22 14:02:33 crc kubenswrapper[4743]: I0122 14:02:33.436521 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48994a23-1df2-4f5a-bcbe-3c72f174bdb4-config" (OuterVolumeSpecName: "config") pod "48994a23-1df2-4f5a-bcbe-3c72f174bdb4" (UID: "48994a23-1df2-4f5a-bcbe-3c72f174bdb4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:02:33 crc kubenswrapper[4743]: I0122 14:02:33.446199 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48994a23-1df2-4f5a-bcbe-3c72f174bdb4-kube-api-access-j8tz2" (OuterVolumeSpecName: "kube-api-access-j8tz2") pod "48994a23-1df2-4f5a-bcbe-3c72f174bdb4" (UID: "48994a23-1df2-4f5a-bcbe-3c72f174bdb4"). InnerVolumeSpecName "kube-api-access-j8tz2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:02:33 crc kubenswrapper[4743]: I0122 14:02:33.484151 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-mxxkp" Jan 22 14:02:33 crc kubenswrapper[4743]: I0122 14:02:33.489935 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-mxxkp" event={"ID":"48994a23-1df2-4f5a-bcbe-3c72f174bdb4","Type":"ContainerDied","Data":"43762812355cafb4e03f550dfeb0b6643ea0a8aed2b9c8722dad9eb10918c31a"} Jan 22 14:02:33 crc kubenswrapper[4743]: I0122 14:02:33.542118 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48994a23-1df2-4f5a-bcbe-3c72f174bdb4-config\") on node \"crc\" DevicePath \"\"" Jan 22 14:02:33 crc kubenswrapper[4743]: I0122 14:02:33.542144 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j8tz2\" (UniqueName: \"kubernetes.io/projected/48994a23-1df2-4f5a-bcbe-3c72f174bdb4-kube-api-access-j8tz2\") on node \"crc\" DevicePath \"\"" Jan 22 14:02:33 crc kubenswrapper[4743]: I0122 14:02:33.556991 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-mxxkp"] Jan 22 14:02:33 crc kubenswrapper[4743]: I0122 14:02:33.567974 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-mxxkp"] Jan 22 14:02:33 crc kubenswrapper[4743]: I0122 14:02:33.761640 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48994a23-1df2-4f5a-bcbe-3c72f174bdb4" path="/var/lib/kubelet/pods/48994a23-1df2-4f5a-bcbe-3c72f174bdb4/volumes" Jan 22 14:02:44 crc kubenswrapper[4743]: E0122 14:02:44.696426 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified" Jan 22 
14:02:44 crc kubenswrapper[4743]: E0122 14:02:44.697180 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ovsdbserver-nb,Image:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified,Command:[/usr/bin/dumb-init],Args:[/usr/local/bin/container-scripts/setup.sh],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nb4h54fh596h574h698h5bdh5cdh5d5h5dbhf9hcch5fch74h658h55ch547hddh67fh578h5bdhb9h66fh8fh76h5c6hf9h594h5d8h5d4h5cfh59dh595q,ValueFrom:nil,},EnvVar{Name:OVN_LOGDIR,Value:/tmp,ValueFrom:nil,},EnvVar{Name:OVN_RUNDIR,Value:/tmp,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovndbcluster-nb-etc-ovn,ReadOnly:false,MountPath:/etc/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdb-rundir,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndb.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovndb.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wg9hp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:&Lifecycle{PostStart:nil,PreStop:&LifecycleHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/cleanup.sh],},HTTPGet:nil,TCPSocket:nil,Sleep:nil,},},TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof 
ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:20,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovsdbserver-nb-0_openstack(3b07a577-785f-4720-919c-ef619448284a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 22 14:02:52 crc kubenswrapper[4743]: I0122 14:02:52.618970 4743 generic.go:334] "Generic (PLEG): container finished" podID="2644f1c9-b50c-4666-a099-ddb8912a53ff" containerID="0e89d445fc52ec24835bb2d186984fc61b3966f82b2fffbf064bc72884e5d5c1" exitCode=0 Jan 22 14:02:52 crc kubenswrapper[4743]: I0122 14:02:52.619061 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"2644f1c9-b50c-4666-a099-ddb8912a53ff","Type":"ContainerDied","Data":"0e89d445fc52ec24835bb2d186984fc61b3966f82b2fffbf064bc72884e5d5c1"} Jan 22 14:02:52 crc kubenswrapper[4743]: I0122 14:02:52.621732 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"ee0fd7f6-d02b-4139-9306-8f9c9a1a8dd1","Type":"ContainerStarted","Data":"db7ec0fee59341b3c825dd036083c9b20d700e829f67984f8f22d9c195cbb5b8"} Jan 22 14:02:53 crc kubenswrapper[4743]: E0122 14:02:53.163669 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-nb\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovsdbserver-nb-0" podUID="3b07a577-785f-4720-919c-ef619448284a" Jan 22 14:02:53 crc kubenswrapper[4743]: I0122 14:02:53.630935 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-m22h5" event={"ID":"f3551792-b862-492e-8c36-e0a63cd4468f","Type":"ContainerStarted","Data":"cbf2a3169c2e32e3fc727f2bd3e4163cf2a19ee7a91a7b947114a793445abf34"} Jan 22 14:02:53 crc kubenswrapper[4743]: I0122 14:02:53.632051 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-m22h5" Jan 22 14:02:53 crc kubenswrapper[4743]: I0122 14:02:53.632359 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"63d64b7b-89b2-468c-86e2-fe9de4338c0c","Type":"ContainerStarted","Data":"0546c95c0da86e0b1bcd0ee01ec58089a13223e3f74b1ba0927f79cc5ffe83fd"} Jan 22 14:02:53 crc kubenswrapper[4743]: I0122 14:02:53.634859 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"98d7b7d3-f576-4b98-912f-6e7aab2d295a","Type":"ContainerStarted","Data":"52393a94bd1581d7a8b9e6f069ac9cc01053503339e5f963e2ceec3c832049e2"} Jan 22 14:02:53 crc kubenswrapper[4743]: I0122 14:02:53.634925 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"98d7b7d3-f576-4b98-912f-6e7aab2d295a","Type":"ContainerStarted","Data":"3848736fa3bcac8785a569449e6e1d8e730cc5264c90ca5e1d31c9a3ef786fb4"} Jan 22 14:02:53 crc kubenswrapper[4743]: I0122 14:02:53.636847 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"3b07a577-785f-4720-919c-ef619448284a","Type":"ContainerStarted","Data":"5c2fa3f8d463f3f5bf837760e6eb78b8a755ce1510bf933282b3c8c084824a6c"} Jan 22 14:02:53 crc kubenswrapper[4743]: E0122 14:02:53.638954 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-nb\" with ImagePullBackOff: \"Back-off pulling 
image \\\"quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified\\\"\"" pod="openstack/ovsdbserver-nb-0" podUID="3b07a577-785f-4720-919c-ef619448284a" Jan 22 14:02:53 crc kubenswrapper[4743]: I0122 14:02:53.639480 4743 generic.go:334] "Generic (PLEG): container finished" podID="60598cb3-9d09-4b83-9b5c-893f5ebf44eb" containerID="9dd51085214dc5df141f6d923c5d9b98282bf33d62be041f1ddd7b274aebf57f" exitCode=0 Jan 22 14:02:53 crc kubenswrapper[4743]: I0122 14:02:53.639511 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-rmfgh" event={"ID":"60598cb3-9d09-4b83-9b5c-893f5ebf44eb","Type":"ContainerDied","Data":"9dd51085214dc5df141f6d923c5d9b98282bf33d62be041f1ddd7b274aebf57f"} Jan 22 14:02:53 crc kubenswrapper[4743]: I0122 14:02:53.644160 4743 generic.go:334] "Generic (PLEG): container finished" podID="a598ff96-d072-4440-9fe9-ee99366ccc81" containerID="ec04004ad847a5553ef8f1383480d80970d47fb857c214b72593295e6771f653" exitCode=0 Jan 22 14:02:53 crc kubenswrapper[4743]: I0122 14:02:53.644225 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-9z96h" event={"ID":"a598ff96-d072-4440-9fe9-ee99366ccc81","Type":"ContainerDied","Data":"ec04004ad847a5553ef8f1383480d80970d47fb857c214b72593295e6771f653"} Jan 22 14:02:53 crc kubenswrapper[4743]: I0122 14:02:53.646193 4743 generic.go:334] "Generic (PLEG): container finished" podID="6d653249-72a8-413a-b835-091258593f30" containerID="b0d86d09e5373b6c1236ad84ddeb400ccbb223851fa917648e90b7fa6ed84d89" exitCode=0 Jan 22 14:02:53 crc kubenswrapper[4743]: I0122 14:02:53.646240 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-2n75z" event={"ID":"6d653249-72a8-413a-b835-091258593f30","Type":"ContainerDied","Data":"b0d86d09e5373b6c1236ad84ddeb400ccbb223851fa917648e90b7fa6ed84d89"} Jan 22 14:02:53 crc kubenswrapper[4743]: I0122 14:02:53.648263 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4bc6739c-92bc-4cee-b3ae-5e178073cf0f","Type":"ContainerStarted","Data":"8a9277a97a83e9bcc0250a45662bd79e2e3456b6000758c0cfd4824f6e60342a"} Jan 22 14:02:53 crc kubenswrapper[4743]: I0122 14:02:53.648558 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Jan 22 14:02:53 crc kubenswrapper[4743]: I0122 14:02:53.650603 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"2644f1c9-b50c-4666-a099-ddb8912a53ff","Type":"ContainerStarted","Data":"76d3f01d65bfba061998fad31097feab5744ca147d9cb7730ffc60aff88f3ca6"} Jan 22 14:02:53 crc kubenswrapper[4743]: I0122 14:02:53.658980 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-m22h5" podStartSLOduration=27.27241548 podStartE2EDuration="47.658956281s" podCreationTimestamp="2026-01-22 14:02:06 +0000 UTC" firstStartedPulling="2026-01-22 14:02:30.408598759 +0000 UTC m=+986.963641922" lastFinishedPulling="2026-01-22 14:02:50.79513955 +0000 UTC m=+1007.350182723" observedRunningTime="2026-01-22 14:02:53.651915231 +0000 UTC m=+1010.206958414" watchObservedRunningTime="2026-01-22 14:02:53.658956281 +0000 UTC m=+1010.213999444" Jan 22 14:02:53 crc kubenswrapper[4743]: I0122 14:02:53.682451 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=35.264405705 podStartE2EDuration="52.682428596s" podCreationTimestamp="2026-01-22 14:02:01 
+0000 UTC" firstStartedPulling="2026-01-22 14:02:29.127370921 +0000 UTC m=+985.682414104" lastFinishedPulling="2026-01-22 14:02:46.545393832 +0000 UTC m=+1003.100436995" observedRunningTime="2026-01-22 14:02:53.680281198 +0000 UTC m=+1010.235324361" watchObservedRunningTime="2026-01-22 14:02:53.682428596 +0000 UTC m=+1010.237471759" Jan 22 14:02:53 crc kubenswrapper[4743]: I0122 14:02:53.750706 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=24.455673322 podStartE2EDuration="44.75068542s" podCreationTimestamp="2026-01-22 14:02:09 +0000 UTC" firstStartedPulling="2026-01-22 14:02:30.887061948 +0000 UTC m=+987.442105111" lastFinishedPulling="2026-01-22 14:02:51.182074036 +0000 UTC m=+1007.737117209" observedRunningTime="2026-01-22 14:02:53.730446953 +0000 UTC m=+1010.285490116" watchObservedRunningTime="2026-01-22 14:02:53.75068542 +0000 UTC m=+1010.305728573" Jan 22 14:02:53 crc kubenswrapper[4743]: I0122 14:02:53.771807 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=52.428253717 podStartE2EDuration="53.77177585s" podCreationTimestamp="2026-01-22 14:02:00 +0000 UTC" firstStartedPulling="2026-01-22 14:02:30.097704109 +0000 UTC m=+986.652747312" lastFinishedPulling="2026-01-22 14:02:31.441226282 +0000 UTC m=+987.996269445" observedRunningTime="2026-01-22 14:02:53.756162548 +0000 UTC m=+1010.311205711" watchObservedRunningTime="2026-01-22 14:02:53.77177585 +0000 UTC m=+1010.326819013" Jan 22 14:02:53 crc kubenswrapper[4743]: I0122 14:02:53.829665 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=28.214574807 podStartE2EDuration="50.829642513s" podCreationTimestamp="2026-01-22 14:02:03 +0000 UTC" firstStartedPulling="2026-01-22 14:02:30.079818826 +0000 UTC m=+986.634861999" lastFinishedPulling="2026-01-22 14:02:52.694886542 +0000 UTC m=+1009.249929705" observedRunningTime="2026-01-22 14:02:53.826947081 +0000 UTC m=+1010.381990264" watchObservedRunningTime="2026-01-22 14:02:53.829642513 +0000 UTC m=+1010.384685676" Jan 22 14:02:54 crc kubenswrapper[4743]: I0122 14:02:54.658359 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-rmfgh" event={"ID":"60598cb3-9d09-4b83-9b5c-893f5ebf44eb","Type":"ContainerStarted","Data":"70b1f23b1d8048756421fbd45ec610e2142dd0483e828aa12854aea616330715"} Jan 22 14:02:54 crc kubenswrapper[4743]: I0122 14:02:54.658639 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-rmfgh" Jan 22 14:02:54 crc kubenswrapper[4743]: I0122 14:02:54.658651 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-rmfgh" event={"ID":"60598cb3-9d09-4b83-9b5c-893f5ebf44eb","Type":"ContainerStarted","Data":"6380772fd2e444b91f61cac58aaf069a3d2fb14212c1b0cff8035a4a5f8481bf"} Jan 22 14:02:54 crc kubenswrapper[4743]: I0122 14:02:54.659585 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-rmfgh" Jan 22 14:02:54 crc kubenswrapper[4743]: I0122 14:02:54.660540 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-9z96h" event={"ID":"a598ff96-d072-4440-9fe9-ee99366ccc81","Type":"ContainerStarted","Data":"e1b1fd4b3db27067c4dccebfd8a91b37bf7a9efa2f91b05418fdcf5532a257e7"} Jan 22 14:02:54 crc kubenswrapper[4743]: I0122 14:02:54.661580 4743 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-9z96h" Jan 22 14:02:54 crc kubenswrapper[4743]: I0122 14:02:54.663565 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-2n75z" event={"ID":"6d653249-72a8-413a-b835-091258593f30","Type":"ContainerStarted","Data":"807f6f5a806c84f3cb5f8ecdd90c1fdfbaac9fd97e626baec9ddc502ae8ab090"} Jan 22 14:02:54 crc kubenswrapper[4743]: I0122 14:02:54.664893 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Jan 22 14:02:54 crc kubenswrapper[4743]: E0122 14:02:54.665001 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-nb\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified\\\"\"" pod="openstack/ovsdbserver-nb-0" podUID="3b07a577-785f-4720-919c-ef619448284a" Jan 22 14:02:54 crc kubenswrapper[4743]: I0122 14:02:54.683539 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-rmfgh" podStartSLOduration=33.564625166 podStartE2EDuration="48.683515825s" podCreationTimestamp="2026-01-22 14:02:06 +0000 UTC" firstStartedPulling="2026-01-22 14:02:31.591828251 +0000 UTC m=+988.146871414" lastFinishedPulling="2026-01-22 14:02:46.71071891 +0000 UTC m=+1003.265762073" observedRunningTime="2026-01-22 14:02:54.678039528 +0000 UTC m=+1011.233082701" watchObservedRunningTime="2026-01-22 14:02:54.683515825 +0000 UTC m=+1011.238558988" Jan 22 14:02:54 crc kubenswrapper[4743]: I0122 14:02:54.727193 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-9z96h" podStartSLOduration=3.321963088 podStartE2EDuration="57.727173395s" podCreationTimestamp="2026-01-22 14:01:57 +0000 UTC" firstStartedPulling="2026-01-22 14:01:58.281524114 +0000 UTC m=+954.836567277" lastFinishedPulling="2026-01-22 14:02:52.686734421 +0000 UTC m=+1009.241777584" observedRunningTime="2026-01-22 14:02:54.719153728 +0000 UTC m=+1011.274196911" watchObservedRunningTime="2026-01-22 14:02:54.727173395 +0000 UTC m=+1011.282216558" Jan 22 14:02:54 crc kubenswrapper[4743]: I0122 14:02:54.737644 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-2n75z" podStartSLOduration=2.9869564520000003 podStartE2EDuration="57.737618207s" podCreationTimestamp="2026-01-22 14:01:57 +0000 UTC" firstStartedPulling="2026-01-22 14:01:58.026943632 +0000 UTC m=+954.581986795" lastFinishedPulling="2026-01-22 14:02:52.777605387 +0000 UTC m=+1009.332648550" observedRunningTime="2026-01-22 14:02:54.736625251 +0000 UTC m=+1011.291668414" watchObservedRunningTime="2026-01-22 14:02:54.737618207 +0000 UTC m=+1011.292661370" Jan 22 14:02:56 crc kubenswrapper[4743]: I0122 14:02:56.275254 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Jan 22 14:02:56 crc kubenswrapper[4743]: I0122 14:02:56.276400 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Jan 22 14:02:56 crc kubenswrapper[4743]: I0122 14:02:56.312332 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Jan 22 14:02:56 crc kubenswrapper[4743]: I0122 14:02:56.677701 4743 generic.go:334] "Generic (PLEG): container finished" podID="ee0fd7f6-d02b-4139-9306-8f9c9a1a8dd1" 
containerID="db7ec0fee59341b3c825dd036083c9b20d700e829f67984f8f22d9c195cbb5b8" exitCode=0 Jan 22 14:02:56 crc kubenswrapper[4743]: I0122 14:02:56.677813 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"ee0fd7f6-d02b-4139-9306-8f9c9a1a8dd1","Type":"ContainerDied","Data":"db7ec0fee59341b3c825dd036083c9b20d700e829f67984f8f22d9c195cbb5b8"} Jan 22 14:02:57 crc kubenswrapper[4743]: I0122 14:02:57.454033 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-2n75z" Jan 22 14:02:57 crc kubenswrapper[4743]: I0122 14:02:57.738066 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Jan 22 14:02:57 crc kubenswrapper[4743]: I0122 14:02:57.997958 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-9z96h"] Jan 22 14:02:57 crc kubenswrapper[4743]: I0122 14:02:57.998277 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-9z96h" podUID="a598ff96-d072-4440-9fe9-ee99366ccc81" containerName="dnsmasq-dns" containerID="cri-o://e1b1fd4b3db27067c4dccebfd8a91b37bf7a9efa2f91b05418fdcf5532a257e7" gracePeriod=10 Jan 22 14:02:58 crc kubenswrapper[4743]: I0122 14:02:58.028401 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-c8xvw"] Jan 22 14:02:58 crc kubenswrapper[4743]: I0122 14:02:58.029730 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-c8xvw" Jan 22 14:02:58 crc kubenswrapper[4743]: I0122 14:02:58.034860 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Jan 22 14:02:58 crc kubenswrapper[4743]: I0122 14:02:58.049295 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-c8xvw"] Jan 22 14:02:58 crc kubenswrapper[4743]: I0122 14:02:58.112957 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3bf57027-9e60-4f15-a193-f5278a8cb568-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-c8xvw\" (UID: \"3bf57027-9e60-4f15-a193-f5278a8cb568\") " pod="openstack/dnsmasq-dns-7f896c8c65-c8xvw" Jan 22 14:02:58 crc kubenswrapper[4743]: I0122 14:02:58.113032 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3bf57027-9e60-4f15-a193-f5278a8cb568-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-c8xvw\" (UID: \"3bf57027-9e60-4f15-a193-f5278a8cb568\") " pod="openstack/dnsmasq-dns-7f896c8c65-c8xvw" Jan 22 14:02:58 crc kubenswrapper[4743]: I0122 14:02:58.113276 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p45n5\" (UniqueName: \"kubernetes.io/projected/3bf57027-9e60-4f15-a193-f5278a8cb568-kube-api-access-p45n5\") pod \"dnsmasq-dns-7f896c8c65-c8xvw\" (UID: \"3bf57027-9e60-4f15-a193-f5278a8cb568\") " pod="openstack/dnsmasq-dns-7f896c8c65-c8xvw" Jan 22 14:02:58 crc kubenswrapper[4743]: I0122 14:02:58.113347 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bf57027-9e60-4f15-a193-f5278a8cb568-config\") pod \"dnsmasq-dns-7f896c8c65-c8xvw\" (UID: \"3bf57027-9e60-4f15-a193-f5278a8cb568\") " pod="openstack/dnsmasq-dns-7f896c8c65-c8xvw" Jan 
22 14:02:58 crc kubenswrapper[4743]: I0122 14:02:58.139468 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-rzsrt"] Jan 22 14:02:58 crc kubenswrapper[4743]: I0122 14:02:58.140651 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-rzsrt" Jan 22 14:02:58 crc kubenswrapper[4743]: I0122 14:02:58.142839 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Jan 22 14:02:58 crc kubenswrapper[4743]: I0122 14:02:58.151590 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-rzsrt"] Jan 22 14:02:58 crc kubenswrapper[4743]: I0122 14:02:58.215477 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3450abf2-6cd6-4090-b26f-4d83e2a6ea2b-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-rzsrt\" (UID: \"3450abf2-6cd6-4090-b26f-4d83e2a6ea2b\") " pod="openstack/ovn-controller-metrics-rzsrt" Jan 22 14:02:58 crc kubenswrapper[4743]: I0122 14:02:58.215541 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3450abf2-6cd6-4090-b26f-4d83e2a6ea2b-config\") pod \"ovn-controller-metrics-rzsrt\" (UID: \"3450abf2-6cd6-4090-b26f-4d83e2a6ea2b\") " pod="openstack/ovn-controller-metrics-rzsrt" Jan 22 14:02:58 crc kubenswrapper[4743]: I0122 14:02:58.215627 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3bf57027-9e60-4f15-a193-f5278a8cb568-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-c8xvw\" (UID: \"3bf57027-9e60-4f15-a193-f5278a8cb568\") " pod="openstack/dnsmasq-dns-7f896c8c65-c8xvw" Jan 22 14:02:58 crc kubenswrapper[4743]: I0122 14:02:58.216783 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3bf57027-9e60-4f15-a193-f5278a8cb568-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-c8xvw\" (UID: \"3bf57027-9e60-4f15-a193-f5278a8cb568\") " pod="openstack/dnsmasq-dns-7f896c8c65-c8xvw" Jan 22 14:02:58 crc kubenswrapper[4743]: I0122 14:02:58.216931 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3bf57027-9e60-4f15-a193-f5278a8cb568-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-c8xvw\" (UID: \"3bf57027-9e60-4f15-a193-f5278a8cb568\") " pod="openstack/dnsmasq-dns-7f896c8c65-c8xvw" Jan 22 14:02:58 crc kubenswrapper[4743]: I0122 14:02:58.217758 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3bf57027-9e60-4f15-a193-f5278a8cb568-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-c8xvw\" (UID: \"3bf57027-9e60-4f15-a193-f5278a8cb568\") " pod="openstack/dnsmasq-dns-7f896c8c65-c8xvw" Jan 22 14:02:58 crc kubenswrapper[4743]: I0122 14:02:58.218905 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p45n5\" (UniqueName: \"kubernetes.io/projected/3bf57027-9e60-4f15-a193-f5278a8cb568-kube-api-access-p45n5\") pod \"dnsmasq-dns-7f896c8c65-c8xvw\" (UID: \"3bf57027-9e60-4f15-a193-f5278a8cb568\") " pod="openstack/dnsmasq-dns-7f896c8c65-c8xvw" Jan 22 14:02:58 crc kubenswrapper[4743]: I0122 14:02:58.218946 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/3bf57027-9e60-4f15-a193-f5278a8cb568-config\") pod \"dnsmasq-dns-7f896c8c65-c8xvw\" (UID: \"3bf57027-9e60-4f15-a193-f5278a8cb568\") " pod="openstack/dnsmasq-dns-7f896c8c65-c8xvw" Jan 22 14:02:58 crc kubenswrapper[4743]: I0122 14:02:58.218999 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3450abf2-6cd6-4090-b26f-4d83e2a6ea2b-combined-ca-bundle\") pod \"ovn-controller-metrics-rzsrt\" (UID: \"3450abf2-6cd6-4090-b26f-4d83e2a6ea2b\") " pod="openstack/ovn-controller-metrics-rzsrt" Jan 22 14:02:58 crc kubenswrapper[4743]: I0122 14:02:58.219038 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/3450abf2-6cd6-4090-b26f-4d83e2a6ea2b-ovs-rundir\") pod \"ovn-controller-metrics-rzsrt\" (UID: \"3450abf2-6cd6-4090-b26f-4d83e2a6ea2b\") " pod="openstack/ovn-controller-metrics-rzsrt" Jan 22 14:02:58 crc kubenswrapper[4743]: I0122 14:02:58.219061 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmv48\" (UniqueName: \"kubernetes.io/projected/3450abf2-6cd6-4090-b26f-4d83e2a6ea2b-kube-api-access-tmv48\") pod \"ovn-controller-metrics-rzsrt\" (UID: \"3450abf2-6cd6-4090-b26f-4d83e2a6ea2b\") " pod="openstack/ovn-controller-metrics-rzsrt" Jan 22 14:02:58 crc kubenswrapper[4743]: I0122 14:02:58.219110 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/3450abf2-6cd6-4090-b26f-4d83e2a6ea2b-ovn-rundir\") pod \"ovn-controller-metrics-rzsrt\" (UID: \"3450abf2-6cd6-4090-b26f-4d83e2a6ea2b\") " pod="openstack/ovn-controller-metrics-rzsrt" Jan 22 14:02:58 crc kubenswrapper[4743]: I0122 14:02:58.220188 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bf57027-9e60-4f15-a193-f5278a8cb568-config\") pod \"dnsmasq-dns-7f896c8c65-c8xvw\" (UID: \"3bf57027-9e60-4f15-a193-f5278a8cb568\") " pod="openstack/dnsmasq-dns-7f896c8c65-c8xvw" Jan 22 14:02:58 crc kubenswrapper[4743]: I0122 14:02:58.242336 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p45n5\" (UniqueName: \"kubernetes.io/projected/3bf57027-9e60-4f15-a193-f5278a8cb568-kube-api-access-p45n5\") pod \"dnsmasq-dns-7f896c8c65-c8xvw\" (UID: \"3bf57027-9e60-4f15-a193-f5278a8cb568\") " pod="openstack/dnsmasq-dns-7f896c8c65-c8xvw" Jan 22 14:02:58 crc kubenswrapper[4743]: I0122 14:02:58.320477 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/3450abf2-6cd6-4090-b26f-4d83e2a6ea2b-ovn-rundir\") pod \"ovn-controller-metrics-rzsrt\" (UID: \"3450abf2-6cd6-4090-b26f-4d83e2a6ea2b\") " pod="openstack/ovn-controller-metrics-rzsrt" Jan 22 14:02:58 crc kubenswrapper[4743]: I0122 14:02:58.320548 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3450abf2-6cd6-4090-b26f-4d83e2a6ea2b-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-rzsrt\" (UID: \"3450abf2-6cd6-4090-b26f-4d83e2a6ea2b\") " pod="openstack/ovn-controller-metrics-rzsrt" Jan 22 14:02:58 crc kubenswrapper[4743]: I0122 14:02:58.320580 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/3450abf2-6cd6-4090-b26f-4d83e2a6ea2b-config\") pod \"ovn-controller-metrics-rzsrt\" (UID: \"3450abf2-6cd6-4090-b26f-4d83e2a6ea2b\") " pod="openstack/ovn-controller-metrics-rzsrt" Jan 22 14:02:58 crc kubenswrapper[4743]: I0122 14:02:58.320699 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3450abf2-6cd6-4090-b26f-4d83e2a6ea2b-combined-ca-bundle\") pod \"ovn-controller-metrics-rzsrt\" (UID: \"3450abf2-6cd6-4090-b26f-4d83e2a6ea2b\") " pod="openstack/ovn-controller-metrics-rzsrt" Jan 22 14:02:58 crc kubenswrapper[4743]: I0122 14:02:58.320734 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/3450abf2-6cd6-4090-b26f-4d83e2a6ea2b-ovs-rundir\") pod \"ovn-controller-metrics-rzsrt\" (UID: \"3450abf2-6cd6-4090-b26f-4d83e2a6ea2b\") " pod="openstack/ovn-controller-metrics-rzsrt" Jan 22 14:02:58 crc kubenswrapper[4743]: I0122 14:02:58.320760 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmv48\" (UniqueName: \"kubernetes.io/projected/3450abf2-6cd6-4090-b26f-4d83e2a6ea2b-kube-api-access-tmv48\") pod \"ovn-controller-metrics-rzsrt\" (UID: \"3450abf2-6cd6-4090-b26f-4d83e2a6ea2b\") " pod="openstack/ovn-controller-metrics-rzsrt" Jan 22 14:02:58 crc kubenswrapper[4743]: I0122 14:02:58.321423 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/3450abf2-6cd6-4090-b26f-4d83e2a6ea2b-ovn-rundir\") pod \"ovn-controller-metrics-rzsrt\" (UID: \"3450abf2-6cd6-4090-b26f-4d83e2a6ea2b\") " pod="openstack/ovn-controller-metrics-rzsrt" Jan 22 14:02:58 crc kubenswrapper[4743]: I0122 14:02:58.322097 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/3450abf2-6cd6-4090-b26f-4d83e2a6ea2b-ovs-rundir\") pod \"ovn-controller-metrics-rzsrt\" (UID: \"3450abf2-6cd6-4090-b26f-4d83e2a6ea2b\") " pod="openstack/ovn-controller-metrics-rzsrt" Jan 22 14:02:58 crc kubenswrapper[4743]: I0122 14:02:58.322714 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3450abf2-6cd6-4090-b26f-4d83e2a6ea2b-config\") pod \"ovn-controller-metrics-rzsrt\" (UID: \"3450abf2-6cd6-4090-b26f-4d83e2a6ea2b\") " pod="openstack/ovn-controller-metrics-rzsrt" Jan 22 14:02:58 crc kubenswrapper[4743]: I0122 14:02:58.325089 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3450abf2-6cd6-4090-b26f-4d83e2a6ea2b-combined-ca-bundle\") pod \"ovn-controller-metrics-rzsrt\" (UID: \"3450abf2-6cd6-4090-b26f-4d83e2a6ea2b\") " pod="openstack/ovn-controller-metrics-rzsrt" Jan 22 14:02:58 crc kubenswrapper[4743]: I0122 14:02:58.327463 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3450abf2-6cd6-4090-b26f-4d83e2a6ea2b-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-rzsrt\" (UID: \"3450abf2-6cd6-4090-b26f-4d83e2a6ea2b\") " pod="openstack/ovn-controller-metrics-rzsrt" Jan 22 14:02:58 crc kubenswrapper[4743]: I0122 14:02:58.331732 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-2n75z"] Jan 22 14:02:58 crc kubenswrapper[4743]: I0122 14:02:58.331993 4743 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-2n75z" podUID="6d653249-72a8-413a-b835-091258593f30" containerName="dnsmasq-dns" containerID="cri-o://807f6f5a806c84f3cb5f8ecdd90c1fdfbaac9fd97e626baec9ddc502ae8ab090" gracePeriod=10 Jan 22 14:02:58 crc kubenswrapper[4743]: I0122 14:02:58.333554 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-666b6646f7-2n75z" Jan 22 14:02:58 crc kubenswrapper[4743]: I0122 14:02:58.343087 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmv48\" (UniqueName: \"kubernetes.io/projected/3450abf2-6cd6-4090-b26f-4d83e2a6ea2b-kube-api-access-tmv48\") pod \"ovn-controller-metrics-rzsrt\" (UID: \"3450abf2-6cd6-4090-b26f-4d83e2a6ea2b\") " pod="openstack/ovn-controller-metrics-rzsrt" Jan 22 14:02:58 crc kubenswrapper[4743]: I0122 14:02:58.343266 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-c8xvw" Jan 22 14:02:58 crc kubenswrapper[4743]: I0122 14:02:58.389926 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-4jnjh"] Jan 22 14:02:58 crc kubenswrapper[4743]: I0122 14:02:58.391776 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-4jnjh" Jan 22 14:02:58 crc kubenswrapper[4743]: I0122 14:02:58.394025 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-4jnjh"] Jan 22 14:02:58 crc kubenswrapper[4743]: I0122 14:02:58.394160 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Jan 22 14:02:58 crc kubenswrapper[4743]: I0122 14:02:58.464008 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-rzsrt" Jan 22 14:02:58 crc kubenswrapper[4743]: I0122 14:02:58.533070 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd315b7b-42c7-4482-b739-5b003bf02430-config\") pod \"dnsmasq-dns-86db49b7ff-4jnjh\" (UID: \"cd315b7b-42c7-4482-b739-5b003bf02430\") " pod="openstack/dnsmasq-dns-86db49b7ff-4jnjh" Jan 22 14:02:58 crc kubenswrapper[4743]: I0122 14:02:58.533149 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cd315b7b-42c7-4482-b739-5b003bf02430-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-4jnjh\" (UID: \"cd315b7b-42c7-4482-b739-5b003bf02430\") " pod="openstack/dnsmasq-dns-86db49b7ff-4jnjh" Jan 22 14:02:58 crc kubenswrapper[4743]: I0122 14:02:58.533214 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cd315b7b-42c7-4482-b739-5b003bf02430-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-4jnjh\" (UID: \"cd315b7b-42c7-4482-b739-5b003bf02430\") " pod="openstack/dnsmasq-dns-86db49b7ff-4jnjh" Jan 22 14:02:58 crc kubenswrapper[4743]: I0122 14:02:58.533236 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cd315b7b-42c7-4482-b739-5b003bf02430-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-4jnjh\" (UID: \"cd315b7b-42c7-4482-b739-5b003bf02430\") " pod="openstack/dnsmasq-dns-86db49b7ff-4jnjh" Jan 22 14:02:58 crc kubenswrapper[4743]: I0122 14:02:58.533363 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blkxr\" (UniqueName: \"kubernetes.io/projected/cd315b7b-42c7-4482-b739-5b003bf02430-kube-api-access-blkxr\") pod \"dnsmasq-dns-86db49b7ff-4jnjh\" (UID: \"cd315b7b-42c7-4482-b739-5b003bf02430\") " pod="openstack/dnsmasq-dns-86db49b7ff-4jnjh" Jan 22 14:02:58 crc kubenswrapper[4743]: I0122 14:02:58.634763 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd315b7b-42c7-4482-b739-5b003bf02430-config\") pod \"dnsmasq-dns-86db49b7ff-4jnjh\" (UID: \"cd315b7b-42c7-4482-b739-5b003bf02430\") " pod="openstack/dnsmasq-dns-86db49b7ff-4jnjh" Jan 22 14:02:58 crc kubenswrapper[4743]: I0122 14:02:58.634832 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cd315b7b-42c7-4482-b739-5b003bf02430-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-4jnjh\" (UID: \"cd315b7b-42c7-4482-b739-5b003bf02430\") " pod="openstack/dnsmasq-dns-86db49b7ff-4jnjh" Jan 22 14:02:58 crc kubenswrapper[4743]: I0122 14:02:58.634857 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cd315b7b-42c7-4482-b739-5b003bf02430-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-4jnjh\" (UID: \"cd315b7b-42c7-4482-b739-5b003bf02430\") " pod="openstack/dnsmasq-dns-86db49b7ff-4jnjh" Jan 22 14:02:58 crc kubenswrapper[4743]: I0122 14:02:58.634877 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cd315b7b-42c7-4482-b739-5b003bf02430-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-4jnjh\" (UID: 
\"cd315b7b-42c7-4482-b739-5b003bf02430\") " pod="openstack/dnsmasq-dns-86db49b7ff-4jnjh" Jan 22 14:02:58 crc kubenswrapper[4743]: I0122 14:02:58.634921 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blkxr\" (UniqueName: \"kubernetes.io/projected/cd315b7b-42c7-4482-b739-5b003bf02430-kube-api-access-blkxr\") pod \"dnsmasq-dns-86db49b7ff-4jnjh\" (UID: \"cd315b7b-42c7-4482-b739-5b003bf02430\") " pod="openstack/dnsmasq-dns-86db49b7ff-4jnjh" Jan 22 14:02:58 crc kubenswrapper[4743]: I0122 14:02:58.636343 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd315b7b-42c7-4482-b739-5b003bf02430-config\") pod \"dnsmasq-dns-86db49b7ff-4jnjh\" (UID: \"cd315b7b-42c7-4482-b739-5b003bf02430\") " pod="openstack/dnsmasq-dns-86db49b7ff-4jnjh" Jan 22 14:02:58 crc kubenswrapper[4743]: I0122 14:02:58.637194 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cd315b7b-42c7-4482-b739-5b003bf02430-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-4jnjh\" (UID: \"cd315b7b-42c7-4482-b739-5b003bf02430\") " pod="openstack/dnsmasq-dns-86db49b7ff-4jnjh" Jan 22 14:02:58 crc kubenswrapper[4743]: I0122 14:02:58.637696 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cd315b7b-42c7-4482-b739-5b003bf02430-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-4jnjh\" (UID: \"cd315b7b-42c7-4482-b739-5b003bf02430\") " pod="openstack/dnsmasq-dns-86db49b7ff-4jnjh" Jan 22 14:02:58 crc kubenswrapper[4743]: I0122 14:02:58.638193 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cd315b7b-42c7-4482-b739-5b003bf02430-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-4jnjh\" (UID: \"cd315b7b-42c7-4482-b739-5b003bf02430\") " pod="openstack/dnsmasq-dns-86db49b7ff-4jnjh" Jan 22 14:02:58 crc kubenswrapper[4743]: I0122 14:02:58.656370 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blkxr\" (UniqueName: \"kubernetes.io/projected/cd315b7b-42c7-4482-b739-5b003bf02430-kube-api-access-blkxr\") pod \"dnsmasq-dns-86db49b7ff-4jnjh\" (UID: \"cd315b7b-42c7-4482-b739-5b003bf02430\") " pod="openstack/dnsmasq-dns-86db49b7ff-4jnjh" Jan 22 14:02:58 crc kubenswrapper[4743]: I0122 14:02:58.735822 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-4jnjh" Jan 22 14:02:58 crc kubenswrapper[4743]: I0122 14:02:58.846357 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-c8xvw"] Jan 22 14:02:58 crc kubenswrapper[4743]: W0122 14:02:58.853120 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3bf57027_9e60_4f15_a193_f5278a8cb568.slice/crio-0ed3361b56a1657f9c17f2390205c6d8e73eeb5d47aefd2cbba3fb63cf93557f WatchSource:0}: Error finding container 0ed3361b56a1657f9c17f2390205c6d8e73eeb5d47aefd2cbba3fb63cf93557f: Status 404 returned error can't find the container with id 0ed3361b56a1657f9c17f2390205c6d8e73eeb5d47aefd2cbba3fb63cf93557f Jan 22 14:02:58 crc kubenswrapper[4743]: I0122 14:02:58.938211 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-rzsrt"] Jan 22 14:02:59 crc kubenswrapper[4743]: I0122 14:02:59.139767 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-4jnjh"] Jan 22 14:02:59 crc kubenswrapper[4743]: W0122 14:02:59.147285 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd315b7b_42c7_4482_b739_5b003bf02430.slice/crio-5a48bc21465d71d0046f3135194b17a064565b561a52002dd1af522e2d411487 WatchSource:0}: Error finding container 5a48bc21465d71d0046f3135194b17a064565b561a52002dd1af522e2d411487: Status 404 returned error can't find the container with id 5a48bc21465d71d0046f3135194b17a064565b561a52002dd1af522e2d411487 Jan 22 14:02:59 crc kubenswrapper[4743]: I0122 14:02:59.699983 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-rzsrt" event={"ID":"3450abf2-6cd6-4090-b26f-4d83e2a6ea2b","Type":"ContainerStarted","Data":"d59963d6eb3daf05b9df12be5f67f2d1ab30a06becb3e6b22134f738d5dc865f"} Jan 22 14:02:59 crc kubenswrapper[4743]: I0122 14:02:59.702958 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-4jnjh" event={"ID":"cd315b7b-42c7-4482-b739-5b003bf02430","Type":"ContainerStarted","Data":"5a48bc21465d71d0046f3135194b17a064565b561a52002dd1af522e2d411487"} Jan 22 14:02:59 crc kubenswrapper[4743]: I0122 14:02:59.704813 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-c8xvw" event={"ID":"3bf57027-9e60-4f15-a193-f5278a8cb568","Type":"ContainerStarted","Data":"0ed3361b56a1657f9c17f2390205c6d8e73eeb5d47aefd2cbba3fb63cf93557f"} Jan 22 14:03:01 crc kubenswrapper[4743]: I0122 14:03:01.381390 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Jan 22 14:03:01 crc kubenswrapper[4743]: I0122 14:03:01.381778 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Jan 22 14:03:01 crc kubenswrapper[4743]: I0122 14:03:01.483161 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Jan 22 14:03:02 crc kubenswrapper[4743]: I0122 14:03:02.454900 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-666b6646f7-2n75z" podUID="6d653249-72a8-413a-b835-091258593f30" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.96:5353: connect: connection refused" Jan 22 14:03:02 crc kubenswrapper[4743]: I0122 14:03:02.732321 4743 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/dnsmasq-dns-57d769cc4f-9z96h" podUID="a598ff96-d072-4440-9fe9-ee99366ccc81" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.97:5353: connect: connection refused" Jan 22 14:03:02 crc kubenswrapper[4743]: I0122 14:03:02.743301 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"ee0fd7f6-d02b-4139-9306-8f9c9a1a8dd1","Type":"ContainerStarted","Data":"ca6dc08bc3b06c485e5c961e1674da3ba2b95e7c409b8916b6ea2e9d31671274"} Jan 22 14:03:02 crc kubenswrapper[4743]: I0122 14:03:02.747084 4743 generic.go:334] "Generic (PLEG): container finished" podID="3bf57027-9e60-4f15-a193-f5278a8cb568" containerID="27ac1b928e2a0d0edb854f46facceeef0169a9f8edeacbeec6841730be228c21" exitCode=0 Jan 22 14:03:02 crc kubenswrapper[4743]: I0122 14:03:02.747170 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-c8xvw" event={"ID":"3bf57027-9e60-4f15-a193-f5278a8cb568","Type":"ContainerDied","Data":"27ac1b928e2a0d0edb854f46facceeef0169a9f8edeacbeec6841730be228c21"} Jan 22 14:03:02 crc kubenswrapper[4743]: I0122 14:03:02.755855 4743 generic.go:334] "Generic (PLEG): container finished" podID="a598ff96-d072-4440-9fe9-ee99366ccc81" containerID="e1b1fd4b3db27067c4dccebfd8a91b37bf7a9efa2f91b05418fdcf5532a257e7" exitCode=0 Jan 22 14:03:02 crc kubenswrapper[4743]: I0122 14:03:02.755905 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-9z96h" event={"ID":"a598ff96-d072-4440-9fe9-ee99366ccc81","Type":"ContainerDied","Data":"e1b1fd4b3db27067c4dccebfd8a91b37bf7a9efa2f91b05418fdcf5532a257e7"} Jan 22 14:03:02 crc kubenswrapper[4743]: I0122 14:03:02.759026 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-rzsrt" event={"ID":"3450abf2-6cd6-4090-b26f-4d83e2a6ea2b","Type":"ContainerStarted","Data":"9f512d59e20cf9e453caa09b58ee0b8727af427562d6c0ee8aa99c107a14a194"} Jan 22 14:03:02 crc kubenswrapper[4743]: I0122 14:03:02.764563 4743 generic.go:334] "Generic (PLEG): container finished" podID="cd315b7b-42c7-4482-b739-5b003bf02430" containerID="51c01670deecbf0586c0c1c9cfbd272866f29288682883c2ba2dcd476ac9cc0a" exitCode=0 Jan 22 14:03:02 crc kubenswrapper[4743]: I0122 14:03:02.764638 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-4jnjh" event={"ID":"cd315b7b-42c7-4482-b739-5b003bf02430","Type":"ContainerDied","Data":"51c01670deecbf0586c0c1c9cfbd272866f29288682883c2ba2dcd476ac9cc0a"} Jan 22 14:03:02 crc kubenswrapper[4743]: I0122 14:03:02.774165 4743 generic.go:334] "Generic (PLEG): container finished" podID="6d653249-72a8-413a-b835-091258593f30" containerID="807f6f5a806c84f3cb5f8ecdd90c1fdfbaac9fd97e626baec9ddc502ae8ab090" exitCode=0 Jan 22 14:03:02 crc kubenswrapper[4743]: I0122 14:03:02.774217 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-2n75z" event={"ID":"6d653249-72a8-413a-b835-091258593f30","Type":"ContainerDied","Data":"807f6f5a806c84f3cb5f8ecdd90c1fdfbaac9fd97e626baec9ddc502ae8ab090"} Jan 22 14:03:02 crc kubenswrapper[4743]: I0122 14:03:02.776170 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=-9223371972.078629 podStartE2EDuration="1m4.77614816s" podCreationTimestamp="2026-01-22 14:01:58 +0000 UTC" firstStartedPulling="2026-01-22 14:02:05.528898424 +0000 UTC m=+962.083941587" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-01-22 14:03:02.768491473 +0000 UTC m=+1019.323534646" watchObservedRunningTime="2026-01-22 14:03:02.77614816 +0000 UTC m=+1019.331191313" Jan 22 14:03:02 crc kubenswrapper[4743]: I0122 14:03:02.825461 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-rzsrt" podStartSLOduration=4.825430192 podStartE2EDuration="4.825430192s" podCreationTimestamp="2026-01-22 14:02:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:03:02.817698853 +0000 UTC m=+1019.372742026" watchObservedRunningTime="2026-01-22 14:03:02.825430192 +0000 UTC m=+1019.380473355" Jan 22 14:03:03 crc kubenswrapper[4743]: I0122 14:03:03.086394 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Jan 22 14:03:03 crc kubenswrapper[4743]: I0122 14:03:03.157128 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-2n75z" Jan 22 14:03:03 crc kubenswrapper[4743]: I0122 14:03:03.157808 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-9z96h" Jan 22 14:03:03 crc kubenswrapper[4743]: I0122 14:03:03.217832 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a598ff96-d072-4440-9fe9-ee99366ccc81-config\") pod \"a598ff96-d072-4440-9fe9-ee99366ccc81\" (UID: \"a598ff96-d072-4440-9fe9-ee99366ccc81\") " Jan 22 14:03:03 crc kubenswrapper[4743]: I0122 14:03:03.217893 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjtj2\" (UniqueName: \"kubernetes.io/projected/6d653249-72a8-413a-b835-091258593f30-kube-api-access-cjtj2\") pod \"6d653249-72a8-413a-b835-091258593f30\" (UID: \"6d653249-72a8-413a-b835-091258593f30\") " Jan 22 14:03:03 crc kubenswrapper[4743]: I0122 14:03:03.217918 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86qxs\" (UniqueName: \"kubernetes.io/projected/a598ff96-d072-4440-9fe9-ee99366ccc81-kube-api-access-86qxs\") pod \"a598ff96-d072-4440-9fe9-ee99366ccc81\" (UID: \"a598ff96-d072-4440-9fe9-ee99366ccc81\") " Jan 22 14:03:03 crc kubenswrapper[4743]: I0122 14:03:03.217972 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a598ff96-d072-4440-9fe9-ee99366ccc81-dns-svc\") pod \"a598ff96-d072-4440-9fe9-ee99366ccc81\" (UID: \"a598ff96-d072-4440-9fe9-ee99366ccc81\") " Jan 22 14:03:03 crc kubenswrapper[4743]: I0122 14:03:03.218007 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d653249-72a8-413a-b835-091258593f30-config\") pod \"6d653249-72a8-413a-b835-091258593f30\" (UID: \"6d653249-72a8-413a-b835-091258593f30\") " Jan 22 14:03:03 crc kubenswrapper[4743]: I0122 14:03:03.218036 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d653249-72a8-413a-b835-091258593f30-dns-svc\") pod \"6d653249-72a8-413a-b835-091258593f30\" (UID: \"6d653249-72a8-413a-b835-091258593f30\") " Jan 22 14:03:03 crc kubenswrapper[4743]: I0122 14:03:03.223775 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/6d653249-72a8-413a-b835-091258593f30-kube-api-access-cjtj2" (OuterVolumeSpecName: "kube-api-access-cjtj2") pod "6d653249-72a8-413a-b835-091258593f30" (UID: "6d653249-72a8-413a-b835-091258593f30"). InnerVolumeSpecName "kube-api-access-cjtj2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:03:03 crc kubenswrapper[4743]: I0122 14:03:03.226944 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a598ff96-d072-4440-9fe9-ee99366ccc81-kube-api-access-86qxs" (OuterVolumeSpecName: "kube-api-access-86qxs") pod "a598ff96-d072-4440-9fe9-ee99366ccc81" (UID: "a598ff96-d072-4440-9fe9-ee99366ccc81"). InnerVolumeSpecName "kube-api-access-86qxs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:03:03 crc kubenswrapper[4743]: I0122 14:03:03.230148 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Jan 22 14:03:03 crc kubenswrapper[4743]: I0122 14:03:03.268088 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a598ff96-d072-4440-9fe9-ee99366ccc81-config" (OuterVolumeSpecName: "config") pod "a598ff96-d072-4440-9fe9-ee99366ccc81" (UID: "a598ff96-d072-4440-9fe9-ee99366ccc81"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:03:03 crc kubenswrapper[4743]: I0122 14:03:03.268493 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a598ff96-d072-4440-9fe9-ee99366ccc81-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a598ff96-d072-4440-9fe9-ee99366ccc81" (UID: "a598ff96-d072-4440-9fe9-ee99366ccc81"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:03:03 crc kubenswrapper[4743]: I0122 14:03:03.269323 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d653249-72a8-413a-b835-091258593f30-config" (OuterVolumeSpecName: "config") pod "6d653249-72a8-413a-b835-091258593f30" (UID: "6d653249-72a8-413a-b835-091258593f30"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:03:03 crc kubenswrapper[4743]: I0122 14:03:03.277241 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d653249-72a8-413a-b835-091258593f30-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6d653249-72a8-413a-b835-091258593f30" (UID: "6d653249-72a8-413a-b835-091258593f30"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:03:03 crc kubenswrapper[4743]: I0122 14:03:03.319370 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a598ff96-d072-4440-9fe9-ee99366ccc81-config\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:03 crc kubenswrapper[4743]: I0122 14:03:03.319423 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjtj2\" (UniqueName: \"kubernetes.io/projected/6d653249-72a8-413a-b835-091258593f30-kube-api-access-cjtj2\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:03 crc kubenswrapper[4743]: I0122 14:03:03.319439 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86qxs\" (UniqueName: \"kubernetes.io/projected/a598ff96-d072-4440-9fe9-ee99366ccc81-kube-api-access-86qxs\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:03 crc kubenswrapper[4743]: I0122 14:03:03.319453 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a598ff96-d072-4440-9fe9-ee99366ccc81-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:03 crc kubenswrapper[4743]: I0122 14:03:03.319464 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d653249-72a8-413a-b835-091258593f30-config\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:03 crc kubenswrapper[4743]: I0122 14:03:03.319474 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d653249-72a8-413a-b835-091258593f30-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:03 crc kubenswrapper[4743]: I0122 14:03:03.783181 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-c8xvw" event={"ID":"3bf57027-9e60-4f15-a193-f5278a8cb568","Type":"ContainerStarted","Data":"58fb735501c2d0622c45927ac3571cdd6b7db9d889e9050bc3ec744919de1196"} Jan 22 14:03:03 crc kubenswrapper[4743]: I0122 14:03:03.783293 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7f896c8c65-c8xvw" Jan 22 14:03:03 crc kubenswrapper[4743]: I0122 14:03:03.785117 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-9z96h" event={"ID":"a598ff96-d072-4440-9fe9-ee99366ccc81","Type":"ContainerDied","Data":"b8592171ba464f3b14911abbbc96798d7d078e77db43a92eec883f0ce11314aa"} Jan 22 14:03:03 crc kubenswrapper[4743]: I0122 14:03:03.785144 4743 scope.go:117] "RemoveContainer" containerID="e1b1fd4b3db27067c4dccebfd8a91b37bf7a9efa2f91b05418fdcf5532a257e7" Jan 22 14:03:03 crc kubenswrapper[4743]: I0122 14:03:03.785247 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-9z96h" Jan 22 14:03:03 crc kubenswrapper[4743]: I0122 14:03:03.789159 4743 generic.go:334] "Generic (PLEG): container finished" podID="a474b98d-9569-40f4-a3d2-f4017988678b" containerID="25dd80b2b3b0b22c35f8b49855a475925da682d5848b168ae1fcc4b0bdc3e10b" exitCode=0 Jan 22 14:03:03 crc kubenswrapper[4743]: I0122 14:03:03.789210 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a474b98d-9569-40f4-a3d2-f4017988678b","Type":"ContainerDied","Data":"25dd80b2b3b0b22c35f8b49855a475925da682d5848b168ae1fcc4b0bdc3e10b"} Jan 22 14:03:03 crc kubenswrapper[4743]: I0122 14:03:03.794390 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-4jnjh" event={"ID":"cd315b7b-42c7-4482-b739-5b003bf02430","Type":"ContainerStarted","Data":"d1a9a9135ea3be885bb0b3b6f71c51751367a01b996f7ba1d642c273cb78ef95"} Jan 22 14:03:03 crc kubenswrapper[4743]: I0122 14:03:03.794689 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-4jnjh" Jan 22 14:03:03 crc kubenswrapper[4743]: I0122 14:03:03.796436 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-2n75z" event={"ID":"6d653249-72a8-413a-b835-091258593f30","Type":"ContainerDied","Data":"eabf0e5bb93369f83fcd1fb9ceb480560bad608ad7fb157639f91e4c97183ef1"} Jan 22 14:03:03 crc kubenswrapper[4743]: I0122 14:03:03.796583 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-2n75z" Jan 22 14:03:03 crc kubenswrapper[4743]: I0122 14:03:03.812552 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7f896c8c65-c8xvw" podStartSLOduration=5.812532114 podStartE2EDuration="5.812532114s" podCreationTimestamp="2026-01-22 14:02:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:03:03.798863425 +0000 UTC m=+1020.353906598" watchObservedRunningTime="2026-01-22 14:03:03.812532114 +0000 UTC m=+1020.367575267" Jan 22 14:03:03 crc kubenswrapper[4743]: I0122 14:03:03.870462 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-4jnjh" podStartSLOduration=5.870439489 podStartE2EDuration="5.870439489s" podCreationTimestamp="2026-01-22 14:02:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:03:03.851339563 +0000 UTC m=+1020.406382736" watchObservedRunningTime="2026-01-22 14:03:03.870439489 +0000 UTC m=+1020.425482662" Jan 22 14:03:03 crc kubenswrapper[4743]: I0122 14:03:03.962989 4743 scope.go:117] "RemoveContainer" containerID="ec04004ad847a5553ef8f1383480d80970d47fb857c214b72593295e6771f653" Jan 22 14:03:03 crc kubenswrapper[4743]: I0122 14:03:03.997110 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Jan 22 14:03:04 crc kubenswrapper[4743]: I0122 14:03:04.015693 4743 scope.go:117] "RemoveContainer" containerID="807f6f5a806c84f3cb5f8ecdd90c1fdfbaac9fd97e626baec9ddc502ae8ab090" Jan 22 14:03:04 crc kubenswrapper[4743]: I0122 14:03:04.041867 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-9z96h"] Jan 22 14:03:04 crc kubenswrapper[4743]: I0122 14:03:04.056367 4743 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-9z96h"] Jan 22 14:03:04 crc kubenswrapper[4743]: I0122 14:03:04.073623 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-2n75z"] Jan 22 14:03:04 crc kubenswrapper[4743]: I0122 14:03:04.086001 4743 scope.go:117] "RemoveContainer" containerID="b0d86d09e5373b6c1236ad84ddeb400ccbb223851fa917648e90b7fa6ed84d89" Jan 22 14:03:04 crc kubenswrapper[4743]: I0122 14:03:04.095484 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-2n75z"] Jan 22 14:03:04 crc kubenswrapper[4743]: I0122 14:03:04.123020 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-c8xvw"] Jan 22 14:03:04 crc kubenswrapper[4743]: I0122 14:03:04.187760 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-kvm68"] Jan 22 14:03:04 crc kubenswrapper[4743]: E0122 14:03:04.188221 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d653249-72a8-413a-b835-091258593f30" containerName="dnsmasq-dns" Jan 22 14:03:04 crc kubenswrapper[4743]: I0122 14:03:04.188242 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d653249-72a8-413a-b835-091258593f30" containerName="dnsmasq-dns" Jan 22 14:03:04 crc kubenswrapper[4743]: E0122 14:03:04.188303 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a598ff96-d072-4440-9fe9-ee99366ccc81" containerName="dnsmasq-dns" Jan 22 14:03:04 crc kubenswrapper[4743]: I0122 14:03:04.188313 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="a598ff96-d072-4440-9fe9-ee99366ccc81" containerName="dnsmasq-dns" Jan 22 14:03:04 crc kubenswrapper[4743]: E0122 14:03:04.188332 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d653249-72a8-413a-b835-091258593f30" containerName="init" Jan 22 14:03:04 crc kubenswrapper[4743]: I0122 14:03:04.188340 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d653249-72a8-413a-b835-091258593f30" containerName="init" Jan 22 14:03:04 crc kubenswrapper[4743]: E0122 14:03:04.188370 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a598ff96-d072-4440-9fe9-ee99366ccc81" containerName="init" Jan 22 14:03:04 crc kubenswrapper[4743]: I0122 14:03:04.188378 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="a598ff96-d072-4440-9fe9-ee99366ccc81" containerName="init" Jan 22 14:03:04 crc kubenswrapper[4743]: I0122 14:03:04.188558 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="a598ff96-d072-4440-9fe9-ee99366ccc81" containerName="dnsmasq-dns" Jan 22 14:03:04 crc kubenswrapper[4743]: I0122 14:03:04.188588 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d653249-72a8-413a-b835-091258593f30" containerName="dnsmasq-dns" Jan 22 14:03:04 crc kubenswrapper[4743]: I0122 14:03:04.191660 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-kvm68" Jan 22 14:03:04 crc kubenswrapper[4743]: I0122 14:03:04.266933 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-kvm68"] Jan 22 14:03:04 crc kubenswrapper[4743]: I0122 14:03:04.345639 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/79b9f413-5078-4cb7-9515-a4d1b2d4bcbe-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-kvm68\" (UID: \"79b9f413-5078-4cb7-9515-a4d1b2d4bcbe\") " pod="openstack/dnsmasq-dns-698758b865-kvm68" Jan 22 14:03:04 crc kubenswrapper[4743]: I0122 14:03:04.345680 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/79b9f413-5078-4cb7-9515-a4d1b2d4bcbe-dns-svc\") pod \"dnsmasq-dns-698758b865-kvm68\" (UID: \"79b9f413-5078-4cb7-9515-a4d1b2d4bcbe\") " pod="openstack/dnsmasq-dns-698758b865-kvm68" Jan 22 14:03:04 crc kubenswrapper[4743]: I0122 14:03:04.345723 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmzsm\" (UniqueName: \"kubernetes.io/projected/79b9f413-5078-4cb7-9515-a4d1b2d4bcbe-kube-api-access-zmzsm\") pod \"dnsmasq-dns-698758b865-kvm68\" (UID: \"79b9f413-5078-4cb7-9515-a4d1b2d4bcbe\") " pod="openstack/dnsmasq-dns-698758b865-kvm68" Jan 22 14:03:04 crc kubenswrapper[4743]: I0122 14:03:04.345745 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79b9f413-5078-4cb7-9515-a4d1b2d4bcbe-config\") pod \"dnsmasq-dns-698758b865-kvm68\" (UID: \"79b9f413-5078-4cb7-9515-a4d1b2d4bcbe\") " pod="openstack/dnsmasq-dns-698758b865-kvm68" Jan 22 14:03:04 crc kubenswrapper[4743]: I0122 14:03:04.345872 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/79b9f413-5078-4cb7-9515-a4d1b2d4bcbe-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-kvm68\" (UID: \"79b9f413-5078-4cb7-9515-a4d1b2d4bcbe\") " pod="openstack/dnsmasq-dns-698758b865-kvm68" Jan 22 14:03:04 crc kubenswrapper[4743]: I0122 14:03:04.447095 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/79b9f413-5078-4cb7-9515-a4d1b2d4bcbe-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-kvm68\" (UID: \"79b9f413-5078-4cb7-9515-a4d1b2d4bcbe\") " pod="openstack/dnsmasq-dns-698758b865-kvm68" Jan 22 14:03:04 crc kubenswrapper[4743]: I0122 14:03:04.447232 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/79b9f413-5078-4cb7-9515-a4d1b2d4bcbe-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-kvm68\" (UID: \"79b9f413-5078-4cb7-9515-a4d1b2d4bcbe\") " pod="openstack/dnsmasq-dns-698758b865-kvm68" Jan 22 14:03:04 crc kubenswrapper[4743]: I0122 14:03:04.447255 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/79b9f413-5078-4cb7-9515-a4d1b2d4bcbe-dns-svc\") pod \"dnsmasq-dns-698758b865-kvm68\" (UID: \"79b9f413-5078-4cb7-9515-a4d1b2d4bcbe\") " pod="openstack/dnsmasq-dns-698758b865-kvm68" Jan 22 14:03:04 crc kubenswrapper[4743]: I0122 14:03:04.447281 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-zmzsm\" (UniqueName: \"kubernetes.io/projected/79b9f413-5078-4cb7-9515-a4d1b2d4bcbe-kube-api-access-zmzsm\") pod \"dnsmasq-dns-698758b865-kvm68\" (UID: \"79b9f413-5078-4cb7-9515-a4d1b2d4bcbe\") " pod="openstack/dnsmasq-dns-698758b865-kvm68" Jan 22 14:03:04 crc kubenswrapper[4743]: I0122 14:03:04.447302 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79b9f413-5078-4cb7-9515-a4d1b2d4bcbe-config\") pod \"dnsmasq-dns-698758b865-kvm68\" (UID: \"79b9f413-5078-4cb7-9515-a4d1b2d4bcbe\") " pod="openstack/dnsmasq-dns-698758b865-kvm68" Jan 22 14:03:04 crc kubenswrapper[4743]: I0122 14:03:04.448247 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/79b9f413-5078-4cb7-9515-a4d1b2d4bcbe-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-kvm68\" (UID: \"79b9f413-5078-4cb7-9515-a4d1b2d4bcbe\") " pod="openstack/dnsmasq-dns-698758b865-kvm68" Jan 22 14:03:04 crc kubenswrapper[4743]: I0122 14:03:04.448372 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/79b9f413-5078-4cb7-9515-a4d1b2d4bcbe-dns-svc\") pod \"dnsmasq-dns-698758b865-kvm68\" (UID: \"79b9f413-5078-4cb7-9515-a4d1b2d4bcbe\") " pod="openstack/dnsmasq-dns-698758b865-kvm68" Jan 22 14:03:04 crc kubenswrapper[4743]: I0122 14:03:04.448474 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/79b9f413-5078-4cb7-9515-a4d1b2d4bcbe-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-kvm68\" (UID: \"79b9f413-5078-4cb7-9515-a4d1b2d4bcbe\") " pod="openstack/dnsmasq-dns-698758b865-kvm68" Jan 22 14:03:04 crc kubenswrapper[4743]: I0122 14:03:04.451180 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79b9f413-5078-4cb7-9515-a4d1b2d4bcbe-config\") pod \"dnsmasq-dns-698758b865-kvm68\" (UID: \"79b9f413-5078-4cb7-9515-a4d1b2d4bcbe\") " pod="openstack/dnsmasq-dns-698758b865-kvm68" Jan 22 14:03:04 crc kubenswrapper[4743]: I0122 14:03:04.465920 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmzsm\" (UniqueName: \"kubernetes.io/projected/79b9f413-5078-4cb7-9515-a4d1b2d4bcbe-kube-api-access-zmzsm\") pod \"dnsmasq-dns-698758b865-kvm68\" (UID: \"79b9f413-5078-4cb7-9515-a4d1b2d4bcbe\") " pod="openstack/dnsmasq-dns-698758b865-kvm68" Jan 22 14:03:04 crc kubenswrapper[4743]: I0122 14:03:04.559207 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-kvm68" Jan 22 14:03:04 crc kubenswrapper[4743]: I0122 14:03:04.805731 4743 generic.go:334] "Generic (PLEG): container finished" podID="7926697b-86b3-4f82-97e1-3c0d7ae9f867" containerID="1394518f22ac074a6608a46a959c477744a1809c57329177f4b268b4e3de2be8" exitCode=0 Jan 22 14:03:04 crc kubenswrapper[4743]: I0122 14:03:04.806906 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7926697b-86b3-4f82-97e1-3c0d7ae9f867","Type":"ContainerDied","Data":"1394518f22ac074a6608a46a959c477744a1809c57329177f4b268b4e3de2be8"} Jan 22 14:03:05 crc kubenswrapper[4743]: I0122 14:03:05.068603 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-kvm68"] Jan 22 14:03:05 crc kubenswrapper[4743]: W0122 14:03:05.072242 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79b9f413_5078_4cb7_9515_a4d1b2d4bcbe.slice/crio-0a104f73fd0489f44f500be7d4981a8eb5712541760c1262764ed6d49887d0ca WatchSource:0}: Error finding container 0a104f73fd0489f44f500be7d4981a8eb5712541760c1262764ed6d49887d0ca: Status 404 returned error can't find the container with id 0a104f73fd0489f44f500be7d4981a8eb5712541760c1262764ed6d49887d0ca Jan 22 14:03:05 crc kubenswrapper[4743]: I0122 14:03:05.214298 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Jan 22 14:03:05 crc kubenswrapper[4743]: I0122 14:03:05.220402 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Jan 22 14:03:05 crc kubenswrapper[4743]: I0122 14:03:05.223154 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Jan 22 14:03:05 crc kubenswrapper[4743]: I0122 14:03:05.223613 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Jan 22 14:03:05 crc kubenswrapper[4743]: I0122 14:03:05.224074 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Jan 22 14:03:05 crc kubenswrapper[4743]: I0122 14:03:05.224133 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-gc55h" Jan 22 14:03:05 crc kubenswrapper[4743]: I0122 14:03:05.239406 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Jan 22 14:03:05 crc kubenswrapper[4743]: I0122 14:03:05.264288 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/338e196f-7c64-4cbd-b058-768ccb4c5df9-lock\") pod \"swift-storage-0\" (UID: \"338e196f-7c64-4cbd-b058-768ccb4c5df9\") " pod="openstack/swift-storage-0" Jan 22 14:03:05 crc kubenswrapper[4743]: I0122 14:03:05.264379 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/338e196f-7c64-4cbd-b058-768ccb4c5df9-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"338e196f-7c64-4cbd-b058-768ccb4c5df9\") " pod="openstack/swift-storage-0" Jan 22 14:03:05 crc kubenswrapper[4743]: I0122 14:03:05.264417 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/338e196f-7c64-4cbd-b058-768ccb4c5df9-etc-swift\") pod \"swift-storage-0\" (UID: \"338e196f-7c64-4cbd-b058-768ccb4c5df9\") " 
pod="openstack/swift-storage-0" Jan 22 14:03:05 crc kubenswrapper[4743]: I0122 14:03:05.264475 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpfnm\" (UniqueName: \"kubernetes.io/projected/338e196f-7c64-4cbd-b058-768ccb4c5df9-kube-api-access-fpfnm\") pod \"swift-storage-0\" (UID: \"338e196f-7c64-4cbd-b058-768ccb4c5df9\") " pod="openstack/swift-storage-0" Jan 22 14:03:05 crc kubenswrapper[4743]: I0122 14:03:05.264549 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/338e196f-7c64-4cbd-b058-768ccb4c5df9-cache\") pod \"swift-storage-0\" (UID: \"338e196f-7c64-4cbd-b058-768ccb4c5df9\") " pod="openstack/swift-storage-0" Jan 22 14:03:05 crc kubenswrapper[4743]: I0122 14:03:05.264638 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"338e196f-7c64-4cbd-b058-768ccb4c5df9\") " pod="openstack/swift-storage-0" Jan 22 14:03:05 crc kubenswrapper[4743]: I0122 14:03:05.366522 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"338e196f-7c64-4cbd-b058-768ccb4c5df9\") " pod="openstack/swift-storage-0" Jan 22 14:03:05 crc kubenswrapper[4743]: I0122 14:03:05.366597 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/338e196f-7c64-4cbd-b058-768ccb4c5df9-lock\") pod \"swift-storage-0\" (UID: \"338e196f-7c64-4cbd-b058-768ccb4c5df9\") " pod="openstack/swift-storage-0" Jan 22 14:03:05 crc kubenswrapper[4743]: I0122 14:03:05.366622 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/338e196f-7c64-4cbd-b058-768ccb4c5df9-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"338e196f-7c64-4cbd-b058-768ccb4c5df9\") " pod="openstack/swift-storage-0" Jan 22 14:03:05 crc kubenswrapper[4743]: I0122 14:03:05.366649 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/338e196f-7c64-4cbd-b058-768ccb4c5df9-etc-swift\") pod \"swift-storage-0\" (UID: \"338e196f-7c64-4cbd-b058-768ccb4c5df9\") " pod="openstack/swift-storage-0" Jan 22 14:03:05 crc kubenswrapper[4743]: I0122 14:03:05.366676 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpfnm\" (UniqueName: \"kubernetes.io/projected/338e196f-7c64-4cbd-b058-768ccb4c5df9-kube-api-access-fpfnm\") pod \"swift-storage-0\" (UID: \"338e196f-7c64-4cbd-b058-768ccb4c5df9\") " pod="openstack/swift-storage-0" Jan 22 14:03:05 crc kubenswrapper[4743]: I0122 14:03:05.366716 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/338e196f-7c64-4cbd-b058-768ccb4c5df9-cache\") pod \"swift-storage-0\" (UID: \"338e196f-7c64-4cbd-b058-768ccb4c5df9\") " pod="openstack/swift-storage-0" Jan 22 14:03:05 crc kubenswrapper[4743]: I0122 14:03:05.367277 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/338e196f-7c64-4cbd-b058-768ccb4c5df9-cache\") pod \"swift-storage-0\" (UID: 
\"338e196f-7c64-4cbd-b058-768ccb4c5df9\") " pod="openstack/swift-storage-0" Jan 22 14:03:05 crc kubenswrapper[4743]: E0122 14:03:05.367409 4743 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 22 14:03:05 crc kubenswrapper[4743]: E0122 14:03:05.367457 4743 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 22 14:03:05 crc kubenswrapper[4743]: E0122 14:03:05.367535 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/338e196f-7c64-4cbd-b058-768ccb4c5df9-etc-swift podName:338e196f-7c64-4cbd-b058-768ccb4c5df9 nodeName:}" failed. No retries permitted until 2026-01-22 14:03:05.867507171 +0000 UTC m=+1022.422550324 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/338e196f-7c64-4cbd-b058-768ccb4c5df9-etc-swift") pod "swift-storage-0" (UID: "338e196f-7c64-4cbd-b058-768ccb4c5df9") : configmap "swift-ring-files" not found Jan 22 14:03:05 crc kubenswrapper[4743]: I0122 14:03:05.367531 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"338e196f-7c64-4cbd-b058-768ccb4c5df9\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/swift-storage-0" Jan 22 14:03:05 crc kubenswrapper[4743]: I0122 14:03:05.368085 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/338e196f-7c64-4cbd-b058-768ccb4c5df9-lock\") pod \"swift-storage-0\" (UID: \"338e196f-7c64-4cbd-b058-768ccb4c5df9\") " pod="openstack/swift-storage-0" Jan 22 14:03:05 crc kubenswrapper[4743]: I0122 14:03:05.373594 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/338e196f-7c64-4cbd-b058-768ccb4c5df9-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"338e196f-7c64-4cbd-b058-768ccb4c5df9\") " pod="openstack/swift-storage-0" Jan 22 14:03:05 crc kubenswrapper[4743]: I0122 14:03:05.385575 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpfnm\" (UniqueName: \"kubernetes.io/projected/338e196f-7c64-4cbd-b058-768ccb4c5df9-kube-api-access-fpfnm\") pod \"swift-storage-0\" (UID: \"338e196f-7c64-4cbd-b058-768ccb4c5df9\") " pod="openstack/swift-storage-0" Jan 22 14:03:05 crc kubenswrapper[4743]: I0122 14:03:05.391636 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"338e196f-7c64-4cbd-b058-768ccb4c5df9\") " pod="openstack/swift-storage-0" Jan 22 14:03:05 crc kubenswrapper[4743]: I0122 14:03:05.756494 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d653249-72a8-413a-b835-091258593f30" path="/var/lib/kubelet/pods/6d653249-72a8-413a-b835-091258593f30/volumes" Jan 22 14:03:05 crc kubenswrapper[4743]: I0122 14:03:05.757460 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a598ff96-d072-4440-9fe9-ee99366ccc81" path="/var/lib/kubelet/pods/a598ff96-d072-4440-9fe9-ee99366ccc81/volumes" Jan 22 14:03:05 crc kubenswrapper[4743]: I0122 14:03:05.815910 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"a474b98d-9569-40f4-a3d2-f4017988678b","Type":"ContainerStarted","Data":"76b9e7346567553a710a187cdaad499c9f2b1a58c5fba084587f9b361cba7e40"} Jan 22 14:03:05 crc kubenswrapper[4743]: I0122 14:03:05.817143 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:03:05 crc kubenswrapper[4743]: I0122 14:03:05.818125 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7926697b-86b3-4f82-97e1-3c0d7ae9f867","Type":"ContainerStarted","Data":"88ad177103856f694614a6b9e5f6bae1c7f5ed12538013f4ace91ef50e5f1850"} Jan 22 14:03:05 crc kubenswrapper[4743]: I0122 14:03:05.818344 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Jan 22 14:03:05 crc kubenswrapper[4743]: I0122 14:03:05.819900 4743 generic.go:334] "Generic (PLEG): container finished" podID="79b9f413-5078-4cb7-9515-a4d1b2d4bcbe" containerID="f1deba13d96d7c69a7d9bc6b56ac1f77743f928878db2848b1fabff6044bb634" exitCode=0 Jan 22 14:03:05 crc kubenswrapper[4743]: I0122 14:03:05.820381 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7f896c8c65-c8xvw" podUID="3bf57027-9e60-4f15-a193-f5278a8cb568" containerName="dnsmasq-dns" containerID="cri-o://58fb735501c2d0622c45927ac3571cdd6b7db9d889e9050bc3ec744919de1196" gracePeriod=10 Jan 22 14:03:05 crc kubenswrapper[4743]: I0122 14:03:05.821586 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-kvm68" event={"ID":"79b9f413-5078-4cb7-9515-a4d1b2d4bcbe","Type":"ContainerDied","Data":"f1deba13d96d7c69a7d9bc6b56ac1f77743f928878db2848b1fabff6044bb634"} Jan 22 14:03:05 crc kubenswrapper[4743]: I0122 14:03:05.821658 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-kvm68" event={"ID":"79b9f413-5078-4cb7-9515-a4d1b2d4bcbe","Type":"ContainerStarted","Data":"0a104f73fd0489f44f500be7d4981a8eb5712541760c1262764ed6d49887d0ca"} Jan 22 14:03:05 crc kubenswrapper[4743]: I0122 14:03:05.848275 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=38.231065763 podStartE2EDuration="1m8.848239089s" podCreationTimestamp="2026-01-22 14:01:57 +0000 UTC" firstStartedPulling="2026-01-22 14:01:59.488377855 +0000 UTC m=+956.043421018" lastFinishedPulling="2026-01-22 14:02:30.105551181 +0000 UTC m=+986.660594344" observedRunningTime="2026-01-22 14:03:05.836333458 +0000 UTC m=+1022.391376641" watchObservedRunningTime="2026-01-22 14:03:05.848239089 +0000 UTC m=+1022.403282252" Jan 22 14:03:05 crc kubenswrapper[4743]: I0122 14:03:05.877011 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/338e196f-7c64-4cbd-b058-768ccb4c5df9-etc-swift\") pod \"swift-storage-0\" (UID: \"338e196f-7c64-4cbd-b058-768ccb4c5df9\") " pod="openstack/swift-storage-0" Jan 22 14:03:05 crc kubenswrapper[4743]: E0122 14:03:05.879165 4743 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 22 14:03:05 crc kubenswrapper[4743]: E0122 14:03:05.879191 4743 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 22 14:03:05 crc kubenswrapper[4743]: E0122 14:03:05.879240 4743 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/338e196f-7c64-4cbd-b058-768ccb4c5df9-etc-swift podName:338e196f-7c64-4cbd-b058-768ccb4c5df9 nodeName:}" failed. No retries permitted until 2026-01-22 14:03:06.879218176 +0000 UTC m=+1023.434261339 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/338e196f-7c64-4cbd-b058-768ccb4c5df9-etc-swift") pod "swift-storage-0" (UID: "338e196f-7c64-4cbd-b058-768ccb4c5df9") : configmap "swift-ring-files" not found Jan 22 14:03:05 crc kubenswrapper[4743]: I0122 14:03:05.911815 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=38.20548001 podStartE2EDuration="1m8.911769136s" podCreationTimestamp="2026-01-22 14:01:57 +0000 UTC" firstStartedPulling="2026-01-22 14:01:59.420642143 +0000 UTC m=+955.975685306" lastFinishedPulling="2026-01-22 14:02:30.126931269 +0000 UTC m=+986.681974432" observedRunningTime="2026-01-22 14:03:05.907923772 +0000 UTC m=+1022.462966945" watchObservedRunningTime="2026-01-22 14:03:05.911769136 +0000 UTC m=+1022.466812299" Jan 22 14:03:06 crc kubenswrapper[4743]: I0122 14:03:06.234584 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-c8xvw" Jan 22 14:03:06 crc kubenswrapper[4743]: I0122 14:03:06.384834 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3bf57027-9e60-4f15-a193-f5278a8cb568-dns-svc\") pod \"3bf57027-9e60-4f15-a193-f5278a8cb568\" (UID: \"3bf57027-9e60-4f15-a193-f5278a8cb568\") " Jan 22 14:03:06 crc kubenswrapper[4743]: I0122 14:03:06.384917 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p45n5\" (UniqueName: \"kubernetes.io/projected/3bf57027-9e60-4f15-a193-f5278a8cb568-kube-api-access-p45n5\") pod \"3bf57027-9e60-4f15-a193-f5278a8cb568\" (UID: \"3bf57027-9e60-4f15-a193-f5278a8cb568\") " Jan 22 14:03:06 crc kubenswrapper[4743]: I0122 14:03:06.385129 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3bf57027-9e60-4f15-a193-f5278a8cb568-ovsdbserver-sb\") pod \"3bf57027-9e60-4f15-a193-f5278a8cb568\" (UID: \"3bf57027-9e60-4f15-a193-f5278a8cb568\") " Jan 22 14:03:06 crc kubenswrapper[4743]: I0122 14:03:06.385169 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bf57027-9e60-4f15-a193-f5278a8cb568-config\") pod \"3bf57027-9e60-4f15-a193-f5278a8cb568\" (UID: \"3bf57027-9e60-4f15-a193-f5278a8cb568\") " Jan 22 14:03:06 crc kubenswrapper[4743]: I0122 14:03:06.395163 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bf57027-9e60-4f15-a193-f5278a8cb568-kube-api-access-p45n5" (OuterVolumeSpecName: "kube-api-access-p45n5") pod "3bf57027-9e60-4f15-a193-f5278a8cb568" (UID: "3bf57027-9e60-4f15-a193-f5278a8cb568"). InnerVolumeSpecName "kube-api-access-p45n5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:03:06 crc kubenswrapper[4743]: I0122 14:03:06.430640 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bf57027-9e60-4f15-a193-f5278a8cb568-config" (OuterVolumeSpecName: "config") pod "3bf57027-9e60-4f15-a193-f5278a8cb568" (UID: "3bf57027-9e60-4f15-a193-f5278a8cb568"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:03:06 crc kubenswrapper[4743]: I0122 14:03:06.445072 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bf57027-9e60-4f15-a193-f5278a8cb568-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3bf57027-9e60-4f15-a193-f5278a8cb568" (UID: "3bf57027-9e60-4f15-a193-f5278a8cb568"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:03:06 crc kubenswrapper[4743]: I0122 14:03:06.447666 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bf57027-9e60-4f15-a193-f5278a8cb568-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3bf57027-9e60-4f15-a193-f5278a8cb568" (UID: "3bf57027-9e60-4f15-a193-f5278a8cb568"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:03:06 crc kubenswrapper[4743]: I0122 14:03:06.487180 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3bf57027-9e60-4f15-a193-f5278a8cb568-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:06 crc kubenswrapper[4743]: I0122 14:03:06.487446 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bf57027-9e60-4f15-a193-f5278a8cb568-config\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:06 crc kubenswrapper[4743]: I0122 14:03:06.487532 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3bf57027-9e60-4f15-a193-f5278a8cb568-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:06 crc kubenswrapper[4743]: I0122 14:03:06.487592 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p45n5\" (UniqueName: \"kubernetes.io/projected/3bf57027-9e60-4f15-a193-f5278a8cb568-kube-api-access-p45n5\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:06 crc kubenswrapper[4743]: I0122 14:03:06.833476 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-kvm68" event={"ID":"79b9f413-5078-4cb7-9515-a4d1b2d4bcbe","Type":"ContainerStarted","Data":"75992c52a7511e1d2053dba8c589a83081fd8a269b9b752a3790b2fda37df1b0"} Jan 22 14:03:06 crc kubenswrapper[4743]: I0122 14:03:06.833932 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-kvm68" Jan 22 14:03:06 crc kubenswrapper[4743]: I0122 14:03:06.837728 4743 generic.go:334] "Generic (PLEG): container finished" podID="3bf57027-9e60-4f15-a193-f5278a8cb568" containerID="58fb735501c2d0622c45927ac3571cdd6b7db9d889e9050bc3ec744919de1196" exitCode=0 Jan 22 14:03:06 crc kubenswrapper[4743]: I0122 14:03:06.838084 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-c8xvw" event={"ID":"3bf57027-9e60-4f15-a193-f5278a8cb568","Type":"ContainerDied","Data":"58fb735501c2d0622c45927ac3571cdd6b7db9d889e9050bc3ec744919de1196"} Jan 22 14:03:06 crc kubenswrapper[4743]: I0122 14:03:06.838134 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-c8xvw" event={"ID":"3bf57027-9e60-4f15-a193-f5278a8cb568","Type":"ContainerDied","Data":"0ed3361b56a1657f9c17f2390205c6d8e73eeb5d47aefd2cbba3fb63cf93557f"} Jan 22 14:03:06 crc kubenswrapper[4743]: I0122 14:03:06.838159 4743 scope.go:117] "RemoveContainer" containerID="58fb735501c2d0622c45927ac3571cdd6b7db9d889e9050bc3ec744919de1196" Jan 22 14:03:06 crc 
kubenswrapper[4743]: I0122 14:03:06.838341 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-c8xvw" Jan 22 14:03:06 crc kubenswrapper[4743]: I0122 14:03:06.861665 4743 scope.go:117] "RemoveContainer" containerID="27ac1b928e2a0d0edb854f46facceeef0169a9f8edeacbeec6841730be228c21" Jan 22 14:03:06 crc kubenswrapper[4743]: I0122 14:03:06.870713 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-kvm68" podStartSLOduration=2.870684276 podStartE2EDuration="2.870684276s" podCreationTimestamp="2026-01-22 14:03:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:03:06.856897444 +0000 UTC m=+1023.411940617" watchObservedRunningTime="2026-01-22 14:03:06.870684276 +0000 UTC m=+1023.425727439" Jan 22 14:03:06 crc kubenswrapper[4743]: I0122 14:03:06.879891 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-c8xvw"] Jan 22 14:03:06 crc kubenswrapper[4743]: I0122 14:03:06.889144 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-c8xvw"] Jan 22 14:03:06 crc kubenswrapper[4743]: I0122 14:03:06.891437 4743 scope.go:117] "RemoveContainer" containerID="58fb735501c2d0622c45927ac3571cdd6b7db9d889e9050bc3ec744919de1196" Jan 22 14:03:06 crc kubenswrapper[4743]: E0122 14:03:06.892032 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58fb735501c2d0622c45927ac3571cdd6b7db9d889e9050bc3ec744919de1196\": container with ID starting with 58fb735501c2d0622c45927ac3571cdd6b7db9d889e9050bc3ec744919de1196 not found: ID does not exist" containerID="58fb735501c2d0622c45927ac3571cdd6b7db9d889e9050bc3ec744919de1196" Jan 22 14:03:06 crc kubenswrapper[4743]: I0122 14:03:06.892067 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58fb735501c2d0622c45927ac3571cdd6b7db9d889e9050bc3ec744919de1196"} err="failed to get container status \"58fb735501c2d0622c45927ac3571cdd6b7db9d889e9050bc3ec744919de1196\": rpc error: code = NotFound desc = could not find container \"58fb735501c2d0622c45927ac3571cdd6b7db9d889e9050bc3ec744919de1196\": container with ID starting with 58fb735501c2d0622c45927ac3571cdd6b7db9d889e9050bc3ec744919de1196 not found: ID does not exist" Jan 22 14:03:06 crc kubenswrapper[4743]: I0122 14:03:06.892100 4743 scope.go:117] "RemoveContainer" containerID="27ac1b928e2a0d0edb854f46facceeef0169a9f8edeacbeec6841730be228c21" Jan 22 14:03:06 crc kubenswrapper[4743]: E0122 14:03:06.892455 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27ac1b928e2a0d0edb854f46facceeef0169a9f8edeacbeec6841730be228c21\": container with ID starting with 27ac1b928e2a0d0edb854f46facceeef0169a9f8edeacbeec6841730be228c21 not found: ID does not exist" containerID="27ac1b928e2a0d0edb854f46facceeef0169a9f8edeacbeec6841730be228c21" Jan 22 14:03:06 crc kubenswrapper[4743]: I0122 14:03:06.892503 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27ac1b928e2a0d0edb854f46facceeef0169a9f8edeacbeec6841730be228c21"} err="failed to get container status \"27ac1b928e2a0d0edb854f46facceeef0169a9f8edeacbeec6841730be228c21\": rpc error: code = NotFound desc = could not find container 
\"27ac1b928e2a0d0edb854f46facceeef0169a9f8edeacbeec6841730be228c21\": container with ID starting with 27ac1b928e2a0d0edb854f46facceeef0169a9f8edeacbeec6841730be228c21 not found: ID does not exist" Jan 22 14:03:06 crc kubenswrapper[4743]: I0122 14:03:06.894610 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/338e196f-7c64-4cbd-b058-768ccb4c5df9-etc-swift\") pod \"swift-storage-0\" (UID: \"338e196f-7c64-4cbd-b058-768ccb4c5df9\") " pod="openstack/swift-storage-0" Jan 22 14:03:06 crc kubenswrapper[4743]: E0122 14:03:06.895042 4743 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 22 14:03:06 crc kubenswrapper[4743]: E0122 14:03:06.895072 4743 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 22 14:03:06 crc kubenswrapper[4743]: E0122 14:03:06.895136 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/338e196f-7c64-4cbd-b058-768ccb4c5df9-etc-swift podName:338e196f-7c64-4cbd-b058-768ccb4c5df9 nodeName:}" failed. No retries permitted until 2026-01-22 14:03:08.895111917 +0000 UTC m=+1025.450155080 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/338e196f-7c64-4cbd-b058-768ccb4c5df9-etc-swift") pod "swift-storage-0" (UID: "338e196f-7c64-4cbd-b058-768ccb4c5df9") : configmap "swift-ring-files" not found Jan 22 14:03:07 crc kubenswrapper[4743]: I0122 14:03:07.759443 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bf57027-9e60-4f15-a193-f5278a8cb568" path="/var/lib/kubelet/pods/3bf57027-9e60-4f15-a193-f5278a8cb568/volumes" Jan 22 14:03:07 crc kubenswrapper[4743]: I0122 14:03:07.846711 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"3b07a577-785f-4720-919c-ef619448284a","Type":"ContainerStarted","Data":"0804ae38f62e4167c1c07121eb93319f9f39d9635c67c7ac2783cac9d3300229"} Jan 22 14:03:07 crc kubenswrapper[4743]: I0122 14:03:07.859219 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Jan 22 14:03:07 crc kubenswrapper[4743]: I0122 14:03:07.859260 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Jan 22 14:03:07 crc kubenswrapper[4743]: I0122 14:03:07.870021 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=25.225908474 podStartE2EDuration="1m1.870000859s" podCreationTimestamp="2026-01-22 14:02:06 +0000 UTC" firstStartedPulling="2026-01-22 14:02:30.569330332 +0000 UTC m=+987.124373495" lastFinishedPulling="2026-01-22 14:03:07.213422727 +0000 UTC m=+1023.768465880" observedRunningTime="2026-01-22 14:03:07.862780534 +0000 UTC m=+1024.417823697" watchObservedRunningTime="2026-01-22 14:03:07.870000859 +0000 UTC m=+1024.425044022" Jan 22 14:03:08 crc kubenswrapper[4743]: I0122 14:03:08.738031 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-4jnjh" Jan 22 14:03:08 crc kubenswrapper[4743]: I0122 14:03:08.930018 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/338e196f-7c64-4cbd-b058-768ccb4c5df9-etc-swift\") pod \"swift-storage-0\" (UID: 
\"338e196f-7c64-4cbd-b058-768ccb4c5df9\") " pod="openstack/swift-storage-0" Jan 22 14:03:08 crc kubenswrapper[4743]: E0122 14:03:08.930651 4743 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 22 14:03:08 crc kubenswrapper[4743]: E0122 14:03:08.930674 4743 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 22 14:03:08 crc kubenswrapper[4743]: E0122 14:03:08.930721 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/338e196f-7c64-4cbd-b058-768ccb4c5df9-etc-swift podName:338e196f-7c64-4cbd-b058-768ccb4c5df9 nodeName:}" failed. No retries permitted until 2026-01-22 14:03:12.930698469 +0000 UTC m=+1029.485741632 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/338e196f-7c64-4cbd-b058-768ccb4c5df9-etc-swift") pod "swift-storage-0" (UID: "338e196f-7c64-4cbd-b058-768ccb4c5df9") : configmap "swift-ring-files" not found Jan 22 14:03:09 crc kubenswrapper[4743]: I0122 14:03:09.071225 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-kg2fn"] Jan 22 14:03:09 crc kubenswrapper[4743]: E0122 14:03:09.071581 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bf57027-9e60-4f15-a193-f5278a8cb568" containerName="dnsmasq-dns" Jan 22 14:03:09 crc kubenswrapper[4743]: I0122 14:03:09.071597 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bf57027-9e60-4f15-a193-f5278a8cb568" containerName="dnsmasq-dns" Jan 22 14:03:09 crc kubenswrapper[4743]: E0122 14:03:09.071621 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bf57027-9e60-4f15-a193-f5278a8cb568" containerName="init" Jan 22 14:03:09 crc kubenswrapper[4743]: I0122 14:03:09.071627 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bf57027-9e60-4f15-a193-f5278a8cb568" containerName="init" Jan 22 14:03:09 crc kubenswrapper[4743]: I0122 14:03:09.071834 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bf57027-9e60-4f15-a193-f5278a8cb568" containerName="dnsmasq-dns" Jan 22 14:03:09 crc kubenswrapper[4743]: I0122 14:03:09.072335 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-kg2fn" Jan 22 14:03:09 crc kubenswrapper[4743]: I0122 14:03:09.075498 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Jan 22 14:03:09 crc kubenswrapper[4743]: I0122 14:03:09.075660 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Jan 22 14:03:09 crc kubenswrapper[4743]: I0122 14:03:09.075685 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Jan 22 14:03:09 crc kubenswrapper[4743]: I0122 14:03:09.086912 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-kg2fn"] Jan 22 14:03:09 crc kubenswrapper[4743]: I0122 14:03:09.233521 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56dff5fb-e22c-4045-b3c4-c75e018df046-combined-ca-bundle\") pod \"swift-ring-rebalance-kg2fn\" (UID: \"56dff5fb-e22c-4045-b3c4-c75e018df046\") " pod="openstack/swift-ring-rebalance-kg2fn" Jan 22 14:03:09 crc kubenswrapper[4743]: I0122 14:03:09.233592 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/56dff5fb-e22c-4045-b3c4-c75e018df046-swiftconf\") pod \"swift-ring-rebalance-kg2fn\" (UID: \"56dff5fb-e22c-4045-b3c4-c75e018df046\") " pod="openstack/swift-ring-rebalance-kg2fn" Jan 22 14:03:09 crc kubenswrapper[4743]: I0122 14:03:09.233613 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/56dff5fb-e22c-4045-b3c4-c75e018df046-etc-swift\") pod \"swift-ring-rebalance-kg2fn\" (UID: \"56dff5fb-e22c-4045-b3c4-c75e018df046\") " pod="openstack/swift-ring-rebalance-kg2fn" Jan 22 14:03:09 crc kubenswrapper[4743]: I0122 14:03:09.233647 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/56dff5fb-e22c-4045-b3c4-c75e018df046-scripts\") pod \"swift-ring-rebalance-kg2fn\" (UID: \"56dff5fb-e22c-4045-b3c4-c75e018df046\") " pod="openstack/swift-ring-rebalance-kg2fn" Jan 22 14:03:09 crc kubenswrapper[4743]: I0122 14:03:09.233949 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/56dff5fb-e22c-4045-b3c4-c75e018df046-dispersionconf\") pod \"swift-ring-rebalance-kg2fn\" (UID: \"56dff5fb-e22c-4045-b3c4-c75e018df046\") " pod="openstack/swift-ring-rebalance-kg2fn" Jan 22 14:03:09 crc kubenswrapper[4743]: I0122 14:03:09.234099 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/56dff5fb-e22c-4045-b3c4-c75e018df046-ring-data-devices\") pod \"swift-ring-rebalance-kg2fn\" (UID: \"56dff5fb-e22c-4045-b3c4-c75e018df046\") " pod="openstack/swift-ring-rebalance-kg2fn" Jan 22 14:03:09 crc kubenswrapper[4743]: I0122 14:03:09.234366 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7gf2\" (UniqueName: \"kubernetes.io/projected/56dff5fb-e22c-4045-b3c4-c75e018df046-kube-api-access-q7gf2\") pod \"swift-ring-rebalance-kg2fn\" (UID: \"56dff5fb-e22c-4045-b3c4-c75e018df046\") " pod="openstack/swift-ring-rebalance-kg2fn" Jan 22 
14:03:09 crc kubenswrapper[4743]: I0122 14:03:09.336427 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56dff5fb-e22c-4045-b3c4-c75e018df046-combined-ca-bundle\") pod \"swift-ring-rebalance-kg2fn\" (UID: \"56dff5fb-e22c-4045-b3c4-c75e018df046\") " pod="openstack/swift-ring-rebalance-kg2fn" Jan 22 14:03:09 crc kubenswrapper[4743]: I0122 14:03:09.336549 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/56dff5fb-e22c-4045-b3c4-c75e018df046-swiftconf\") pod \"swift-ring-rebalance-kg2fn\" (UID: \"56dff5fb-e22c-4045-b3c4-c75e018df046\") " pod="openstack/swift-ring-rebalance-kg2fn" Jan 22 14:03:09 crc kubenswrapper[4743]: I0122 14:03:09.336601 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/56dff5fb-e22c-4045-b3c4-c75e018df046-etc-swift\") pod \"swift-ring-rebalance-kg2fn\" (UID: \"56dff5fb-e22c-4045-b3c4-c75e018df046\") " pod="openstack/swift-ring-rebalance-kg2fn" Jan 22 14:03:09 crc kubenswrapper[4743]: I0122 14:03:09.336681 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/56dff5fb-e22c-4045-b3c4-c75e018df046-scripts\") pod \"swift-ring-rebalance-kg2fn\" (UID: \"56dff5fb-e22c-4045-b3c4-c75e018df046\") " pod="openstack/swift-ring-rebalance-kg2fn" Jan 22 14:03:09 crc kubenswrapper[4743]: I0122 14:03:09.336775 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/56dff5fb-e22c-4045-b3c4-c75e018df046-dispersionconf\") pod \"swift-ring-rebalance-kg2fn\" (UID: \"56dff5fb-e22c-4045-b3c4-c75e018df046\") " pod="openstack/swift-ring-rebalance-kg2fn" Jan 22 14:03:09 crc kubenswrapper[4743]: I0122 14:03:09.336905 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/56dff5fb-e22c-4045-b3c4-c75e018df046-ring-data-devices\") pod \"swift-ring-rebalance-kg2fn\" (UID: \"56dff5fb-e22c-4045-b3c4-c75e018df046\") " pod="openstack/swift-ring-rebalance-kg2fn" Jan 22 14:03:09 crc kubenswrapper[4743]: I0122 14:03:09.337022 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7gf2\" (UniqueName: \"kubernetes.io/projected/56dff5fb-e22c-4045-b3c4-c75e018df046-kube-api-access-q7gf2\") pod \"swift-ring-rebalance-kg2fn\" (UID: \"56dff5fb-e22c-4045-b3c4-c75e018df046\") " pod="openstack/swift-ring-rebalance-kg2fn" Jan 22 14:03:09 crc kubenswrapper[4743]: I0122 14:03:09.337249 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/56dff5fb-e22c-4045-b3c4-c75e018df046-etc-swift\") pod \"swift-ring-rebalance-kg2fn\" (UID: \"56dff5fb-e22c-4045-b3c4-c75e018df046\") " pod="openstack/swift-ring-rebalance-kg2fn" Jan 22 14:03:09 crc kubenswrapper[4743]: I0122 14:03:09.337517 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/56dff5fb-e22c-4045-b3c4-c75e018df046-scripts\") pod \"swift-ring-rebalance-kg2fn\" (UID: \"56dff5fb-e22c-4045-b3c4-c75e018df046\") " pod="openstack/swift-ring-rebalance-kg2fn" Jan 22 14:03:09 crc kubenswrapper[4743]: I0122 14:03:09.337617 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/56dff5fb-e22c-4045-b3c4-c75e018df046-ring-data-devices\") pod \"swift-ring-rebalance-kg2fn\" (UID: \"56dff5fb-e22c-4045-b3c4-c75e018df046\") " pod="openstack/swift-ring-rebalance-kg2fn" Jan 22 14:03:09 crc kubenswrapper[4743]: I0122 14:03:09.341816 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/56dff5fb-e22c-4045-b3c4-c75e018df046-swiftconf\") pod \"swift-ring-rebalance-kg2fn\" (UID: \"56dff5fb-e22c-4045-b3c4-c75e018df046\") " pod="openstack/swift-ring-rebalance-kg2fn" Jan 22 14:03:09 crc kubenswrapper[4743]: I0122 14:03:09.341772 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/56dff5fb-e22c-4045-b3c4-c75e018df046-dispersionconf\") pod \"swift-ring-rebalance-kg2fn\" (UID: \"56dff5fb-e22c-4045-b3c4-c75e018df046\") " pod="openstack/swift-ring-rebalance-kg2fn" Jan 22 14:03:09 crc kubenswrapper[4743]: I0122 14:03:09.342295 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56dff5fb-e22c-4045-b3c4-c75e018df046-combined-ca-bundle\") pod \"swift-ring-rebalance-kg2fn\" (UID: \"56dff5fb-e22c-4045-b3c4-c75e018df046\") " pod="openstack/swift-ring-rebalance-kg2fn" Jan 22 14:03:09 crc kubenswrapper[4743]: I0122 14:03:09.358695 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7gf2\" (UniqueName: \"kubernetes.io/projected/56dff5fb-e22c-4045-b3c4-c75e018df046-kube-api-access-q7gf2\") pod \"swift-ring-rebalance-kg2fn\" (UID: \"56dff5fb-e22c-4045-b3c4-c75e018df046\") " pod="openstack/swift-ring-rebalance-kg2fn" Jan 22 14:03:09 crc kubenswrapper[4743]: I0122 14:03:09.391227 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-kg2fn" Jan 22 14:03:09 crc kubenswrapper[4743]: I0122 14:03:09.833270 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-kg2fn"] Jan 22 14:03:09 crc kubenswrapper[4743]: W0122 14:03:09.840091 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod56dff5fb_e22c_4045_b3c4_c75e018df046.slice/crio-355253da609a1e81d50b12369dfdb48c9a6a5f1bb77eb3c966e33192cb80b4b4 WatchSource:0}: Error finding container 355253da609a1e81d50b12369dfdb48c9a6a5f1bb77eb3c966e33192cb80b4b4: Status 404 returned error can't find the container with id 355253da609a1e81d50b12369dfdb48c9a6a5f1bb77eb3c966e33192cb80b4b4 Jan 22 14:03:09 crc kubenswrapper[4743]: I0122 14:03:09.871360 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-kg2fn" event={"ID":"56dff5fb-e22c-4045-b3c4-c75e018df046","Type":"ContainerStarted","Data":"355253da609a1e81d50b12369dfdb48c9a6a5f1bb77eb3c966e33192cb80b4b4"} Jan 22 14:03:10 crc kubenswrapper[4743]: I0122 14:03:10.041654 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Jan 22 14:03:10 crc kubenswrapper[4743]: I0122 14:03:10.041693 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Jan 22 14:03:10 crc kubenswrapper[4743]: I0122 14:03:10.082923 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-d6mgf"] Jan 22 14:03:10 crc kubenswrapper[4743]: I0122 14:03:10.084217 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-d6mgf" Jan 22 14:03:10 crc kubenswrapper[4743]: I0122 14:03:10.087398 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Jan 22 14:03:10 crc kubenswrapper[4743]: I0122 14:03:10.097737 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-d6mgf"] Jan 22 14:03:10 crc kubenswrapper[4743]: I0122 14:03:10.158762 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v58zt\" (UniqueName: \"kubernetes.io/projected/41bb2a83-e99f-4f6c-9c46-10946d57790a-kube-api-access-v58zt\") pod \"root-account-create-update-d6mgf\" (UID: \"41bb2a83-e99f-4f6c-9c46-10946d57790a\") " pod="openstack/root-account-create-update-d6mgf" Jan 22 14:03:10 crc kubenswrapper[4743]: I0122 14:03:10.159065 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41bb2a83-e99f-4f6c-9c46-10946d57790a-operator-scripts\") pod \"root-account-create-update-d6mgf\" (UID: \"41bb2a83-e99f-4f6c-9c46-10946d57790a\") " pod="openstack/root-account-create-update-d6mgf" Jan 22 14:03:10 crc kubenswrapper[4743]: I0122 14:03:10.162558 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Jan 22 14:03:10 crc kubenswrapper[4743]: I0122 14:03:10.260580 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v58zt\" (UniqueName: \"kubernetes.io/projected/41bb2a83-e99f-4f6c-9c46-10946d57790a-kube-api-access-v58zt\") pod \"root-account-create-update-d6mgf\" (UID: \"41bb2a83-e99f-4f6c-9c46-10946d57790a\") " 
pod="openstack/root-account-create-update-d6mgf" Jan 22 14:03:10 crc kubenswrapper[4743]: I0122 14:03:10.260698 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41bb2a83-e99f-4f6c-9c46-10946d57790a-operator-scripts\") pod \"root-account-create-update-d6mgf\" (UID: \"41bb2a83-e99f-4f6c-9c46-10946d57790a\") " pod="openstack/root-account-create-update-d6mgf" Jan 22 14:03:10 crc kubenswrapper[4743]: I0122 14:03:10.261735 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41bb2a83-e99f-4f6c-9c46-10946d57790a-operator-scripts\") pod \"root-account-create-update-d6mgf\" (UID: \"41bb2a83-e99f-4f6c-9c46-10946d57790a\") " pod="openstack/root-account-create-update-d6mgf" Jan 22 14:03:10 crc kubenswrapper[4743]: I0122 14:03:10.277654 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v58zt\" (UniqueName: \"kubernetes.io/projected/41bb2a83-e99f-4f6c-9c46-10946d57790a-kube-api-access-v58zt\") pod \"root-account-create-update-d6mgf\" (UID: \"41bb2a83-e99f-4f6c-9c46-10946d57790a\") " pod="openstack/root-account-create-update-d6mgf" Jan 22 14:03:10 crc kubenswrapper[4743]: I0122 14:03:10.413102 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-d6mgf" Jan 22 14:03:10 crc kubenswrapper[4743]: I0122 14:03:10.910579 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-d6mgf"] Jan 22 14:03:10 crc kubenswrapper[4743]: I0122 14:03:10.955438 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Jan 22 14:03:11 crc kubenswrapper[4743]: I0122 14:03:11.027726 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Jan 22 14:03:11 crc kubenswrapper[4743]: I0122 14:03:11.381560 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-xwzhs"] Jan 22 14:03:11 crc kubenswrapper[4743]: I0122 14:03:11.383608 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-xwzhs" Jan 22 14:03:11 crc kubenswrapper[4743]: I0122 14:03:11.394839 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-xwzhs"] Jan 22 14:03:11 crc kubenswrapper[4743]: I0122 14:03:11.488600 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cc8cc\" (UniqueName: \"kubernetes.io/projected/e41c39cc-ec68-49e7-8144-d58dbccf371b-kube-api-access-cc8cc\") pod \"placement-db-create-xwzhs\" (UID: \"e41c39cc-ec68-49e7-8144-d58dbccf371b\") " pod="openstack/placement-db-create-xwzhs" Jan 22 14:03:11 crc kubenswrapper[4743]: I0122 14:03:11.488920 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e41c39cc-ec68-49e7-8144-d58dbccf371b-operator-scripts\") pod \"placement-db-create-xwzhs\" (UID: \"e41c39cc-ec68-49e7-8144-d58dbccf371b\") " pod="openstack/placement-db-create-xwzhs" Jan 22 14:03:11 crc kubenswrapper[4743]: I0122 14:03:11.506406 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-7c57-account-create-update-q694h"] Jan 22 14:03:11 crc kubenswrapper[4743]: I0122 14:03:11.507391 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7c57-account-create-update-q694h" Jan 22 14:03:11 crc kubenswrapper[4743]: I0122 14:03:11.509528 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Jan 22 14:03:11 crc kubenswrapper[4743]: I0122 14:03:11.520474 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7c57-account-create-update-q694h"] Jan 22 14:03:11 crc kubenswrapper[4743]: I0122 14:03:11.590535 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cc8cc\" (UniqueName: \"kubernetes.io/projected/e41c39cc-ec68-49e7-8144-d58dbccf371b-kube-api-access-cc8cc\") pod \"placement-db-create-xwzhs\" (UID: \"e41c39cc-ec68-49e7-8144-d58dbccf371b\") " pod="openstack/placement-db-create-xwzhs" Jan 22 14:03:11 crc kubenswrapper[4743]: I0122 14:03:11.590598 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7hlw\" (UniqueName: \"kubernetes.io/projected/bf7689c6-7604-44cf-86aa-a317e32537e3-kube-api-access-h7hlw\") pod \"placement-7c57-account-create-update-q694h\" (UID: \"bf7689c6-7604-44cf-86aa-a317e32537e3\") " pod="openstack/placement-7c57-account-create-update-q694h" Jan 22 14:03:11 crc kubenswrapper[4743]: I0122 14:03:11.590695 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf7689c6-7604-44cf-86aa-a317e32537e3-operator-scripts\") pod \"placement-7c57-account-create-update-q694h\" (UID: \"bf7689c6-7604-44cf-86aa-a317e32537e3\") " pod="openstack/placement-7c57-account-create-update-q694h" Jan 22 14:03:11 crc kubenswrapper[4743]: I0122 14:03:11.590745 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e41c39cc-ec68-49e7-8144-d58dbccf371b-operator-scripts\") pod \"placement-db-create-xwzhs\" (UID: \"e41c39cc-ec68-49e7-8144-d58dbccf371b\") " pod="openstack/placement-db-create-xwzhs" Jan 22 14:03:11 crc kubenswrapper[4743]: I0122 14:03:11.591561 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e41c39cc-ec68-49e7-8144-d58dbccf371b-operator-scripts\") pod \"placement-db-create-xwzhs\" (UID: \"e41c39cc-ec68-49e7-8144-d58dbccf371b\") " pod="openstack/placement-db-create-xwzhs" Jan 22 14:03:11 crc kubenswrapper[4743]: I0122 14:03:11.625354 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cc8cc\" (UniqueName: \"kubernetes.io/projected/e41c39cc-ec68-49e7-8144-d58dbccf371b-kube-api-access-cc8cc\") pod \"placement-db-create-xwzhs\" (UID: \"e41c39cc-ec68-49e7-8144-d58dbccf371b\") " pod="openstack/placement-db-create-xwzhs" Jan 22 14:03:11 crc kubenswrapper[4743]: I0122 14:03:11.655313 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-8dx2h"] Jan 22 14:03:11 crc kubenswrapper[4743]: I0122 14:03:11.656355 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-8dx2h" Jan 22 14:03:11 crc kubenswrapper[4743]: I0122 14:03:11.664733 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-8dx2h"] Jan 22 14:03:11 crc kubenswrapper[4743]: I0122 14:03:11.715835 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7hlw\" (UniqueName: \"kubernetes.io/projected/bf7689c6-7604-44cf-86aa-a317e32537e3-kube-api-access-h7hlw\") pod \"placement-7c57-account-create-update-q694h\" (UID: \"bf7689c6-7604-44cf-86aa-a317e32537e3\") " pod="openstack/placement-7c57-account-create-update-q694h" Jan 22 14:03:11 crc kubenswrapper[4743]: I0122 14:03:11.715918 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf7689c6-7604-44cf-86aa-a317e32537e3-operator-scripts\") pod \"placement-7c57-account-create-update-q694h\" (UID: \"bf7689c6-7604-44cf-86aa-a317e32537e3\") " pod="openstack/placement-7c57-account-create-update-q694h" Jan 22 14:03:11 crc kubenswrapper[4743]: I0122 14:03:11.716077 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcm5v\" (UniqueName: \"kubernetes.io/projected/d4db18a3-97d3-4f11-b1c3-f10626ae1fea-kube-api-access-pcm5v\") pod \"glance-db-create-8dx2h\" (UID: \"d4db18a3-97d3-4f11-b1c3-f10626ae1fea\") " pod="openstack/glance-db-create-8dx2h" Jan 22 14:03:11 crc kubenswrapper[4743]: I0122 14:03:11.716128 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4db18a3-97d3-4f11-b1c3-f10626ae1fea-operator-scripts\") pod \"glance-db-create-8dx2h\" (UID: \"d4db18a3-97d3-4f11-b1c3-f10626ae1fea\") " pod="openstack/glance-db-create-8dx2h" Jan 22 14:03:11 crc kubenswrapper[4743]: I0122 14:03:11.716339 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-xwzhs" Jan 22 14:03:11 crc kubenswrapper[4743]: I0122 14:03:11.716973 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf7689c6-7604-44cf-86aa-a317e32537e3-operator-scripts\") pod \"placement-7c57-account-create-update-q694h\" (UID: \"bf7689c6-7604-44cf-86aa-a317e32537e3\") " pod="openstack/placement-7c57-account-create-update-q694h" Jan 22 14:03:11 crc kubenswrapper[4743]: I0122 14:03:11.739506 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7hlw\" (UniqueName: \"kubernetes.io/projected/bf7689c6-7604-44cf-86aa-a317e32537e3-kube-api-access-h7hlw\") pod \"placement-7c57-account-create-update-q694h\" (UID: \"bf7689c6-7604-44cf-86aa-a317e32537e3\") " pod="openstack/placement-7c57-account-create-update-q694h" Jan 22 14:03:11 crc kubenswrapper[4743]: I0122 14:03:11.782421 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-0148-account-create-update-wmj9x"] Jan 22 14:03:11 crc kubenswrapper[4743]: I0122 14:03:11.783901 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-0148-account-create-update-wmj9x" Jan 22 14:03:11 crc kubenswrapper[4743]: I0122 14:03:11.790039 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-0148-account-create-update-wmj9x"] Jan 22 14:03:11 crc kubenswrapper[4743]: I0122 14:03:11.792550 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Jan 22 14:03:11 crc kubenswrapper[4743]: I0122 14:03:11.817875 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4db18a3-97d3-4f11-b1c3-f10626ae1fea-operator-scripts\") pod \"glance-db-create-8dx2h\" (UID: \"d4db18a3-97d3-4f11-b1c3-f10626ae1fea\") " pod="openstack/glance-db-create-8dx2h" Jan 22 14:03:11 crc kubenswrapper[4743]: I0122 14:03:11.818102 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcm5v\" (UniqueName: \"kubernetes.io/projected/d4db18a3-97d3-4f11-b1c3-f10626ae1fea-kube-api-access-pcm5v\") pod \"glance-db-create-8dx2h\" (UID: \"d4db18a3-97d3-4f11-b1c3-f10626ae1fea\") " pod="openstack/glance-db-create-8dx2h" Jan 22 14:03:11 crc kubenswrapper[4743]: I0122 14:03:11.818659 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4db18a3-97d3-4f11-b1c3-f10626ae1fea-operator-scripts\") pod \"glance-db-create-8dx2h\" (UID: \"d4db18a3-97d3-4f11-b1c3-f10626ae1fea\") " pod="openstack/glance-db-create-8dx2h" Jan 22 14:03:11 crc kubenswrapper[4743]: I0122 14:03:11.824002 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7c57-account-create-update-q694h" Jan 22 14:03:11 crc kubenswrapper[4743]: I0122 14:03:11.841636 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcm5v\" (UniqueName: \"kubernetes.io/projected/d4db18a3-97d3-4f11-b1c3-f10626ae1fea-kube-api-access-pcm5v\") pod \"glance-db-create-8dx2h\" (UID: \"d4db18a3-97d3-4f11-b1c3-f10626ae1fea\") " pod="openstack/glance-db-create-8dx2h" Jan 22 14:03:11 crc kubenswrapper[4743]: I0122 14:03:11.895757 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-d6mgf" event={"ID":"41bb2a83-e99f-4f6c-9c46-10946d57790a","Type":"ContainerStarted","Data":"99d925b658db1a220feca7152975bba687878df479c6ef533b3ce009a48a6da2"} Jan 22 14:03:11 crc kubenswrapper[4743]: I0122 14:03:11.896028 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-d6mgf" event={"ID":"41bb2a83-e99f-4f6c-9c46-10946d57790a","Type":"ContainerStarted","Data":"7173441eb68f61a2f8d7129f961cba805c1610395bc7a2f3c4b88a9d397ed78f"} Jan 22 14:03:11 crc kubenswrapper[4743]: I0122 14:03:11.912668 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-d6mgf" podStartSLOduration=1.912651133 podStartE2EDuration="1.912651133s" podCreationTimestamp="2026-01-22 14:03:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:03:11.910885545 +0000 UTC m=+1028.465928708" watchObservedRunningTime="2026-01-22 14:03:11.912651133 +0000 UTC m=+1028.467694296" Jan 22 14:03:11 crc kubenswrapper[4743]: I0122 14:03:11.919702 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrxvt\" 
(UniqueName: \"kubernetes.io/projected/4a3713e3-dd7a-4209-bda1-ce7bcb652e1a-kube-api-access-wrxvt\") pod \"glance-0148-account-create-update-wmj9x\" (UID: \"4a3713e3-dd7a-4209-bda1-ce7bcb652e1a\") " pod="openstack/glance-0148-account-create-update-wmj9x" Jan 22 14:03:11 crc kubenswrapper[4743]: I0122 14:03:11.919972 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a3713e3-dd7a-4209-bda1-ce7bcb652e1a-operator-scripts\") pod \"glance-0148-account-create-update-wmj9x\" (UID: \"4a3713e3-dd7a-4209-bda1-ce7bcb652e1a\") " pod="openstack/glance-0148-account-create-update-wmj9x" Jan 22 14:03:12 crc kubenswrapper[4743]: I0122 14:03:12.024679 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrxvt\" (UniqueName: \"kubernetes.io/projected/4a3713e3-dd7a-4209-bda1-ce7bcb652e1a-kube-api-access-wrxvt\") pod \"glance-0148-account-create-update-wmj9x\" (UID: \"4a3713e3-dd7a-4209-bda1-ce7bcb652e1a\") " pod="openstack/glance-0148-account-create-update-wmj9x" Jan 22 14:03:12 crc kubenswrapper[4743]: I0122 14:03:12.025115 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a3713e3-dd7a-4209-bda1-ce7bcb652e1a-operator-scripts\") pod \"glance-0148-account-create-update-wmj9x\" (UID: \"4a3713e3-dd7a-4209-bda1-ce7bcb652e1a\") " pod="openstack/glance-0148-account-create-update-wmj9x" Jan 22 14:03:12 crc kubenswrapper[4743]: I0122 14:03:12.026388 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a3713e3-dd7a-4209-bda1-ce7bcb652e1a-operator-scripts\") pod \"glance-0148-account-create-update-wmj9x\" (UID: \"4a3713e3-dd7a-4209-bda1-ce7bcb652e1a\") " pod="openstack/glance-0148-account-create-update-wmj9x" Jan 22 14:03:12 crc kubenswrapper[4743]: I0122 14:03:12.056978 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrxvt\" (UniqueName: \"kubernetes.io/projected/4a3713e3-dd7a-4209-bda1-ce7bcb652e1a-kube-api-access-wrxvt\") pod \"glance-0148-account-create-update-wmj9x\" (UID: \"4a3713e3-dd7a-4209-bda1-ce7bcb652e1a\") " pod="openstack/glance-0148-account-create-update-wmj9x" Jan 22 14:03:12 crc kubenswrapper[4743]: I0122 14:03:12.116683 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-8dx2h" Jan 22 14:03:12 crc kubenswrapper[4743]: I0122 14:03:12.133032 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-0148-account-create-update-wmj9x" Jan 22 14:03:12 crc kubenswrapper[4743]: I0122 14:03:12.258066 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-xwzhs"] Jan 22 14:03:12 crc kubenswrapper[4743]: W0122 14:03:12.277200 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode41c39cc_ec68_49e7_8144_d58dbccf371b.slice/crio-dd310b4bb8d0d8cfa808b8a279e2bbb78d2e0f81f87573fb0046df7f57c9b83b WatchSource:0}: Error finding container dd310b4bb8d0d8cfa808b8a279e2bbb78d2e0f81f87573fb0046df7f57c9b83b: Status 404 returned error can't find the container with id dd310b4bb8d0d8cfa808b8a279e2bbb78d2e0f81f87573fb0046df7f57c9b83b Jan 22 14:03:12 crc kubenswrapper[4743]: I0122 14:03:12.410826 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7c57-account-create-update-q694h"] Jan 22 14:03:12 crc kubenswrapper[4743]: I0122 14:03:12.493702 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-0148-account-create-update-wmj9x"] Jan 22 14:03:12 crc kubenswrapper[4743]: W0122 14:03:12.507599 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf7689c6_7604_44cf_86aa_a317e32537e3.slice/crio-3d787f790fbd980ae8c049bab60db0ae7642b959c921efb95961a8a77a60559f WatchSource:0}: Error finding container 3d787f790fbd980ae8c049bab60db0ae7642b959c921efb95961a8a77a60559f: Status 404 returned error can't find the container with id 3d787f790fbd980ae8c049bab60db0ae7642b959c921efb95961a8a77a60559f Jan 22 14:03:12 crc kubenswrapper[4743]: W0122 14:03:12.510497 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a3713e3_dd7a_4209_bda1_ce7bcb652e1a.slice/crio-be2b60f2a409fb014b383e93e22f8f26fb76f03dc25f9ce701d90eaaaefd42f2 WatchSource:0}: Error finding container be2b60f2a409fb014b383e93e22f8f26fb76f03dc25f9ce701d90eaaaefd42f2: Status 404 returned error can't find the container with id be2b60f2a409fb014b383e93e22f8f26fb76f03dc25f9ce701d90eaaaefd42f2 Jan 22 14:03:12 crc kubenswrapper[4743]: I0122 14:03:12.835244 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-8dx2h"] Jan 22 14:03:12 crc kubenswrapper[4743]: I0122 14:03:12.911962 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7c57-account-create-update-q694h" event={"ID":"bf7689c6-7604-44cf-86aa-a317e32537e3","Type":"ContainerStarted","Data":"fd7487ae1682bc8dc72c180745cd112dcf07dd8bb38751a1a686f56520f7e7cb"} Jan 22 14:03:12 crc kubenswrapper[4743]: I0122 14:03:12.912023 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7c57-account-create-update-q694h" event={"ID":"bf7689c6-7604-44cf-86aa-a317e32537e3","Type":"ContainerStarted","Data":"3d787f790fbd980ae8c049bab60db0ae7642b959c921efb95961a8a77a60559f"} Jan 22 14:03:12 crc kubenswrapper[4743]: I0122 14:03:12.914294 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-0148-account-create-update-wmj9x" event={"ID":"4a3713e3-dd7a-4209-bda1-ce7bcb652e1a","Type":"ContainerStarted","Data":"8322bd5586c64407f7ad3c207ed2fc535445bd76879430605bb66784313ff434"} Jan 22 14:03:12 crc kubenswrapper[4743]: I0122 14:03:12.914347 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-0148-account-create-update-wmj9x" 
event={"ID":"4a3713e3-dd7a-4209-bda1-ce7bcb652e1a","Type":"ContainerStarted","Data":"be2b60f2a409fb014b383e93e22f8f26fb76f03dc25f9ce701d90eaaaefd42f2"} Jan 22 14:03:12 crc kubenswrapper[4743]: I0122 14:03:12.915986 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-8dx2h" event={"ID":"d4db18a3-97d3-4f11-b1c3-f10626ae1fea","Type":"ContainerStarted","Data":"6db4014a44a539ed55b72154939456ea74ce12fa36111eb119b4607f06eded7e"} Jan 22 14:03:12 crc kubenswrapper[4743]: I0122 14:03:12.917750 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-xwzhs" event={"ID":"e41c39cc-ec68-49e7-8144-d58dbccf371b","Type":"ContainerStarted","Data":"ef7bec575cb03a7715870bb9ca0983afe34e8c417201961051b4a36393c800ed"} Jan 22 14:03:12 crc kubenswrapper[4743]: I0122 14:03:12.917855 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-xwzhs" event={"ID":"e41c39cc-ec68-49e7-8144-d58dbccf371b","Type":"ContainerStarted","Data":"dd310b4bb8d0d8cfa808b8a279e2bbb78d2e0f81f87573fb0046df7f57c9b83b"} Jan 22 14:03:12 crc kubenswrapper[4743]: I0122 14:03:12.920435 4743 generic.go:334] "Generic (PLEG): container finished" podID="41bb2a83-e99f-4f6c-9c46-10946d57790a" containerID="99d925b658db1a220feca7152975bba687878df479c6ef533b3ce009a48a6da2" exitCode=0 Jan 22 14:03:12 crc kubenswrapper[4743]: I0122 14:03:12.920488 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-d6mgf" event={"ID":"41bb2a83-e99f-4f6c-9c46-10946d57790a","Type":"ContainerDied","Data":"99d925b658db1a220feca7152975bba687878df479c6ef533b3ce009a48a6da2"} Jan 22 14:03:12 crc kubenswrapper[4743]: I0122 14:03:12.946310 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Jan 22 14:03:12 crc kubenswrapper[4743]: I0122 14:03:12.953419 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-7c57-account-create-update-q694h" podStartSLOduration=1.953403094 podStartE2EDuration="1.953403094s" podCreationTimestamp="2026-01-22 14:03:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:03:12.94735779 +0000 UTC m=+1029.502400953" watchObservedRunningTime="2026-01-22 14:03:12.953403094 +0000 UTC m=+1029.508446257" Jan 22 14:03:12 crc kubenswrapper[4743]: I0122 14:03:12.956594 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/338e196f-7c64-4cbd-b058-768ccb4c5df9-etc-swift\") pod \"swift-storage-0\" (UID: \"338e196f-7c64-4cbd-b058-768ccb4c5df9\") " pod="openstack/swift-storage-0" Jan 22 14:03:12 crc kubenswrapper[4743]: E0122 14:03:12.958494 4743 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 22 14:03:12 crc kubenswrapper[4743]: E0122 14:03:12.958536 4743 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 22 14:03:12 crc kubenswrapper[4743]: E0122 14:03:12.958604 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/338e196f-7c64-4cbd-b058-768ccb4c5df9-etc-swift podName:338e196f-7c64-4cbd-b058-768ccb4c5df9 nodeName:}" failed. No retries permitted until 2026-01-22 14:03:20.958579724 +0000 UTC m=+1037.513622947 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/338e196f-7c64-4cbd-b058-768ccb4c5df9-etc-swift") pod "swift-storage-0" (UID: "338e196f-7c64-4cbd-b058-768ccb4c5df9") : configmap "swift-ring-files" not found Jan 22 14:03:12 crc kubenswrapper[4743]: I0122 14:03:12.969746 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-0148-account-create-update-wmj9x" podStartSLOduration=1.969715434 podStartE2EDuration="1.969715434s" podCreationTimestamp="2026-01-22 14:03:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:03:12.966124447 +0000 UTC m=+1029.521167610" watchObservedRunningTime="2026-01-22 14:03:12.969715434 +0000 UTC m=+1029.524758597" Jan 22 14:03:13 crc kubenswrapper[4743]: I0122 14:03:13.055120 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-xwzhs" podStartSLOduration=2.055101482 podStartE2EDuration="2.055101482s" podCreationTimestamp="2026-01-22 14:03:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:03:13.052672766 +0000 UTC m=+1029.607715939" watchObservedRunningTime="2026-01-22 14:03:13.055101482 +0000 UTC m=+1029.610144645" Jan 22 14:03:13 crc kubenswrapper[4743]: I0122 14:03:13.160417 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Jan 22 14:03:13 crc kubenswrapper[4743]: I0122 14:03:13.162177 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Jan 22 14:03:13 crc kubenswrapper[4743]: I0122 14:03:13.164714 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-q4rsr" Jan 22 14:03:13 crc kubenswrapper[4743]: I0122 14:03:13.165542 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Jan 22 14:03:13 crc kubenswrapper[4743]: I0122 14:03:13.164888 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Jan 22 14:03:13 crc kubenswrapper[4743]: I0122 14:03:13.165399 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Jan 22 14:03:13 crc kubenswrapper[4743]: I0122 14:03:13.178104 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Jan 22 14:03:13 crc kubenswrapper[4743]: I0122 14:03:13.262595 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c926afa-42b3-4fc2-bc38-8ee725cd113b-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"5c926afa-42b3-4fc2-bc38-8ee725cd113b\") " pod="openstack/ovn-northd-0" Jan 22 14:03:13 crc kubenswrapper[4743]: I0122 14:03:13.262832 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hx295\" (UniqueName: \"kubernetes.io/projected/5c926afa-42b3-4fc2-bc38-8ee725cd113b-kube-api-access-hx295\") pod \"ovn-northd-0\" (UID: \"5c926afa-42b3-4fc2-bc38-8ee725cd113b\") " pod="openstack/ovn-northd-0" Jan 22 14:03:13 crc kubenswrapper[4743]: I0122 14:03:13.263021 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/5c926afa-42b3-4fc2-bc38-8ee725cd113b-scripts\") pod \"ovn-northd-0\" (UID: \"5c926afa-42b3-4fc2-bc38-8ee725cd113b\") " pod="openstack/ovn-northd-0" Jan 22 14:03:13 crc kubenswrapper[4743]: I0122 14:03:13.263194 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c926afa-42b3-4fc2-bc38-8ee725cd113b-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"5c926afa-42b3-4fc2-bc38-8ee725cd113b\") " pod="openstack/ovn-northd-0" Jan 22 14:03:13 crc kubenswrapper[4743]: I0122 14:03:13.263401 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c926afa-42b3-4fc2-bc38-8ee725cd113b-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"5c926afa-42b3-4fc2-bc38-8ee725cd113b\") " pod="openstack/ovn-northd-0" Jan 22 14:03:13 crc kubenswrapper[4743]: I0122 14:03:13.263691 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c926afa-42b3-4fc2-bc38-8ee725cd113b-config\") pod \"ovn-northd-0\" (UID: \"5c926afa-42b3-4fc2-bc38-8ee725cd113b\") " pod="openstack/ovn-northd-0" Jan 22 14:03:13 crc kubenswrapper[4743]: I0122 14:03:13.263725 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5c926afa-42b3-4fc2-bc38-8ee725cd113b-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"5c926afa-42b3-4fc2-bc38-8ee725cd113b\") " pod="openstack/ovn-northd-0" Jan 22 14:03:13 crc kubenswrapper[4743]: I0122 14:03:13.365440 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c926afa-42b3-4fc2-bc38-8ee725cd113b-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"5c926afa-42b3-4fc2-bc38-8ee725cd113b\") " pod="openstack/ovn-northd-0" Jan 22 14:03:13 crc kubenswrapper[4743]: I0122 14:03:13.365740 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hx295\" (UniqueName: \"kubernetes.io/projected/5c926afa-42b3-4fc2-bc38-8ee725cd113b-kube-api-access-hx295\") pod \"ovn-northd-0\" (UID: \"5c926afa-42b3-4fc2-bc38-8ee725cd113b\") " pod="openstack/ovn-northd-0" Jan 22 14:03:13 crc kubenswrapper[4743]: I0122 14:03:13.365766 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5c926afa-42b3-4fc2-bc38-8ee725cd113b-scripts\") pod \"ovn-northd-0\" (UID: \"5c926afa-42b3-4fc2-bc38-8ee725cd113b\") " pod="openstack/ovn-northd-0" Jan 22 14:03:13 crc kubenswrapper[4743]: I0122 14:03:13.365818 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c926afa-42b3-4fc2-bc38-8ee725cd113b-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"5c926afa-42b3-4fc2-bc38-8ee725cd113b\") " pod="openstack/ovn-northd-0" Jan 22 14:03:13 crc kubenswrapper[4743]: I0122 14:03:13.365839 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c926afa-42b3-4fc2-bc38-8ee725cd113b-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"5c926afa-42b3-4fc2-bc38-8ee725cd113b\") " pod="openstack/ovn-northd-0" Jan 22 14:03:13 crc kubenswrapper[4743]: I0122 14:03:13.365899 4743 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c926afa-42b3-4fc2-bc38-8ee725cd113b-config\") pod \"ovn-northd-0\" (UID: \"5c926afa-42b3-4fc2-bc38-8ee725cd113b\") " pod="openstack/ovn-northd-0" Jan 22 14:03:13 crc kubenswrapper[4743]: I0122 14:03:13.365915 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5c926afa-42b3-4fc2-bc38-8ee725cd113b-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"5c926afa-42b3-4fc2-bc38-8ee725cd113b\") " pod="openstack/ovn-northd-0" Jan 22 14:03:13 crc kubenswrapper[4743]: I0122 14:03:13.366458 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5c926afa-42b3-4fc2-bc38-8ee725cd113b-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"5c926afa-42b3-4fc2-bc38-8ee725cd113b\") " pod="openstack/ovn-northd-0" Jan 22 14:03:13 crc kubenswrapper[4743]: I0122 14:03:13.367343 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c926afa-42b3-4fc2-bc38-8ee725cd113b-config\") pod \"ovn-northd-0\" (UID: \"5c926afa-42b3-4fc2-bc38-8ee725cd113b\") " pod="openstack/ovn-northd-0" Jan 22 14:03:13 crc kubenswrapper[4743]: I0122 14:03:13.367731 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5c926afa-42b3-4fc2-bc38-8ee725cd113b-scripts\") pod \"ovn-northd-0\" (UID: \"5c926afa-42b3-4fc2-bc38-8ee725cd113b\") " pod="openstack/ovn-northd-0" Jan 22 14:03:13 crc kubenswrapper[4743]: I0122 14:03:13.371519 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c926afa-42b3-4fc2-bc38-8ee725cd113b-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"5c926afa-42b3-4fc2-bc38-8ee725cd113b\") " pod="openstack/ovn-northd-0" Jan 22 14:03:13 crc kubenswrapper[4743]: I0122 14:03:13.375957 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c926afa-42b3-4fc2-bc38-8ee725cd113b-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"5c926afa-42b3-4fc2-bc38-8ee725cd113b\") " pod="openstack/ovn-northd-0" Jan 22 14:03:13 crc kubenswrapper[4743]: I0122 14:03:13.391108 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c926afa-42b3-4fc2-bc38-8ee725cd113b-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"5c926afa-42b3-4fc2-bc38-8ee725cd113b\") " pod="openstack/ovn-northd-0" Jan 22 14:03:13 crc kubenswrapper[4743]: I0122 14:03:13.391436 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hx295\" (UniqueName: \"kubernetes.io/projected/5c926afa-42b3-4fc2-bc38-8ee725cd113b-kube-api-access-hx295\") pod \"ovn-northd-0\" (UID: \"5c926afa-42b3-4fc2-bc38-8ee725cd113b\") " pod="openstack/ovn-northd-0" Jan 22 14:03:13 crc kubenswrapper[4743]: I0122 14:03:13.492706 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Jan 22 14:03:13 crc kubenswrapper[4743]: I0122 14:03:13.929753 4743 generic.go:334] "Generic (PLEG): container finished" podID="e41c39cc-ec68-49e7-8144-d58dbccf371b" containerID="ef7bec575cb03a7715870bb9ca0983afe34e8c417201961051b4a36393c800ed" exitCode=0 Jan 22 14:03:13 crc kubenswrapper[4743]: I0122 14:03:13.929879 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-xwzhs" event={"ID":"e41c39cc-ec68-49e7-8144-d58dbccf371b","Type":"ContainerDied","Data":"ef7bec575cb03a7715870bb9ca0983afe34e8c417201961051b4a36393c800ed"} Jan 22 14:03:14 crc kubenswrapper[4743]: I0122 14:03:14.000838 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Jan 22 14:03:14 crc kubenswrapper[4743]: I0122 14:03:14.292945 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-d6mgf" Jan 22 14:03:14 crc kubenswrapper[4743]: I0122 14:03:14.384339 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v58zt\" (UniqueName: \"kubernetes.io/projected/41bb2a83-e99f-4f6c-9c46-10946d57790a-kube-api-access-v58zt\") pod \"41bb2a83-e99f-4f6c-9c46-10946d57790a\" (UID: \"41bb2a83-e99f-4f6c-9c46-10946d57790a\") " Jan 22 14:03:14 crc kubenswrapper[4743]: I0122 14:03:14.384403 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41bb2a83-e99f-4f6c-9c46-10946d57790a-operator-scripts\") pod \"41bb2a83-e99f-4f6c-9c46-10946d57790a\" (UID: \"41bb2a83-e99f-4f6c-9c46-10946d57790a\") " Jan 22 14:03:14 crc kubenswrapper[4743]: I0122 14:03:14.384975 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41bb2a83-e99f-4f6c-9c46-10946d57790a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "41bb2a83-e99f-4f6c-9c46-10946d57790a" (UID: "41bb2a83-e99f-4f6c-9c46-10946d57790a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:03:14 crc kubenswrapper[4743]: I0122 14:03:14.389590 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41bb2a83-e99f-4f6c-9c46-10946d57790a-kube-api-access-v58zt" (OuterVolumeSpecName: "kube-api-access-v58zt") pod "41bb2a83-e99f-4f6c-9c46-10946d57790a" (UID: "41bb2a83-e99f-4f6c-9c46-10946d57790a"). InnerVolumeSpecName "kube-api-access-v58zt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:03:14 crc kubenswrapper[4743]: I0122 14:03:14.496993 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v58zt\" (UniqueName: \"kubernetes.io/projected/41bb2a83-e99f-4f6c-9c46-10946d57790a-kube-api-access-v58zt\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:14 crc kubenswrapper[4743]: I0122 14:03:14.497039 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41bb2a83-e99f-4f6c-9c46-10946d57790a-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:14 crc kubenswrapper[4743]: I0122 14:03:14.562036 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-kvm68" Jan 22 14:03:14 crc kubenswrapper[4743]: I0122 14:03:14.620894 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-4jnjh"] Jan 22 14:03:14 crc kubenswrapper[4743]: I0122 14:03:14.621194 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-4jnjh" podUID="cd315b7b-42c7-4482-b739-5b003bf02430" containerName="dnsmasq-dns" containerID="cri-o://d1a9a9135ea3be885bb0b3b6f71c51751367a01b996f7ba1d642c273cb78ef95" gracePeriod=10 Jan 22 14:03:14 crc kubenswrapper[4743]: I0122 14:03:14.938637 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"5c926afa-42b3-4fc2-bc38-8ee725cd113b","Type":"ContainerStarted","Data":"0360337463a3b423f9a285d69bd849f1b1b167a1c105616c387aa2ec2d59bda3"} Jan 22 14:03:14 crc kubenswrapper[4743]: I0122 14:03:14.940744 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-d6mgf" event={"ID":"41bb2a83-e99f-4f6c-9c46-10946d57790a","Type":"ContainerDied","Data":"7173441eb68f61a2f8d7129f961cba805c1610395bc7a2f3c4b88a9d397ed78f"} Jan 22 14:03:14 crc kubenswrapper[4743]: I0122 14:03:14.940806 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7173441eb68f61a2f8d7129f961cba805c1610395bc7a2f3c4b88a9d397ed78f" Jan 22 14:03:14 crc kubenswrapper[4743]: I0122 14:03:14.940812 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-d6mgf" Jan 22 14:03:15 crc kubenswrapper[4743]: I0122 14:03:15.336760 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-xwzhs" Jan 22 14:03:15 crc kubenswrapper[4743]: I0122 14:03:15.412031 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e41c39cc-ec68-49e7-8144-d58dbccf371b-operator-scripts\") pod \"e41c39cc-ec68-49e7-8144-d58dbccf371b\" (UID: \"e41c39cc-ec68-49e7-8144-d58dbccf371b\") " Jan 22 14:03:15 crc kubenswrapper[4743]: I0122 14:03:15.412089 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cc8cc\" (UniqueName: \"kubernetes.io/projected/e41c39cc-ec68-49e7-8144-d58dbccf371b-kube-api-access-cc8cc\") pod \"e41c39cc-ec68-49e7-8144-d58dbccf371b\" (UID: \"e41c39cc-ec68-49e7-8144-d58dbccf371b\") " Jan 22 14:03:15 crc kubenswrapper[4743]: I0122 14:03:15.412860 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e41c39cc-ec68-49e7-8144-d58dbccf371b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e41c39cc-ec68-49e7-8144-d58dbccf371b" (UID: "e41c39cc-ec68-49e7-8144-d58dbccf371b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:03:15 crc kubenswrapper[4743]: I0122 14:03:15.424369 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e41c39cc-ec68-49e7-8144-d58dbccf371b-kube-api-access-cc8cc" (OuterVolumeSpecName: "kube-api-access-cc8cc") pod "e41c39cc-ec68-49e7-8144-d58dbccf371b" (UID: "e41c39cc-ec68-49e7-8144-d58dbccf371b"). InnerVolumeSpecName "kube-api-access-cc8cc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:03:15 crc kubenswrapper[4743]: I0122 14:03:15.514109 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e41c39cc-ec68-49e7-8144-d58dbccf371b-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:15 crc kubenswrapper[4743]: I0122 14:03:15.514291 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cc8cc\" (UniqueName: \"kubernetes.io/projected/e41c39cc-ec68-49e7-8144-d58dbccf371b-kube-api-access-cc8cc\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:15 crc kubenswrapper[4743]: I0122 14:03:15.952650 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-xwzhs" event={"ID":"e41c39cc-ec68-49e7-8144-d58dbccf371b","Type":"ContainerDied","Data":"dd310b4bb8d0d8cfa808b8a279e2bbb78d2e0f81f87573fb0046df7f57c9b83b"} Jan 22 14:03:15 crc kubenswrapper[4743]: I0122 14:03:15.952689 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd310b4bb8d0d8cfa808b8a279e2bbb78d2e0f81f87573fb0046df7f57c9b83b" Jan 22 14:03:15 crc kubenswrapper[4743]: I0122 14:03:15.952751 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-xwzhs" Jan 22 14:03:15 crc kubenswrapper[4743]: I0122 14:03:15.968319 4743 generic.go:334] "Generic (PLEG): container finished" podID="bf7689c6-7604-44cf-86aa-a317e32537e3" containerID="fd7487ae1682bc8dc72c180745cd112dcf07dd8bb38751a1a686f56520f7e7cb" exitCode=0 Jan 22 14:03:15 crc kubenswrapper[4743]: I0122 14:03:15.968423 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7c57-account-create-update-q694h" event={"ID":"bf7689c6-7604-44cf-86aa-a317e32537e3","Type":"ContainerDied","Data":"fd7487ae1682bc8dc72c180745cd112dcf07dd8bb38751a1a686f56520f7e7cb"} Jan 22 14:03:15 crc kubenswrapper[4743]: I0122 14:03:15.980313 4743 generic.go:334] "Generic (PLEG): container finished" podID="4a3713e3-dd7a-4209-bda1-ce7bcb652e1a" containerID="8322bd5586c64407f7ad3c207ed2fc535445bd76879430605bb66784313ff434" exitCode=0 Jan 22 14:03:15 crc kubenswrapper[4743]: I0122 14:03:15.980446 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-0148-account-create-update-wmj9x" event={"ID":"4a3713e3-dd7a-4209-bda1-ce7bcb652e1a","Type":"ContainerDied","Data":"8322bd5586c64407f7ad3c207ed2fc535445bd76879430605bb66784313ff434"} Jan 22 14:03:15 crc kubenswrapper[4743]: I0122 14:03:15.990372 4743 generic.go:334] "Generic (PLEG): container finished" podID="d4db18a3-97d3-4f11-b1c3-f10626ae1fea" containerID="253189d4f0506a0db062f798443f85cacf745844b491490253e6940bc075bcb2" exitCode=0 Jan 22 14:03:15 crc kubenswrapper[4743]: I0122 14:03:15.990425 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-8dx2h" event={"ID":"d4db18a3-97d3-4f11-b1c3-f10626ae1fea","Type":"ContainerDied","Data":"253189d4f0506a0db062f798443f85cacf745844b491490253e6940bc075bcb2"} Jan 22 14:03:15 crc kubenswrapper[4743]: I0122 14:03:15.992227 4743 generic.go:334] "Generic (PLEG): container finished" podID="cd315b7b-42c7-4482-b739-5b003bf02430" containerID="d1a9a9135ea3be885bb0b3b6f71c51751367a01b996f7ba1d642c273cb78ef95" exitCode=0 Jan 22 14:03:15 crc kubenswrapper[4743]: I0122 14:03:15.992249 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-4jnjh" event={"ID":"cd315b7b-42c7-4482-b739-5b003bf02430","Type":"ContainerDied","Data":"d1a9a9135ea3be885bb0b3b6f71c51751367a01b996f7ba1d642c273cb78ef95"} Jan 22 14:03:17 crc kubenswrapper[4743]: I0122 14:03:17.467828 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-8dx2h" Jan 22 14:03:17 crc kubenswrapper[4743]: I0122 14:03:17.475378 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-4jnjh" Jan 22 14:03:17 crc kubenswrapper[4743]: I0122 14:03:17.485974 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-0148-account-create-update-wmj9x" Jan 22 14:03:17 crc kubenswrapper[4743]: I0122 14:03:17.495309 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7c57-account-create-update-q694h" Jan 22 14:03:17 crc kubenswrapper[4743]: I0122 14:03:17.557976 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrxvt\" (UniqueName: \"kubernetes.io/projected/4a3713e3-dd7a-4209-bda1-ce7bcb652e1a-kube-api-access-wrxvt\") pod \"4a3713e3-dd7a-4209-bda1-ce7bcb652e1a\" (UID: \"4a3713e3-dd7a-4209-bda1-ce7bcb652e1a\") " Jan 22 14:03:17 crc kubenswrapper[4743]: I0122 14:03:17.558078 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cd315b7b-42c7-4482-b739-5b003bf02430-ovsdbserver-nb\") pod \"cd315b7b-42c7-4482-b739-5b003bf02430\" (UID: \"cd315b7b-42c7-4482-b739-5b003bf02430\") " Jan 22 14:03:17 crc kubenswrapper[4743]: I0122 14:03:17.558100 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4db18a3-97d3-4f11-b1c3-f10626ae1fea-operator-scripts\") pod \"d4db18a3-97d3-4f11-b1c3-f10626ae1fea\" (UID: \"d4db18a3-97d3-4f11-b1c3-f10626ae1fea\") " Jan 22 14:03:17 crc kubenswrapper[4743]: I0122 14:03:17.558128 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcm5v\" (UniqueName: \"kubernetes.io/projected/d4db18a3-97d3-4f11-b1c3-f10626ae1fea-kube-api-access-pcm5v\") pod \"d4db18a3-97d3-4f11-b1c3-f10626ae1fea\" (UID: \"d4db18a3-97d3-4f11-b1c3-f10626ae1fea\") " Jan 22 14:03:17 crc kubenswrapper[4743]: I0122 14:03:17.558157 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blkxr\" (UniqueName: \"kubernetes.io/projected/cd315b7b-42c7-4482-b739-5b003bf02430-kube-api-access-blkxr\") pod \"cd315b7b-42c7-4482-b739-5b003bf02430\" (UID: \"cd315b7b-42c7-4482-b739-5b003bf02430\") " Jan 22 14:03:17 crc kubenswrapper[4743]: I0122 14:03:17.558219 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a3713e3-dd7a-4209-bda1-ce7bcb652e1a-operator-scripts\") pod \"4a3713e3-dd7a-4209-bda1-ce7bcb652e1a\" (UID: \"4a3713e3-dd7a-4209-bda1-ce7bcb652e1a\") " Jan 22 14:03:17 crc kubenswrapper[4743]: I0122 14:03:17.558279 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cd315b7b-42c7-4482-b739-5b003bf02430-dns-svc\") pod \"cd315b7b-42c7-4482-b739-5b003bf02430\" (UID: \"cd315b7b-42c7-4482-b739-5b003bf02430\") " Jan 22 14:03:17 crc kubenswrapper[4743]: I0122 14:03:17.558303 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf7689c6-7604-44cf-86aa-a317e32537e3-operator-scripts\") pod \"bf7689c6-7604-44cf-86aa-a317e32537e3\" (UID: \"bf7689c6-7604-44cf-86aa-a317e32537e3\") " Jan 22 14:03:17 crc kubenswrapper[4743]: I0122 14:03:17.558328 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7hlw\" (UniqueName: \"kubernetes.io/projected/bf7689c6-7604-44cf-86aa-a317e32537e3-kube-api-access-h7hlw\") pod \"bf7689c6-7604-44cf-86aa-a317e32537e3\" (UID: \"bf7689c6-7604-44cf-86aa-a317e32537e3\") " Jan 22 14:03:17 crc kubenswrapper[4743]: I0122 14:03:17.558360 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/cd315b7b-42c7-4482-b739-5b003bf02430-config\") pod \"cd315b7b-42c7-4482-b739-5b003bf02430\" (UID: \"cd315b7b-42c7-4482-b739-5b003bf02430\") " Jan 22 14:03:17 crc kubenswrapper[4743]: I0122 14:03:17.558380 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cd315b7b-42c7-4482-b739-5b003bf02430-ovsdbserver-sb\") pod \"cd315b7b-42c7-4482-b739-5b003bf02430\" (UID: \"cd315b7b-42c7-4482-b739-5b003bf02430\") " Jan 22 14:03:17 crc kubenswrapper[4743]: I0122 14:03:17.558764 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4db18a3-97d3-4f11-b1c3-f10626ae1fea-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d4db18a3-97d3-4f11-b1c3-f10626ae1fea" (UID: "d4db18a3-97d3-4f11-b1c3-f10626ae1fea"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:03:17 crc kubenswrapper[4743]: I0122 14:03:17.559117 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a3713e3-dd7a-4209-bda1-ce7bcb652e1a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4a3713e3-dd7a-4209-bda1-ce7bcb652e1a" (UID: "4a3713e3-dd7a-4209-bda1-ce7bcb652e1a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:03:17 crc kubenswrapper[4743]: I0122 14:03:17.559830 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf7689c6-7604-44cf-86aa-a317e32537e3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bf7689c6-7604-44cf-86aa-a317e32537e3" (UID: "bf7689c6-7604-44cf-86aa-a317e32537e3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:03:17 crc kubenswrapper[4743]: I0122 14:03:17.563989 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf7689c6-7604-44cf-86aa-a317e32537e3-kube-api-access-h7hlw" (OuterVolumeSpecName: "kube-api-access-h7hlw") pod "bf7689c6-7604-44cf-86aa-a317e32537e3" (UID: "bf7689c6-7604-44cf-86aa-a317e32537e3"). InnerVolumeSpecName "kube-api-access-h7hlw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:03:17 crc kubenswrapper[4743]: I0122 14:03:17.564041 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4db18a3-97d3-4f11-b1c3-f10626ae1fea-kube-api-access-pcm5v" (OuterVolumeSpecName: "kube-api-access-pcm5v") pod "d4db18a3-97d3-4f11-b1c3-f10626ae1fea" (UID: "d4db18a3-97d3-4f11-b1c3-f10626ae1fea"). InnerVolumeSpecName "kube-api-access-pcm5v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:03:17 crc kubenswrapper[4743]: I0122 14:03:17.564090 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd315b7b-42c7-4482-b739-5b003bf02430-kube-api-access-blkxr" (OuterVolumeSpecName: "kube-api-access-blkxr") pod "cd315b7b-42c7-4482-b739-5b003bf02430" (UID: "cd315b7b-42c7-4482-b739-5b003bf02430"). InnerVolumeSpecName "kube-api-access-blkxr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:03:17 crc kubenswrapper[4743]: I0122 14:03:17.566976 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a3713e3-dd7a-4209-bda1-ce7bcb652e1a-kube-api-access-wrxvt" (OuterVolumeSpecName: "kube-api-access-wrxvt") pod "4a3713e3-dd7a-4209-bda1-ce7bcb652e1a" (UID: "4a3713e3-dd7a-4209-bda1-ce7bcb652e1a"). InnerVolumeSpecName "kube-api-access-wrxvt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:03:17 crc kubenswrapper[4743]: I0122 14:03:17.599040 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd315b7b-42c7-4482-b739-5b003bf02430-config" (OuterVolumeSpecName: "config") pod "cd315b7b-42c7-4482-b739-5b003bf02430" (UID: "cd315b7b-42c7-4482-b739-5b003bf02430"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:03:17 crc kubenswrapper[4743]: I0122 14:03:17.600354 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd315b7b-42c7-4482-b739-5b003bf02430-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cd315b7b-42c7-4482-b739-5b003bf02430" (UID: "cd315b7b-42c7-4482-b739-5b003bf02430"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:03:17 crc kubenswrapper[4743]: I0122 14:03:17.601650 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd315b7b-42c7-4482-b739-5b003bf02430-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cd315b7b-42c7-4482-b739-5b003bf02430" (UID: "cd315b7b-42c7-4482-b739-5b003bf02430"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:03:17 crc kubenswrapper[4743]: I0122 14:03:17.603780 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd315b7b-42c7-4482-b739-5b003bf02430-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cd315b7b-42c7-4482-b739-5b003bf02430" (UID: "cd315b7b-42c7-4482-b739-5b003bf02430"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:03:17 crc kubenswrapper[4743]: I0122 14:03:17.660027 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cd315b7b-42c7-4482-b739-5b003bf02430-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:17 crc kubenswrapper[4743]: I0122 14:03:17.660061 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4db18a3-97d3-4f11-b1c3-f10626ae1fea-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:17 crc kubenswrapper[4743]: I0122 14:03:17.660070 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcm5v\" (UniqueName: \"kubernetes.io/projected/d4db18a3-97d3-4f11-b1c3-f10626ae1fea-kube-api-access-pcm5v\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:17 crc kubenswrapper[4743]: I0122 14:03:17.660081 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-blkxr\" (UniqueName: \"kubernetes.io/projected/cd315b7b-42c7-4482-b739-5b003bf02430-kube-api-access-blkxr\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:17 crc kubenswrapper[4743]: I0122 14:03:17.660090 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a3713e3-dd7a-4209-bda1-ce7bcb652e1a-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:17 crc kubenswrapper[4743]: I0122 14:03:17.660098 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cd315b7b-42c7-4482-b739-5b003bf02430-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:17 crc kubenswrapper[4743]: I0122 14:03:17.660106 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf7689c6-7604-44cf-86aa-a317e32537e3-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:17 crc kubenswrapper[4743]: I0122 14:03:17.660115 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h7hlw\" (UniqueName: \"kubernetes.io/projected/bf7689c6-7604-44cf-86aa-a317e32537e3-kube-api-access-h7hlw\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:17 crc kubenswrapper[4743]: I0122 14:03:17.660123 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd315b7b-42c7-4482-b739-5b003bf02430-config\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:17 crc kubenswrapper[4743]: I0122 14:03:17.660131 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cd315b7b-42c7-4482-b739-5b003bf02430-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:17 crc kubenswrapper[4743]: I0122 14:03:17.660140 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrxvt\" (UniqueName: \"kubernetes.io/projected/4a3713e3-dd7a-4209-bda1-ce7bcb652e1a-kube-api-access-wrxvt\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:18 crc kubenswrapper[4743]: I0122 14:03:18.015993 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7c57-account-create-update-q694h" event={"ID":"bf7689c6-7604-44cf-86aa-a317e32537e3","Type":"ContainerDied","Data":"3d787f790fbd980ae8c049bab60db0ae7642b959c921efb95961a8a77a60559f"} Jan 22 14:03:18 crc kubenswrapper[4743]: I0122 14:03:18.016093 4743 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="3d787f790fbd980ae8c049bab60db0ae7642b959c921efb95961a8a77a60559f" Jan 22 14:03:18 crc kubenswrapper[4743]: I0122 14:03:18.016187 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7c57-account-create-update-q694h" Jan 22 14:03:18 crc kubenswrapper[4743]: I0122 14:03:18.018546 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-0148-account-create-update-wmj9x" event={"ID":"4a3713e3-dd7a-4209-bda1-ce7bcb652e1a","Type":"ContainerDied","Data":"be2b60f2a409fb014b383e93e22f8f26fb76f03dc25f9ce701d90eaaaefd42f2"} Jan 22 14:03:18 crc kubenswrapper[4743]: I0122 14:03:18.018577 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be2b60f2a409fb014b383e93e22f8f26fb76f03dc25f9ce701d90eaaaefd42f2" Jan 22 14:03:18 crc kubenswrapper[4743]: I0122 14:03:18.019928 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-0148-account-create-update-wmj9x" Jan 22 14:03:18 crc kubenswrapper[4743]: I0122 14:03:18.025951 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-8dx2h" event={"ID":"d4db18a3-97d3-4f11-b1c3-f10626ae1fea","Type":"ContainerDied","Data":"6db4014a44a539ed55b72154939456ea74ce12fa36111eb119b4607f06eded7e"} Jan 22 14:03:18 crc kubenswrapper[4743]: I0122 14:03:18.026176 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6db4014a44a539ed55b72154939456ea74ce12fa36111eb119b4607f06eded7e" Jan 22 14:03:18 crc kubenswrapper[4743]: I0122 14:03:18.026312 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-8dx2h" Jan 22 14:03:18 crc kubenswrapper[4743]: I0122 14:03:18.033289 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-4jnjh" event={"ID":"cd315b7b-42c7-4482-b739-5b003bf02430","Type":"ContainerDied","Data":"5a48bc21465d71d0046f3135194b17a064565b561a52002dd1af522e2d411487"} Jan 22 14:03:18 crc kubenswrapper[4743]: I0122 14:03:18.033722 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-4jnjh" Jan 22 14:03:18 crc kubenswrapper[4743]: I0122 14:03:18.033769 4743 scope.go:117] "RemoveContainer" containerID="d1a9a9135ea3be885bb0b3b6f71c51751367a01b996f7ba1d642c273cb78ef95" Jan 22 14:03:18 crc kubenswrapper[4743]: I0122 14:03:18.061374 4743 scope.go:117] "RemoveContainer" containerID="51c01670deecbf0586c0c1c9cfbd272866f29288682883c2ba2dcd476ac9cc0a" Jan 22 14:03:18 crc kubenswrapper[4743]: I0122 14:03:18.062699 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-4jnjh"] Jan 22 14:03:18 crc kubenswrapper[4743]: I0122 14:03:18.070064 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-4jnjh"] Jan 22 14:03:18 crc kubenswrapper[4743]: I0122 14:03:18.680570 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-d6mgf"] Jan 22 14:03:18 crc kubenswrapper[4743]: I0122 14:03:18.687105 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-d6mgf"] Jan 22 14:03:18 crc kubenswrapper[4743]: I0122 14:03:18.790369 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-pls6v"] Jan 22 14:03:18 crc kubenswrapper[4743]: E0122 14:03:18.790835 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e41c39cc-ec68-49e7-8144-d58dbccf371b" containerName="mariadb-database-create" Jan 22 14:03:18 crc kubenswrapper[4743]: I0122 14:03:18.790916 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e41c39cc-ec68-49e7-8144-d58dbccf371b" containerName="mariadb-database-create" Jan 22 14:03:18 crc kubenswrapper[4743]: E0122 14:03:18.790991 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd315b7b-42c7-4482-b739-5b003bf02430" containerName="dnsmasq-dns" Jan 22 14:03:18 crc kubenswrapper[4743]: I0122 14:03:18.791052 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd315b7b-42c7-4482-b739-5b003bf02430" containerName="dnsmasq-dns" Jan 22 14:03:18 crc kubenswrapper[4743]: E0122 14:03:18.791113 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a3713e3-dd7a-4209-bda1-ce7bcb652e1a" containerName="mariadb-account-create-update" Jan 22 14:03:18 crc kubenswrapper[4743]: I0122 14:03:18.791169 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a3713e3-dd7a-4209-bda1-ce7bcb652e1a" containerName="mariadb-account-create-update" Jan 22 14:03:18 crc kubenswrapper[4743]: E0122 14:03:18.791243 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd315b7b-42c7-4482-b739-5b003bf02430" containerName="init" Jan 22 14:03:18 crc kubenswrapper[4743]: I0122 14:03:18.791308 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd315b7b-42c7-4482-b739-5b003bf02430" containerName="init" Jan 22 14:03:18 crc kubenswrapper[4743]: E0122 14:03:18.791363 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf7689c6-7604-44cf-86aa-a317e32537e3" containerName="mariadb-account-create-update" Jan 22 14:03:18 crc kubenswrapper[4743]: I0122 14:03:18.791417 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf7689c6-7604-44cf-86aa-a317e32537e3" containerName="mariadb-account-create-update" Jan 22 14:03:18 crc kubenswrapper[4743]: E0122 14:03:18.791477 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41bb2a83-e99f-4f6c-9c46-10946d57790a" containerName="mariadb-account-create-update" Jan 22 14:03:18 crc kubenswrapper[4743]: I0122 
14:03:18.791532 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="41bb2a83-e99f-4f6c-9c46-10946d57790a" containerName="mariadb-account-create-update" Jan 22 14:03:18 crc kubenswrapper[4743]: E0122 14:03:18.791590 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4db18a3-97d3-4f11-b1c3-f10626ae1fea" containerName="mariadb-database-create" Jan 22 14:03:18 crc kubenswrapper[4743]: I0122 14:03:18.791768 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4db18a3-97d3-4f11-b1c3-f10626ae1fea" containerName="mariadb-database-create" Jan 22 14:03:18 crc kubenswrapper[4743]: I0122 14:03:18.792039 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd315b7b-42c7-4482-b739-5b003bf02430" containerName="dnsmasq-dns" Jan 22 14:03:18 crc kubenswrapper[4743]: I0122 14:03:18.792126 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a3713e3-dd7a-4209-bda1-ce7bcb652e1a" containerName="mariadb-account-create-update" Jan 22 14:03:18 crc kubenswrapper[4743]: I0122 14:03:18.792184 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="e41c39cc-ec68-49e7-8144-d58dbccf371b" containerName="mariadb-database-create" Jan 22 14:03:18 crc kubenswrapper[4743]: I0122 14:03:18.792245 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4db18a3-97d3-4f11-b1c3-f10626ae1fea" containerName="mariadb-database-create" Jan 22 14:03:18 crc kubenswrapper[4743]: I0122 14:03:18.792305 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf7689c6-7604-44cf-86aa-a317e32537e3" containerName="mariadb-account-create-update" Jan 22 14:03:18 crc kubenswrapper[4743]: I0122 14:03:18.792367 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="41bb2a83-e99f-4f6c-9c46-10946d57790a" containerName="mariadb-account-create-update" Jan 22 14:03:18 crc kubenswrapper[4743]: I0122 14:03:18.793056 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-pls6v" Jan 22 14:03:18 crc kubenswrapper[4743]: I0122 14:03:18.793449 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-pls6v"] Jan 22 14:03:18 crc kubenswrapper[4743]: I0122 14:03:18.805149 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Jan 22 14:03:18 crc kubenswrapper[4743]: I0122 14:03:18.830140 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2qt5\" (UniqueName: \"kubernetes.io/projected/a0cba5b6-ee0a-46d1-8a11-d3d841aa820c-kube-api-access-g2qt5\") pod \"root-account-create-update-pls6v\" (UID: \"a0cba5b6-ee0a-46d1-8a11-d3d841aa820c\") " pod="openstack/root-account-create-update-pls6v" Jan 22 14:03:18 crc kubenswrapper[4743]: I0122 14:03:18.830225 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0cba5b6-ee0a-46d1-8a11-d3d841aa820c-operator-scripts\") pod \"root-account-create-update-pls6v\" (UID: \"a0cba5b6-ee0a-46d1-8a11-d3d841aa820c\") " pod="openstack/root-account-create-update-pls6v" Jan 22 14:03:18 crc kubenswrapper[4743]: I0122 14:03:18.895823 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Jan 22 14:03:18 crc kubenswrapper[4743]: I0122 14:03:18.933675 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2qt5\" (UniqueName: \"kubernetes.io/projected/a0cba5b6-ee0a-46d1-8a11-d3d841aa820c-kube-api-access-g2qt5\") pod \"root-account-create-update-pls6v\" (UID: \"a0cba5b6-ee0a-46d1-8a11-d3d841aa820c\") " pod="openstack/root-account-create-update-pls6v" Jan 22 14:03:18 crc kubenswrapper[4743]: I0122 14:03:18.933832 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0cba5b6-ee0a-46d1-8a11-d3d841aa820c-operator-scripts\") pod \"root-account-create-update-pls6v\" (UID: \"a0cba5b6-ee0a-46d1-8a11-d3d841aa820c\") " pod="openstack/root-account-create-update-pls6v" Jan 22 14:03:18 crc kubenswrapper[4743]: I0122 14:03:18.934746 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0cba5b6-ee0a-46d1-8a11-d3d841aa820c-operator-scripts\") pod \"root-account-create-update-pls6v\" (UID: \"a0cba5b6-ee0a-46d1-8a11-d3d841aa820c\") " pod="openstack/root-account-create-update-pls6v" Jan 22 14:03:18 crc kubenswrapper[4743]: I0122 14:03:18.962423 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2qt5\" (UniqueName: \"kubernetes.io/projected/a0cba5b6-ee0a-46d1-8a11-d3d841aa820c-kube-api-access-g2qt5\") pod \"root-account-create-update-pls6v\" (UID: \"a0cba5b6-ee0a-46d1-8a11-d3d841aa820c\") " pod="openstack/root-account-create-update-pls6v" Jan 22 14:03:18 crc kubenswrapper[4743]: I0122 14:03:18.969112 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:03:19 crc kubenswrapper[4743]: I0122 14:03:19.055704 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"5c926afa-42b3-4fc2-bc38-8ee725cd113b","Type":"ContainerStarted","Data":"93476a51d19792561492cb7aa11fd4ed07ca84deae6b04cdca382afcacc9dffc"} Jan 22 14:03:19 crc 
kubenswrapper[4743]: I0122 14:03:19.075593 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-kg2fn" event={"ID":"56dff5fb-e22c-4045-b3c4-c75e018df046","Type":"ContainerStarted","Data":"3a8590dd76f4486817876fbdd0d018ab916f47d8d9fa6c9e0ea2631339a07c71"} Jan 22 14:03:19 crc kubenswrapper[4743]: I0122 14:03:19.100638 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-kg2fn" podStartSLOduration=1.967838784 podStartE2EDuration="10.100621045s" podCreationTimestamp="2026-01-22 14:03:09 +0000 UTC" firstStartedPulling="2026-01-22 14:03:09.842379363 +0000 UTC m=+1026.397422526" lastFinishedPulling="2026-01-22 14:03:17.975161624 +0000 UTC m=+1034.530204787" observedRunningTime="2026-01-22 14:03:19.098310152 +0000 UTC m=+1035.653353315" watchObservedRunningTime="2026-01-22 14:03:19.100621045 +0000 UTC m=+1035.655664208" Jan 22 14:03:19 crc kubenswrapper[4743]: I0122 14:03:19.147371 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-pls6v" Jan 22 14:03:19 crc kubenswrapper[4743]: I0122 14:03:19.210009 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-5zstz"] Jan 22 14:03:19 crc kubenswrapper[4743]: I0122 14:03:19.211280 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-5zstz" Jan 22 14:03:19 crc kubenswrapper[4743]: I0122 14:03:19.227813 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-5zstz"] Jan 22 14:03:19 crc kubenswrapper[4743]: I0122 14:03:19.271260 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-1f94-account-create-update-2br2j"] Jan 22 14:03:19 crc kubenswrapper[4743]: I0122 14:03:19.272506 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-1f94-account-create-update-2br2j" Jan 22 14:03:19 crc kubenswrapper[4743]: I0122 14:03:19.276483 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Jan 22 14:03:19 crc kubenswrapper[4743]: I0122 14:03:19.279297 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-1f94-account-create-update-2br2j"] Jan 22 14:03:19 crc kubenswrapper[4743]: I0122 14:03:19.344421 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr4n5\" (UniqueName: \"kubernetes.io/projected/e2f9f2d7-4dc3-45e2-98a0-1058cb22cf46-kube-api-access-rr4n5\") pod \"barbican-db-create-5zstz\" (UID: \"e2f9f2d7-4dc3-45e2-98a0-1058cb22cf46\") " pod="openstack/barbican-db-create-5zstz" Jan 22 14:03:19 crc kubenswrapper[4743]: I0122 14:03:19.344706 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e2f9f2d7-4dc3-45e2-98a0-1058cb22cf46-operator-scripts\") pod \"barbican-db-create-5zstz\" (UID: \"e2f9f2d7-4dc3-45e2-98a0-1058cb22cf46\") " pod="openstack/barbican-db-create-5zstz" Jan 22 14:03:19 crc kubenswrapper[4743]: I0122 14:03:19.418187 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-8xtkr"] Jan 22 14:03:19 crc kubenswrapper[4743]: I0122 14:03:19.419283 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-8xtkr" Jan 22 14:03:19 crc kubenswrapper[4743]: I0122 14:03:19.438093 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-8xtkr"] Jan 22 14:03:19 crc kubenswrapper[4743]: I0122 14:03:19.442516 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-12cb-account-create-update-c67vm"] Jan 22 14:03:19 crc kubenswrapper[4743]: I0122 14:03:19.446710 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-12cb-account-create-update-c67vm" Jan 22 14:03:19 crc kubenswrapper[4743]: I0122 14:03:19.446884 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a0a522c-1c7c-4b39-94bf-d741ea969082-operator-scripts\") pod \"barbican-1f94-account-create-update-2br2j\" (UID: \"7a0a522c-1c7c-4b39-94bf-d741ea969082\") " pod="openstack/barbican-1f94-account-create-update-2br2j" Jan 22 14:03:19 crc kubenswrapper[4743]: I0122 14:03:19.449390 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lz5m9\" (UniqueName: \"kubernetes.io/projected/7a0a522c-1c7c-4b39-94bf-d741ea969082-kube-api-access-lz5m9\") pod \"barbican-1f94-account-create-update-2br2j\" (UID: \"7a0a522c-1c7c-4b39-94bf-d741ea969082\") " pod="openstack/barbican-1f94-account-create-update-2br2j" Jan 22 14:03:19 crc kubenswrapper[4743]: I0122 14:03:19.449574 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rr4n5\" (UniqueName: \"kubernetes.io/projected/e2f9f2d7-4dc3-45e2-98a0-1058cb22cf46-kube-api-access-rr4n5\") pod \"barbican-db-create-5zstz\" (UID: \"e2f9f2d7-4dc3-45e2-98a0-1058cb22cf46\") " pod="openstack/barbican-db-create-5zstz" Jan 22 14:03:19 crc kubenswrapper[4743]: I0122 14:03:19.449703 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e2f9f2d7-4dc3-45e2-98a0-1058cb22cf46-operator-scripts\") pod \"barbican-db-create-5zstz\" (UID: \"e2f9f2d7-4dc3-45e2-98a0-1058cb22cf46\") " pod="openstack/barbican-db-create-5zstz" Jan 22 14:03:19 crc kubenswrapper[4743]: I0122 14:03:19.450656 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e2f9f2d7-4dc3-45e2-98a0-1058cb22cf46-operator-scripts\") pod \"barbican-db-create-5zstz\" (UID: \"e2f9f2d7-4dc3-45e2-98a0-1058cb22cf46\") " pod="openstack/barbican-db-create-5zstz" Jan 22 14:03:19 crc kubenswrapper[4743]: I0122 14:03:19.460141 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Jan 22 14:03:19 crc kubenswrapper[4743]: I0122 14:03:19.483976 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-12cb-account-create-update-c67vm"] Jan 22 14:03:19 crc kubenswrapper[4743]: I0122 14:03:19.545642 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr4n5\" (UniqueName: \"kubernetes.io/projected/e2f9f2d7-4dc3-45e2-98a0-1058cb22cf46-kube-api-access-rr4n5\") pod \"barbican-db-create-5zstz\" (UID: \"e2f9f2d7-4dc3-45e2-98a0-1058cb22cf46\") " pod="openstack/barbican-db-create-5zstz" Jan 22 14:03:19 crc kubenswrapper[4743]: I0122 14:03:19.552588 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-5zstz" Jan 22 14:03:19 crc kubenswrapper[4743]: I0122 14:03:19.555803 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wth2c\" (UniqueName: \"kubernetes.io/projected/c806e874-b77d-4da6-8608-cbfac8bde50a-kube-api-access-wth2c\") pod \"cinder-db-create-8xtkr\" (UID: \"c806e874-b77d-4da6-8608-cbfac8bde50a\") " pod="openstack/cinder-db-create-8xtkr" Jan 22 14:03:19 crc kubenswrapper[4743]: I0122 14:03:19.555904 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkqfz\" (UniqueName: \"kubernetes.io/projected/dd89c7d8-3ba1-4fc1-95cd-0b0faffa30a9-kube-api-access-vkqfz\") pod \"cinder-12cb-account-create-update-c67vm\" (UID: \"dd89c7d8-3ba1-4fc1-95cd-0b0faffa30a9\") " pod="openstack/cinder-12cb-account-create-update-c67vm" Jan 22 14:03:19 crc kubenswrapper[4743]: I0122 14:03:19.555950 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lz5m9\" (UniqueName: \"kubernetes.io/projected/7a0a522c-1c7c-4b39-94bf-d741ea969082-kube-api-access-lz5m9\") pod \"barbican-1f94-account-create-update-2br2j\" (UID: \"7a0a522c-1c7c-4b39-94bf-d741ea969082\") " pod="openstack/barbican-1f94-account-create-update-2br2j" Jan 22 14:03:19 crc kubenswrapper[4743]: I0122 14:03:19.555997 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd89c7d8-3ba1-4fc1-95cd-0b0faffa30a9-operator-scripts\") pod \"cinder-12cb-account-create-update-c67vm\" (UID: \"dd89c7d8-3ba1-4fc1-95cd-0b0faffa30a9\") " pod="openstack/cinder-12cb-account-create-update-c67vm" Jan 22 14:03:19 crc kubenswrapper[4743]: I0122 14:03:19.556202 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a0a522c-1c7c-4b39-94bf-d741ea969082-operator-scripts\") pod \"barbican-1f94-account-create-update-2br2j\" (UID: \"7a0a522c-1c7c-4b39-94bf-d741ea969082\") " pod="openstack/barbican-1f94-account-create-update-2br2j" Jan 22 14:03:19 crc kubenswrapper[4743]: I0122 14:03:19.556274 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c806e874-b77d-4da6-8608-cbfac8bde50a-operator-scripts\") pod \"cinder-db-create-8xtkr\" (UID: \"c806e874-b77d-4da6-8608-cbfac8bde50a\") " pod="openstack/cinder-db-create-8xtkr" Jan 22 14:03:19 crc kubenswrapper[4743]: I0122 14:03:19.556428 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-dbz7j"] Jan 22 14:03:19 crc kubenswrapper[4743]: I0122 14:03:19.557468 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-dbz7j" Jan 22 14:03:19 crc kubenswrapper[4743]: I0122 14:03:19.558124 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a0a522c-1c7c-4b39-94bf-d741ea969082-operator-scripts\") pod \"barbican-1f94-account-create-update-2br2j\" (UID: \"7a0a522c-1c7c-4b39-94bf-d741ea969082\") " pod="openstack/barbican-1f94-account-create-update-2br2j" Jan 22 14:03:19 crc kubenswrapper[4743]: I0122 14:03:19.568597 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-dbz7j"] Jan 22 14:03:19 crc kubenswrapper[4743]: I0122 14:03:19.589016 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lz5m9\" (UniqueName: \"kubernetes.io/projected/7a0a522c-1c7c-4b39-94bf-d741ea969082-kube-api-access-lz5m9\") pod \"barbican-1f94-account-create-update-2br2j\" (UID: \"7a0a522c-1c7c-4b39-94bf-d741ea969082\") " pod="openstack/barbican-1f94-account-create-update-2br2j" Jan 22 14:03:19 crc kubenswrapper[4743]: I0122 14:03:19.596325 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-1f94-account-create-update-2br2j" Jan 22 14:03:19 crc kubenswrapper[4743]: I0122 14:03:19.658804 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c806e874-b77d-4da6-8608-cbfac8bde50a-operator-scripts\") pod \"cinder-db-create-8xtkr\" (UID: \"c806e874-b77d-4da6-8608-cbfac8bde50a\") " pod="openstack/cinder-db-create-8xtkr" Jan 22 14:03:19 crc kubenswrapper[4743]: I0122 14:03:19.658857 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wth2c\" (UniqueName: \"kubernetes.io/projected/c806e874-b77d-4da6-8608-cbfac8bde50a-kube-api-access-wth2c\") pod \"cinder-db-create-8xtkr\" (UID: \"c806e874-b77d-4da6-8608-cbfac8bde50a\") " pod="openstack/cinder-db-create-8xtkr" Jan 22 14:03:19 crc kubenswrapper[4743]: I0122 14:03:19.658877 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzjxn\" (UniqueName: \"kubernetes.io/projected/94d531c9-e092-442a-8c4b-044fcf12ac9e-kube-api-access-qzjxn\") pod \"neutron-db-create-dbz7j\" (UID: \"94d531c9-e092-442a-8c4b-044fcf12ac9e\") " pod="openstack/neutron-db-create-dbz7j" Jan 22 14:03:19 crc kubenswrapper[4743]: I0122 14:03:19.658917 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkqfz\" (UniqueName: \"kubernetes.io/projected/dd89c7d8-3ba1-4fc1-95cd-0b0faffa30a9-kube-api-access-vkqfz\") pod \"cinder-12cb-account-create-update-c67vm\" (UID: \"dd89c7d8-3ba1-4fc1-95cd-0b0faffa30a9\") " pod="openstack/cinder-12cb-account-create-update-c67vm" Jan 22 14:03:19 crc kubenswrapper[4743]: I0122 14:03:19.658957 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94d531c9-e092-442a-8c4b-044fcf12ac9e-operator-scripts\") pod \"neutron-db-create-dbz7j\" (UID: \"94d531c9-e092-442a-8c4b-044fcf12ac9e\") " pod="openstack/neutron-db-create-dbz7j" Jan 22 14:03:19 crc kubenswrapper[4743]: I0122 14:03:19.658977 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd89c7d8-3ba1-4fc1-95cd-0b0faffa30a9-operator-scripts\") pod 
\"cinder-12cb-account-create-update-c67vm\" (UID: \"dd89c7d8-3ba1-4fc1-95cd-0b0faffa30a9\") " pod="openstack/cinder-12cb-account-create-update-c67vm" Jan 22 14:03:19 crc kubenswrapper[4743]: I0122 14:03:19.660013 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd89c7d8-3ba1-4fc1-95cd-0b0faffa30a9-operator-scripts\") pod \"cinder-12cb-account-create-update-c67vm\" (UID: \"dd89c7d8-3ba1-4fc1-95cd-0b0faffa30a9\") " pod="openstack/cinder-12cb-account-create-update-c67vm" Jan 22 14:03:19 crc kubenswrapper[4743]: I0122 14:03:19.660500 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c806e874-b77d-4da6-8608-cbfac8bde50a-operator-scripts\") pod \"cinder-db-create-8xtkr\" (UID: \"c806e874-b77d-4da6-8608-cbfac8bde50a\") " pod="openstack/cinder-db-create-8xtkr" Jan 22 14:03:19 crc kubenswrapper[4743]: I0122 14:03:19.662636 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-a292-account-create-update-zfn7t"] Jan 22 14:03:19 crc kubenswrapper[4743]: I0122 14:03:19.663691 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-a292-account-create-update-zfn7t" Jan 22 14:03:19 crc kubenswrapper[4743]: I0122 14:03:19.666358 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Jan 22 14:03:19 crc kubenswrapper[4743]: I0122 14:03:19.674613 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-a292-account-create-update-zfn7t"] Jan 22 14:03:19 crc kubenswrapper[4743]: I0122 14:03:19.679351 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkqfz\" (UniqueName: \"kubernetes.io/projected/dd89c7d8-3ba1-4fc1-95cd-0b0faffa30a9-kube-api-access-vkqfz\") pod \"cinder-12cb-account-create-update-c67vm\" (UID: \"dd89c7d8-3ba1-4fc1-95cd-0b0faffa30a9\") " pod="openstack/cinder-12cb-account-create-update-c67vm" Jan 22 14:03:19 crc kubenswrapper[4743]: I0122 14:03:19.683724 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wth2c\" (UniqueName: \"kubernetes.io/projected/c806e874-b77d-4da6-8608-cbfac8bde50a-kube-api-access-wth2c\") pod \"cinder-db-create-8xtkr\" (UID: \"c806e874-b77d-4da6-8608-cbfac8bde50a\") " pod="openstack/cinder-db-create-8xtkr" Jan 22 14:03:19 crc kubenswrapper[4743]: I0122 14:03:19.736550 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-pls6v"] Jan 22 14:03:19 crc kubenswrapper[4743]: I0122 14:03:19.753206 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-8xtkr" Jan 22 14:03:19 crc kubenswrapper[4743]: I0122 14:03:19.760897 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94d531c9-e092-442a-8c4b-044fcf12ac9e-operator-scripts\") pod \"neutron-db-create-dbz7j\" (UID: \"94d531c9-e092-442a-8c4b-044fcf12ac9e\") " pod="openstack/neutron-db-create-dbz7j" Jan 22 14:03:19 crc kubenswrapper[4743]: I0122 14:03:19.761029 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/668500b3-5335-4fd9-992b-0b0111284379-operator-scripts\") pod \"neutron-a292-account-create-update-zfn7t\" (UID: \"668500b3-5335-4fd9-992b-0b0111284379\") " pod="openstack/neutron-a292-account-create-update-zfn7t" Jan 22 14:03:19 crc kubenswrapper[4743]: I0122 14:03:19.761121 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrhx4\" (UniqueName: \"kubernetes.io/projected/668500b3-5335-4fd9-992b-0b0111284379-kube-api-access-qrhx4\") pod \"neutron-a292-account-create-update-zfn7t\" (UID: \"668500b3-5335-4fd9-992b-0b0111284379\") " pod="openstack/neutron-a292-account-create-update-zfn7t" Jan 22 14:03:19 crc kubenswrapper[4743]: I0122 14:03:19.761263 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzjxn\" (UniqueName: \"kubernetes.io/projected/94d531c9-e092-442a-8c4b-044fcf12ac9e-kube-api-access-qzjxn\") pod \"neutron-db-create-dbz7j\" (UID: \"94d531c9-e092-442a-8c4b-044fcf12ac9e\") " pod="openstack/neutron-db-create-dbz7j" Jan 22 14:03:19 crc kubenswrapper[4743]: I0122 14:03:19.762416 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94d531c9-e092-442a-8c4b-044fcf12ac9e-operator-scripts\") pod \"neutron-db-create-dbz7j\" (UID: \"94d531c9-e092-442a-8c4b-044fcf12ac9e\") " pod="openstack/neutron-db-create-dbz7j" Jan 22 14:03:19 crc kubenswrapper[4743]: I0122 14:03:19.775102 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-12cb-account-create-update-c67vm" Jan 22 14:03:19 crc kubenswrapper[4743]: I0122 14:03:19.794064 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41bb2a83-e99f-4f6c-9c46-10946d57790a" path="/var/lib/kubelet/pods/41bb2a83-e99f-4f6c-9c46-10946d57790a/volumes" Jan 22 14:03:19 crc kubenswrapper[4743]: I0122 14:03:19.794688 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd315b7b-42c7-4482-b739-5b003bf02430" path="/var/lib/kubelet/pods/cd315b7b-42c7-4482-b739-5b003bf02430/volumes" Jan 22 14:03:19 crc kubenswrapper[4743]: I0122 14:03:19.803410 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzjxn\" (UniqueName: \"kubernetes.io/projected/94d531c9-e092-442a-8c4b-044fcf12ac9e-kube-api-access-qzjxn\") pod \"neutron-db-create-dbz7j\" (UID: \"94d531c9-e092-442a-8c4b-044fcf12ac9e\") " pod="openstack/neutron-db-create-dbz7j" Jan 22 14:03:19 crc kubenswrapper[4743]: I0122 14:03:19.866753 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/668500b3-5335-4fd9-992b-0b0111284379-operator-scripts\") pod \"neutron-a292-account-create-update-zfn7t\" (UID: \"668500b3-5335-4fd9-992b-0b0111284379\") " pod="openstack/neutron-a292-account-create-update-zfn7t" Jan 22 14:03:19 crc kubenswrapper[4743]: I0122 14:03:19.867343 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrhx4\" (UniqueName: \"kubernetes.io/projected/668500b3-5335-4fd9-992b-0b0111284379-kube-api-access-qrhx4\") pod \"neutron-a292-account-create-update-zfn7t\" (UID: \"668500b3-5335-4fd9-992b-0b0111284379\") " pod="openstack/neutron-a292-account-create-update-zfn7t" Jan 22 14:03:19 crc kubenswrapper[4743]: I0122 14:03:19.867467 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/668500b3-5335-4fd9-992b-0b0111284379-operator-scripts\") pod \"neutron-a292-account-create-update-zfn7t\" (UID: \"668500b3-5335-4fd9-992b-0b0111284379\") " pod="openstack/neutron-a292-account-create-update-zfn7t" Jan 22 14:03:19 crc kubenswrapper[4743]: I0122 14:03:19.891120 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrhx4\" (UniqueName: \"kubernetes.io/projected/668500b3-5335-4fd9-992b-0b0111284379-kube-api-access-qrhx4\") pod \"neutron-a292-account-create-update-zfn7t\" (UID: \"668500b3-5335-4fd9-992b-0b0111284379\") " pod="openstack/neutron-a292-account-create-update-zfn7t" Jan 22 14:03:20 crc kubenswrapper[4743]: I0122 14:03:20.015680 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-dbz7j" Jan 22 14:03:20 crc kubenswrapper[4743]: I0122 14:03:20.031493 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-a292-account-create-update-zfn7t" Jan 22 14:03:20 crc kubenswrapper[4743]: I0122 14:03:20.100374 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"5c926afa-42b3-4fc2-bc38-8ee725cd113b","Type":"ContainerStarted","Data":"26971c6b08163edb629523383bfd582f3ff436786414d01aea077c7a0592ca55"} Jan 22 14:03:20 crc kubenswrapper[4743]: I0122 14:03:20.100959 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Jan 22 14:03:20 crc kubenswrapper[4743]: I0122 14:03:20.105516 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-pls6v" event={"ID":"a0cba5b6-ee0a-46d1-8a11-d3d841aa820c","Type":"ContainerStarted","Data":"731fbfa7cd91f9bb40b16b67aabb061a127fac764176c44112eefde91d0003e5"} Jan 22 14:03:20 crc kubenswrapper[4743]: I0122 14:03:20.105554 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-pls6v" event={"ID":"a0cba5b6-ee0a-46d1-8a11-d3d841aa820c","Type":"ContainerStarted","Data":"9dc9a61da3ba506fe53814cad3f9e90b22046971114d7c87d9fd822954d93431"} Jan 22 14:03:20 crc kubenswrapper[4743]: I0122 14:03:20.121453 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-5zstz"] Jan 22 14:03:20 crc kubenswrapper[4743]: I0122 14:03:20.123798 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.439752195 podStartE2EDuration="7.12377505s" podCreationTimestamp="2026-01-22 14:03:13 +0000 UTC" firstStartedPulling="2026-01-22 14:03:14.013433726 +0000 UTC m=+1030.568476889" lastFinishedPulling="2026-01-22 14:03:18.697456581 +0000 UTC m=+1035.252499744" observedRunningTime="2026-01-22 14:03:20.120689857 +0000 UTC m=+1036.675733030" watchObservedRunningTime="2026-01-22 14:03:20.12377505 +0000 UTC m=+1036.678818213" Jan 22 14:03:20 crc kubenswrapper[4743]: I0122 14:03:20.145015 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-pls6v" podStartSLOduration=2.144996213 podStartE2EDuration="2.144996213s" podCreationTimestamp="2026-01-22 14:03:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:03:20.144188262 +0000 UTC m=+1036.699231425" watchObservedRunningTime="2026-01-22 14:03:20.144996213 +0000 UTC m=+1036.700039366" Jan 22 14:03:20 crc kubenswrapper[4743]: W0122 14:03:20.148195 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2f9f2d7_4dc3_45e2_98a0_1058cb22cf46.slice/crio-d01e30e5b932496584647926a27be342ce47c22ae1675c0e875cbf8a9b7866aa WatchSource:0}: Error finding container d01e30e5b932496584647926a27be342ce47c22ae1675c0e875cbf8a9b7866aa: Status 404 returned error can't find the container with id d01e30e5b932496584647926a27be342ce47c22ae1675c0e875cbf8a9b7866aa Jan 22 14:03:20 crc kubenswrapper[4743]: I0122 14:03:20.217717 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-1f94-account-create-update-2br2j"] Jan 22 14:03:20 crc kubenswrapper[4743]: I0122 14:03:20.337584 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-12cb-account-create-update-c67vm"] Jan 22 14:03:20 crc kubenswrapper[4743]: I0122 14:03:20.353895 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/cinder-db-create-8xtkr"] Jan 22 14:03:20 crc kubenswrapper[4743]: W0122 14:03:20.379915 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc806e874_b77d_4da6_8608_cbfac8bde50a.slice/crio-034095a808fb2017a9898f1fd7b1821a336fcf8c998700201de64b0f77abeb45 WatchSource:0}: Error finding container 034095a808fb2017a9898f1fd7b1821a336fcf8c998700201de64b0f77abeb45: Status 404 returned error can't find the container with id 034095a808fb2017a9898f1fd7b1821a336fcf8c998700201de64b0f77abeb45 Jan 22 14:03:20 crc kubenswrapper[4743]: I0122 14:03:20.559981 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-dbz7j"] Jan 22 14:03:20 crc kubenswrapper[4743]: I0122 14:03:20.666858 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-a292-account-create-update-zfn7t"] Jan 22 14:03:20 crc kubenswrapper[4743]: I0122 14:03:20.987976 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/338e196f-7c64-4cbd-b058-768ccb4c5df9-etc-swift\") pod \"swift-storage-0\" (UID: \"338e196f-7c64-4cbd-b058-768ccb4c5df9\") " pod="openstack/swift-storage-0" Jan 22 14:03:20 crc kubenswrapper[4743]: E0122 14:03:20.988145 4743 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 22 14:03:20 crc kubenswrapper[4743]: E0122 14:03:20.988283 4743 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 22 14:03:20 crc kubenswrapper[4743]: E0122 14:03:20.988358 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/338e196f-7c64-4cbd-b058-768ccb4c5df9-etc-swift podName:338e196f-7c64-4cbd-b058-768ccb4c5df9 nodeName:}" failed. No retries permitted until 2026-01-22 14:03:36.988330051 +0000 UTC m=+1053.543373214 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/338e196f-7c64-4cbd-b058-768ccb4c5df9-etc-swift") pod "swift-storage-0" (UID: "338e196f-7c64-4cbd-b058-768ccb4c5df9") : configmap "swift-ring-files" not found Jan 22 14:03:21 crc kubenswrapper[4743]: I0122 14:03:21.065087 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-27mjs"] Jan 22 14:03:21 crc kubenswrapper[4743]: I0122 14:03:21.066254 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-27mjs" Jan 22 14:03:21 crc kubenswrapper[4743]: I0122 14:03:21.079052 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-27mjs"] Jan 22 14:03:21 crc kubenswrapper[4743]: I0122 14:03:21.120512 4743 generic.go:334] "Generic (PLEG): container finished" podID="a0cba5b6-ee0a-46d1-8a11-d3d841aa820c" containerID="731fbfa7cd91f9bb40b16b67aabb061a127fac764176c44112eefde91d0003e5" exitCode=0 Jan 22 14:03:21 crc kubenswrapper[4743]: I0122 14:03:21.120628 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-pls6v" event={"ID":"a0cba5b6-ee0a-46d1-8a11-d3d841aa820c","Type":"ContainerDied","Data":"731fbfa7cd91f9bb40b16b67aabb061a127fac764176c44112eefde91d0003e5"} Jan 22 14:03:21 crc kubenswrapper[4743]: I0122 14:03:21.123147 4743 generic.go:334] "Generic (PLEG): container finished" podID="e2f9f2d7-4dc3-45e2-98a0-1058cb22cf46" containerID="50ce7f8f971e46ab830c65837f9e13d105ed15d285f6ae834e28e9fc5f361e8b" exitCode=0 Jan 22 14:03:21 crc kubenswrapper[4743]: I0122 14:03:21.123315 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-5zstz" event={"ID":"e2f9f2d7-4dc3-45e2-98a0-1058cb22cf46","Type":"ContainerDied","Data":"50ce7f8f971e46ab830c65837f9e13d105ed15d285f6ae834e28e9fc5f361e8b"} Jan 22 14:03:21 crc kubenswrapper[4743]: I0122 14:03:21.123352 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-5zstz" event={"ID":"e2f9f2d7-4dc3-45e2-98a0-1058cb22cf46","Type":"ContainerStarted","Data":"d01e30e5b932496584647926a27be342ce47c22ae1675c0e875cbf8a9b7866aa"} Jan 22 14:03:21 crc kubenswrapper[4743]: I0122 14:03:21.125233 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-8xtkr" event={"ID":"c806e874-b77d-4da6-8608-cbfac8bde50a","Type":"ContainerStarted","Data":"1e3b2d87b41ee7e7b367552372673bf101764faea25e45f65fe5e26a8cc02a1a"} Jan 22 14:03:21 crc kubenswrapper[4743]: I0122 14:03:21.125272 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-8xtkr" event={"ID":"c806e874-b77d-4da6-8608-cbfac8bde50a","Type":"ContainerStarted","Data":"034095a808fb2017a9898f1fd7b1821a336fcf8c998700201de64b0f77abeb45"} Jan 22 14:03:21 crc kubenswrapper[4743]: I0122 14:03:21.126865 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-a292-account-create-update-zfn7t" event={"ID":"668500b3-5335-4fd9-992b-0b0111284379","Type":"ContainerStarted","Data":"1949595cd193cf122023ac76733cceeebeed2e950d0688bbfa604de50157fb04"} Jan 22 14:03:21 crc kubenswrapper[4743]: I0122 14:03:21.126903 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-a292-account-create-update-zfn7t" event={"ID":"668500b3-5335-4fd9-992b-0b0111284379","Type":"ContainerStarted","Data":"6c0872e4ba16df77acb8685b1155c9de7ef4bf9af8daf12ba131fb940010f440"} Jan 22 14:03:21 crc kubenswrapper[4743]: I0122 14:03:21.131145 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-dbz7j" event={"ID":"94d531c9-e092-442a-8c4b-044fcf12ac9e","Type":"ContainerStarted","Data":"2d1e76033c4d9774442c9d17766c5b95d1c0c6a00ae5c8d3f2e3a3dcc1de4f10"} Jan 22 14:03:21 crc kubenswrapper[4743]: I0122 14:03:21.131201 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-dbz7j" 
event={"ID":"94d531c9-e092-442a-8c4b-044fcf12ac9e","Type":"ContainerStarted","Data":"d5fadacbbc95f6390529ade25c2b9a467bd3e20104605eca4c560bfc03a44548"} Jan 22 14:03:21 crc kubenswrapper[4743]: I0122 14:03:21.138398 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-12cb-account-create-update-c67vm" event={"ID":"dd89c7d8-3ba1-4fc1-95cd-0b0faffa30a9","Type":"ContainerStarted","Data":"dc09aecb4a3e842279a44df7d6af19e634645bb52573663bcc945e520a390975"} Jan 22 14:03:21 crc kubenswrapper[4743]: I0122 14:03:21.138462 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-12cb-account-create-update-c67vm" event={"ID":"dd89c7d8-3ba1-4fc1-95cd-0b0faffa30a9","Type":"ContainerStarted","Data":"54aefc434dd69c4eee97f4e25a1c1b16b77611dbfe3cffd3f257e6cdb298dc2b"} Jan 22 14:03:21 crc kubenswrapper[4743]: I0122 14:03:21.142650 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-1f94-account-create-update-2br2j" event={"ID":"7a0a522c-1c7c-4b39-94bf-d741ea969082","Type":"ContainerStarted","Data":"679fa311004f4d8dad216b47619656bc6856f50316dd8b4c3a1c27880d35db9b"} Jan 22 14:03:21 crc kubenswrapper[4743]: I0122 14:03:21.142710 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-1f94-account-create-update-2br2j" event={"ID":"7a0a522c-1c7c-4b39-94bf-d741ea969082","Type":"ContainerStarted","Data":"4b43d837ec918a565c002fde7d4c157eb2fd6ea6cc20bf99a1f380ae7baa9657"} Jan 22 14:03:21 crc kubenswrapper[4743]: I0122 14:03:21.171876 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-e41f-account-create-update-4rrgn"] Jan 22 14:03:21 crc kubenswrapper[4743]: I0122 14:03:21.173377 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-e41f-account-create-update-4rrgn" Jan 22 14:03:21 crc kubenswrapper[4743]: I0122 14:03:21.185885 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Jan 22 14:03:21 crc kubenswrapper[4743]: I0122 14:03:21.195153 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1d5430b-aee6-4ecd-a1fa-0f487a63b1bd-operator-scripts\") pod \"keystone-db-create-27mjs\" (UID: \"a1d5430b-aee6-4ecd-a1fa-0f487a63b1bd\") " pod="openstack/keystone-db-create-27mjs" Jan 22 14:03:21 crc kubenswrapper[4743]: I0122 14:03:21.195254 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8b64g\" (UniqueName: \"kubernetes.io/projected/a1d5430b-aee6-4ecd-a1fa-0f487a63b1bd-kube-api-access-8b64g\") pod \"keystone-db-create-27mjs\" (UID: \"a1d5430b-aee6-4ecd-a1fa-0f487a63b1bd\") " pod="openstack/keystone-db-create-27mjs" Jan 22 14:03:21 crc kubenswrapper[4743]: I0122 14:03:21.211891 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-e41f-account-create-update-4rrgn"] Jan 22 14:03:21 crc kubenswrapper[4743]: I0122 14:03:21.232169 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-a292-account-create-update-zfn7t" podStartSLOduration=2.2321466389999998 podStartE2EDuration="2.232146639s" podCreationTimestamp="2026-01-22 14:03:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:03:21.208342956 +0000 UTC m=+1037.763386119" watchObservedRunningTime="2026-01-22 14:03:21.232146639 +0000 
UTC m=+1037.787189812" Jan 22 14:03:21 crc kubenswrapper[4743]: I0122 14:03:21.287104 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-12cb-account-create-update-c67vm" podStartSLOduration=2.287090314 podStartE2EDuration="2.287090314s" podCreationTimestamp="2026-01-22 14:03:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:03:21.282761527 +0000 UTC m=+1037.837804690" watchObservedRunningTime="2026-01-22 14:03:21.287090314 +0000 UTC m=+1037.842133477" Jan 22 14:03:21 crc kubenswrapper[4743]: I0122 14:03:21.296760 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8b64g\" (UniqueName: \"kubernetes.io/projected/a1d5430b-aee6-4ecd-a1fa-0f487a63b1bd-kube-api-access-8b64g\") pod \"keystone-db-create-27mjs\" (UID: \"a1d5430b-aee6-4ecd-a1fa-0f487a63b1bd\") " pod="openstack/keystone-db-create-27mjs" Jan 22 14:03:21 crc kubenswrapper[4743]: I0122 14:03:21.296867 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43283afa-8819-4f03-90e3-d4aa575dec5a-operator-scripts\") pod \"keystone-e41f-account-create-update-4rrgn\" (UID: \"43283afa-8819-4f03-90e3-d4aa575dec5a\") " pod="openstack/keystone-e41f-account-create-update-4rrgn" Jan 22 14:03:21 crc kubenswrapper[4743]: I0122 14:03:21.296934 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjwrt\" (UniqueName: \"kubernetes.io/projected/43283afa-8819-4f03-90e3-d4aa575dec5a-kube-api-access-gjwrt\") pod \"keystone-e41f-account-create-update-4rrgn\" (UID: \"43283afa-8819-4f03-90e3-d4aa575dec5a\") " pod="openstack/keystone-e41f-account-create-update-4rrgn" Jan 22 14:03:21 crc kubenswrapper[4743]: I0122 14:03:21.297110 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1d5430b-aee6-4ecd-a1fa-0f487a63b1bd-operator-scripts\") pod \"keystone-db-create-27mjs\" (UID: \"a1d5430b-aee6-4ecd-a1fa-0f487a63b1bd\") " pod="openstack/keystone-db-create-27mjs" Jan 22 14:03:21 crc kubenswrapper[4743]: I0122 14:03:21.297871 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1d5430b-aee6-4ecd-a1fa-0f487a63b1bd-operator-scripts\") pod \"keystone-db-create-27mjs\" (UID: \"a1d5430b-aee6-4ecd-a1fa-0f487a63b1bd\") " pod="openstack/keystone-db-create-27mjs" Jan 22 14:03:21 crc kubenswrapper[4743]: I0122 14:03:21.312307 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-1f94-account-create-update-2br2j" podStartSLOduration=2.3122850440000002 podStartE2EDuration="2.312285044s" podCreationTimestamp="2026-01-22 14:03:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:03:21.299509489 +0000 UTC m=+1037.854552662" watchObservedRunningTime="2026-01-22 14:03:21.312285044 +0000 UTC m=+1037.867328217" Jan 22 14:03:21 crc kubenswrapper[4743]: I0122 14:03:21.327460 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8b64g\" (UniqueName: \"kubernetes.io/projected/a1d5430b-aee6-4ecd-a1fa-0f487a63b1bd-kube-api-access-8b64g\") pod \"keystone-db-create-27mjs\" (UID: 
\"a1d5430b-aee6-4ecd-a1fa-0f487a63b1bd\") " pod="openstack/keystone-db-create-27mjs" Jan 22 14:03:21 crc kubenswrapper[4743]: I0122 14:03:21.389379 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-27mjs" Jan 22 14:03:21 crc kubenswrapper[4743]: I0122 14:03:21.398782 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjwrt\" (UniqueName: \"kubernetes.io/projected/43283afa-8819-4f03-90e3-d4aa575dec5a-kube-api-access-gjwrt\") pod \"keystone-e41f-account-create-update-4rrgn\" (UID: \"43283afa-8819-4f03-90e3-d4aa575dec5a\") " pod="openstack/keystone-e41f-account-create-update-4rrgn" Jan 22 14:03:21 crc kubenswrapper[4743]: I0122 14:03:21.399001 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43283afa-8819-4f03-90e3-d4aa575dec5a-operator-scripts\") pod \"keystone-e41f-account-create-update-4rrgn\" (UID: \"43283afa-8819-4f03-90e3-d4aa575dec5a\") " pod="openstack/keystone-e41f-account-create-update-4rrgn" Jan 22 14:03:21 crc kubenswrapper[4743]: I0122 14:03:21.399869 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43283afa-8819-4f03-90e3-d4aa575dec5a-operator-scripts\") pod \"keystone-e41f-account-create-update-4rrgn\" (UID: \"43283afa-8819-4f03-90e3-d4aa575dec5a\") " pod="openstack/keystone-e41f-account-create-update-4rrgn" Jan 22 14:03:21 crc kubenswrapper[4743]: I0122 14:03:21.422025 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjwrt\" (UniqueName: \"kubernetes.io/projected/43283afa-8819-4f03-90e3-d4aa575dec5a-kube-api-access-gjwrt\") pod \"keystone-e41f-account-create-update-4rrgn\" (UID: \"43283afa-8819-4f03-90e3-d4aa575dec5a\") " pod="openstack/keystone-e41f-account-create-update-4rrgn" Jan 22 14:03:21 crc kubenswrapper[4743]: I0122 14:03:21.611815 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-e41f-account-create-update-4rrgn" Jan 22 14:03:21 crc kubenswrapper[4743]: I0122 14:03:21.854870 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-27mjs"] Jan 22 14:03:21 crc kubenswrapper[4743]: I0122 14:03:21.997772 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-nm9l6"] Jan 22 14:03:21 crc kubenswrapper[4743]: I0122 14:03:21.999777 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-nm9l6" Jan 22 14:03:22 crc kubenswrapper[4743]: I0122 14:03:22.002838 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-7dmrc" Jan 22 14:03:22 crc kubenswrapper[4743]: I0122 14:03:22.002947 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Jan 22 14:03:22 crc kubenswrapper[4743]: I0122 14:03:22.016609 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6p472\" (UniqueName: \"kubernetes.io/projected/d9518ef3-f251-4bf9-b45d-0f93876b2e7c-kube-api-access-6p472\") pod \"glance-db-sync-nm9l6\" (UID: \"d9518ef3-f251-4bf9-b45d-0f93876b2e7c\") " pod="openstack/glance-db-sync-nm9l6" Jan 22 14:03:22 crc kubenswrapper[4743]: I0122 14:03:22.017003 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9518ef3-f251-4bf9-b45d-0f93876b2e7c-combined-ca-bundle\") pod \"glance-db-sync-nm9l6\" (UID: \"d9518ef3-f251-4bf9-b45d-0f93876b2e7c\") " pod="openstack/glance-db-sync-nm9l6" Jan 22 14:03:22 crc kubenswrapper[4743]: I0122 14:03:22.017154 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d9518ef3-f251-4bf9-b45d-0f93876b2e7c-db-sync-config-data\") pod \"glance-db-sync-nm9l6\" (UID: \"d9518ef3-f251-4bf9-b45d-0f93876b2e7c\") " pod="openstack/glance-db-sync-nm9l6" Jan 22 14:03:22 crc kubenswrapper[4743]: I0122 14:03:22.017329 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9518ef3-f251-4bf9-b45d-0f93876b2e7c-config-data\") pod \"glance-db-sync-nm9l6\" (UID: \"d9518ef3-f251-4bf9-b45d-0f93876b2e7c\") " pod="openstack/glance-db-sync-nm9l6" Jan 22 14:03:22 crc kubenswrapper[4743]: I0122 14:03:22.026054 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-nm9l6"] Jan 22 14:03:22 crc kubenswrapper[4743]: I0122 14:03:22.119070 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6p472\" (UniqueName: \"kubernetes.io/projected/d9518ef3-f251-4bf9-b45d-0f93876b2e7c-kube-api-access-6p472\") pod \"glance-db-sync-nm9l6\" (UID: \"d9518ef3-f251-4bf9-b45d-0f93876b2e7c\") " pod="openstack/glance-db-sync-nm9l6" Jan 22 14:03:22 crc kubenswrapper[4743]: I0122 14:03:22.119471 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9518ef3-f251-4bf9-b45d-0f93876b2e7c-combined-ca-bundle\") pod \"glance-db-sync-nm9l6\" (UID: \"d9518ef3-f251-4bf9-b45d-0f93876b2e7c\") " pod="openstack/glance-db-sync-nm9l6" Jan 22 14:03:22 crc kubenswrapper[4743]: I0122 14:03:22.119609 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d9518ef3-f251-4bf9-b45d-0f93876b2e7c-db-sync-config-data\") pod \"glance-db-sync-nm9l6\" (UID: \"d9518ef3-f251-4bf9-b45d-0f93876b2e7c\") " pod="openstack/glance-db-sync-nm9l6" Jan 22 14:03:22 crc kubenswrapper[4743]: I0122 14:03:22.119749 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9518ef3-f251-4bf9-b45d-0f93876b2e7c-config-data\") pod 
\"glance-db-sync-nm9l6\" (UID: \"d9518ef3-f251-4bf9-b45d-0f93876b2e7c\") " pod="openstack/glance-db-sync-nm9l6" Jan 22 14:03:22 crc kubenswrapper[4743]: I0122 14:03:22.127387 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9518ef3-f251-4bf9-b45d-0f93876b2e7c-combined-ca-bundle\") pod \"glance-db-sync-nm9l6\" (UID: \"d9518ef3-f251-4bf9-b45d-0f93876b2e7c\") " pod="openstack/glance-db-sync-nm9l6" Jan 22 14:03:22 crc kubenswrapper[4743]: I0122 14:03:22.128295 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9518ef3-f251-4bf9-b45d-0f93876b2e7c-config-data\") pod \"glance-db-sync-nm9l6\" (UID: \"d9518ef3-f251-4bf9-b45d-0f93876b2e7c\") " pod="openstack/glance-db-sync-nm9l6" Jan 22 14:03:22 crc kubenswrapper[4743]: I0122 14:03:22.128326 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d9518ef3-f251-4bf9-b45d-0f93876b2e7c-db-sync-config-data\") pod \"glance-db-sync-nm9l6\" (UID: \"d9518ef3-f251-4bf9-b45d-0f93876b2e7c\") " pod="openstack/glance-db-sync-nm9l6" Jan 22 14:03:22 crc kubenswrapper[4743]: I0122 14:03:22.148532 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-e41f-account-create-update-4rrgn"] Jan 22 14:03:22 crc kubenswrapper[4743]: I0122 14:03:22.148584 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6p472\" (UniqueName: \"kubernetes.io/projected/d9518ef3-f251-4bf9-b45d-0f93876b2e7c-kube-api-access-6p472\") pod \"glance-db-sync-nm9l6\" (UID: \"d9518ef3-f251-4bf9-b45d-0f93876b2e7c\") " pod="openstack/glance-db-sync-nm9l6" Jan 22 14:03:22 crc kubenswrapper[4743]: I0122 14:03:22.154223 4743 generic.go:334] "Generic (PLEG): container finished" podID="c806e874-b77d-4da6-8608-cbfac8bde50a" containerID="1e3b2d87b41ee7e7b367552372673bf101764faea25e45f65fe5e26a8cc02a1a" exitCode=0 Jan 22 14:03:22 crc kubenswrapper[4743]: I0122 14:03:22.154302 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-8xtkr" event={"ID":"c806e874-b77d-4da6-8608-cbfac8bde50a","Type":"ContainerDied","Data":"1e3b2d87b41ee7e7b367552372673bf101764faea25e45f65fe5e26a8cc02a1a"} Jan 22 14:03:22 crc kubenswrapper[4743]: I0122 14:03:22.156853 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-27mjs" event={"ID":"a1d5430b-aee6-4ecd-a1fa-0f487a63b1bd","Type":"ContainerStarted","Data":"75dd059d200de49e276d1e6a2c754dcbc258ee37a2b0c93be0fd84aaa798fcf0"} Jan 22 14:03:22 crc kubenswrapper[4743]: I0122 14:03:22.156961 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-27mjs" event={"ID":"a1d5430b-aee6-4ecd-a1fa-0f487a63b1bd","Type":"ContainerStarted","Data":"91b4a2711c567af8944b47b7c6917113d1ee2ca59f4185d765ad034e49114081"} Jan 22 14:03:22 crc kubenswrapper[4743]: I0122 14:03:22.158466 4743 generic.go:334] "Generic (PLEG): container finished" podID="668500b3-5335-4fd9-992b-0b0111284379" containerID="1949595cd193cf122023ac76733cceeebeed2e950d0688bbfa604de50157fb04" exitCode=0 Jan 22 14:03:22 crc kubenswrapper[4743]: I0122 14:03:22.158821 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-a292-account-create-update-zfn7t" event={"ID":"668500b3-5335-4fd9-992b-0b0111284379","Type":"ContainerDied","Data":"1949595cd193cf122023ac76733cceeebeed2e950d0688bbfa604de50157fb04"} Jan 22 14:03:22 crc 
kubenswrapper[4743]: I0122 14:03:22.166249 4743 generic.go:334] "Generic (PLEG): container finished" podID="94d531c9-e092-442a-8c4b-044fcf12ac9e" containerID="2d1e76033c4d9774442c9d17766c5b95d1c0c6a00ae5c8d3f2e3a3dcc1de4f10" exitCode=0 Jan 22 14:03:22 crc kubenswrapper[4743]: I0122 14:03:22.166563 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-dbz7j" event={"ID":"94d531c9-e092-442a-8c4b-044fcf12ac9e","Type":"ContainerDied","Data":"2d1e76033c4d9774442c9d17766c5b95d1c0c6a00ae5c8d3f2e3a3dcc1de4f10"} Jan 22 14:03:22 crc kubenswrapper[4743]: I0122 14:03:22.169647 4743 generic.go:334] "Generic (PLEG): container finished" podID="dd89c7d8-3ba1-4fc1-95cd-0b0faffa30a9" containerID="dc09aecb4a3e842279a44df7d6af19e634645bb52573663bcc945e520a390975" exitCode=0 Jan 22 14:03:22 crc kubenswrapper[4743]: I0122 14:03:22.169862 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-12cb-account-create-update-c67vm" event={"ID":"dd89c7d8-3ba1-4fc1-95cd-0b0faffa30a9","Type":"ContainerDied","Data":"dc09aecb4a3e842279a44df7d6af19e634645bb52573663bcc945e520a390975"} Jan 22 14:03:22 crc kubenswrapper[4743]: I0122 14:03:22.172833 4743 generic.go:334] "Generic (PLEG): container finished" podID="7a0a522c-1c7c-4b39-94bf-d741ea969082" containerID="679fa311004f4d8dad216b47619656bc6856f50316dd8b4c3a1c27880d35db9b" exitCode=0 Jan 22 14:03:22 crc kubenswrapper[4743]: I0122 14:03:22.173176 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-1f94-account-create-update-2br2j" event={"ID":"7a0a522c-1c7c-4b39-94bf-d741ea969082","Type":"ContainerDied","Data":"679fa311004f4d8dad216b47619656bc6856f50316dd8b4c3a1c27880d35db9b"} Jan 22 14:03:22 crc kubenswrapper[4743]: I0122 14:03:22.185000 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-27mjs" podStartSLOduration=1.184967445 podStartE2EDuration="1.184967445s" podCreationTimestamp="2026-01-22 14:03:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:03:22.174121972 +0000 UTC m=+1038.729165135" watchObservedRunningTime="2026-01-22 14:03:22.184967445 +0000 UTC m=+1038.740010608" Jan 22 14:03:22 crc kubenswrapper[4743]: I0122 14:03:22.321738 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-nm9l6" Jan 22 14:03:22 crc kubenswrapper[4743]: I0122 14:03:22.643938 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-8xtkr" Jan 22 14:03:22 crc kubenswrapper[4743]: I0122 14:03:22.818311 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-pls6v" Jan 22 14:03:22 crc kubenswrapper[4743]: I0122 14:03:22.824488 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-dbz7j" Jan 22 14:03:22 crc kubenswrapper[4743]: I0122 14:03:22.831367 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-5zstz" Jan 22 14:03:22 crc kubenswrapper[4743]: I0122 14:03:22.838504 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wth2c\" (UniqueName: \"kubernetes.io/projected/c806e874-b77d-4da6-8608-cbfac8bde50a-kube-api-access-wth2c\") pod \"c806e874-b77d-4da6-8608-cbfac8bde50a\" (UID: \"c806e874-b77d-4da6-8608-cbfac8bde50a\") " Jan 22 14:03:22 crc kubenswrapper[4743]: I0122 14:03:22.838639 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c806e874-b77d-4da6-8608-cbfac8bde50a-operator-scripts\") pod \"c806e874-b77d-4da6-8608-cbfac8bde50a\" (UID: \"c806e874-b77d-4da6-8608-cbfac8bde50a\") " Jan 22 14:03:22 crc kubenswrapper[4743]: I0122 14:03:22.841152 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c806e874-b77d-4da6-8608-cbfac8bde50a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c806e874-b77d-4da6-8608-cbfac8bde50a" (UID: "c806e874-b77d-4da6-8608-cbfac8bde50a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:03:22 crc kubenswrapper[4743]: I0122 14:03:22.880417 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c806e874-b77d-4da6-8608-cbfac8bde50a-kube-api-access-wth2c" (OuterVolumeSpecName: "kube-api-access-wth2c") pod "c806e874-b77d-4da6-8608-cbfac8bde50a" (UID: "c806e874-b77d-4da6-8608-cbfac8bde50a"). InnerVolumeSpecName "kube-api-access-wth2c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:03:22 crc kubenswrapper[4743]: I0122 14:03:22.940028 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94d531c9-e092-442a-8c4b-044fcf12ac9e-operator-scripts\") pod \"94d531c9-e092-442a-8c4b-044fcf12ac9e\" (UID: \"94d531c9-e092-442a-8c4b-044fcf12ac9e\") " Jan 22 14:03:22 crc kubenswrapper[4743]: I0122 14:03:22.940158 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e2f9f2d7-4dc3-45e2-98a0-1058cb22cf46-operator-scripts\") pod \"e2f9f2d7-4dc3-45e2-98a0-1058cb22cf46\" (UID: \"e2f9f2d7-4dc3-45e2-98a0-1058cb22cf46\") " Jan 22 14:03:22 crc kubenswrapper[4743]: I0122 14:03:22.940182 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0cba5b6-ee0a-46d1-8a11-d3d841aa820c-operator-scripts\") pod \"a0cba5b6-ee0a-46d1-8a11-d3d841aa820c\" (UID: \"a0cba5b6-ee0a-46d1-8a11-d3d841aa820c\") " Jan 22 14:03:22 crc kubenswrapper[4743]: I0122 14:03:22.940223 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rr4n5\" (UniqueName: \"kubernetes.io/projected/e2f9f2d7-4dc3-45e2-98a0-1058cb22cf46-kube-api-access-rr4n5\") pod \"e2f9f2d7-4dc3-45e2-98a0-1058cb22cf46\" (UID: \"e2f9f2d7-4dc3-45e2-98a0-1058cb22cf46\") " Jan 22 14:03:22 crc kubenswrapper[4743]: I0122 14:03:22.940334 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzjxn\" (UniqueName: \"kubernetes.io/projected/94d531c9-e092-442a-8c4b-044fcf12ac9e-kube-api-access-qzjxn\") pod \"94d531c9-e092-442a-8c4b-044fcf12ac9e\" (UID: \"94d531c9-e092-442a-8c4b-044fcf12ac9e\") " Jan 22 14:03:22 crc kubenswrapper[4743]: 
I0122 14:03:22.940398 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2qt5\" (UniqueName: \"kubernetes.io/projected/a0cba5b6-ee0a-46d1-8a11-d3d841aa820c-kube-api-access-g2qt5\") pod \"a0cba5b6-ee0a-46d1-8a11-d3d841aa820c\" (UID: \"a0cba5b6-ee0a-46d1-8a11-d3d841aa820c\") " Jan 22 14:03:22 crc kubenswrapper[4743]: I0122 14:03:22.940817 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wth2c\" (UniqueName: \"kubernetes.io/projected/c806e874-b77d-4da6-8608-cbfac8bde50a-kube-api-access-wth2c\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:22 crc kubenswrapper[4743]: I0122 14:03:22.940834 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c806e874-b77d-4da6-8608-cbfac8bde50a-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:22 crc kubenswrapper[4743]: I0122 14:03:22.942457 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0cba5b6-ee0a-46d1-8a11-d3d841aa820c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a0cba5b6-ee0a-46d1-8a11-d3d841aa820c" (UID: "a0cba5b6-ee0a-46d1-8a11-d3d841aa820c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:03:22 crc kubenswrapper[4743]: I0122 14:03:22.942567 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2f9f2d7-4dc3-45e2-98a0-1058cb22cf46-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e2f9f2d7-4dc3-45e2-98a0-1058cb22cf46" (UID: "e2f9f2d7-4dc3-45e2-98a0-1058cb22cf46"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:03:22 crc kubenswrapper[4743]: I0122 14:03:22.945373 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94d531c9-e092-442a-8c4b-044fcf12ac9e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "94d531c9-e092-442a-8c4b-044fcf12ac9e" (UID: "94d531c9-e092-442a-8c4b-044fcf12ac9e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:03:22 crc kubenswrapper[4743]: I0122 14:03:22.945910 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94d531c9-e092-442a-8c4b-044fcf12ac9e-kube-api-access-qzjxn" (OuterVolumeSpecName: "kube-api-access-qzjxn") pod "94d531c9-e092-442a-8c4b-044fcf12ac9e" (UID: "94d531c9-e092-442a-8c4b-044fcf12ac9e"). InnerVolumeSpecName "kube-api-access-qzjxn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:03:22 crc kubenswrapper[4743]: I0122 14:03:22.946072 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2f9f2d7-4dc3-45e2-98a0-1058cb22cf46-kube-api-access-rr4n5" (OuterVolumeSpecName: "kube-api-access-rr4n5") pod "e2f9f2d7-4dc3-45e2-98a0-1058cb22cf46" (UID: "e2f9f2d7-4dc3-45e2-98a0-1058cb22cf46"). InnerVolumeSpecName "kube-api-access-rr4n5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:03:22 crc kubenswrapper[4743]: I0122 14:03:22.947648 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0cba5b6-ee0a-46d1-8a11-d3d841aa820c-kube-api-access-g2qt5" (OuterVolumeSpecName: "kube-api-access-g2qt5") pod "a0cba5b6-ee0a-46d1-8a11-d3d841aa820c" (UID: "a0cba5b6-ee0a-46d1-8a11-d3d841aa820c"). 
InnerVolumeSpecName "kube-api-access-g2qt5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:03:23 crc kubenswrapper[4743]: I0122 14:03:23.014688 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-nm9l6"] Jan 22 14:03:23 crc kubenswrapper[4743]: W0122 14:03:23.016997 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd9518ef3_f251_4bf9_b45d_0f93876b2e7c.slice/crio-840e475c0c9d2fcd80ba9b00b2865b4f6066f217271e3249f6cac6616af9ff98 WatchSource:0}: Error finding container 840e475c0c9d2fcd80ba9b00b2865b4f6066f217271e3249f6cac6616af9ff98: Status 404 returned error can't find the container with id 840e475c0c9d2fcd80ba9b00b2865b4f6066f217271e3249f6cac6616af9ff98 Jan 22 14:03:23 crc kubenswrapper[4743]: I0122 14:03:23.042470 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e2f9f2d7-4dc3-45e2-98a0-1058cb22cf46-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:23 crc kubenswrapper[4743]: I0122 14:03:23.042821 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0cba5b6-ee0a-46d1-8a11-d3d841aa820c-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:23 crc kubenswrapper[4743]: I0122 14:03:23.042838 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rr4n5\" (UniqueName: \"kubernetes.io/projected/e2f9f2d7-4dc3-45e2-98a0-1058cb22cf46-kube-api-access-rr4n5\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:23 crc kubenswrapper[4743]: I0122 14:03:23.042849 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzjxn\" (UniqueName: \"kubernetes.io/projected/94d531c9-e092-442a-8c4b-044fcf12ac9e-kube-api-access-qzjxn\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:23 crc kubenswrapper[4743]: I0122 14:03:23.042857 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2qt5\" (UniqueName: \"kubernetes.io/projected/a0cba5b6-ee0a-46d1-8a11-d3d841aa820c-kube-api-access-g2qt5\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:23 crc kubenswrapper[4743]: I0122 14:03:23.042865 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94d531c9-e092-442a-8c4b-044fcf12ac9e-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:23 crc kubenswrapper[4743]: I0122 14:03:23.182223 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-nm9l6" event={"ID":"d9518ef3-f251-4bf9-b45d-0f93876b2e7c","Type":"ContainerStarted","Data":"840e475c0c9d2fcd80ba9b00b2865b4f6066f217271e3249f6cac6616af9ff98"} Jan 22 14:03:23 crc kubenswrapper[4743]: I0122 14:03:23.184437 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-dbz7j" event={"ID":"94d531c9-e092-442a-8c4b-044fcf12ac9e","Type":"ContainerDied","Data":"d5fadacbbc95f6390529ade25c2b9a467bd3e20104605eca4c560bfc03a44548"} Jan 22 14:03:23 crc kubenswrapper[4743]: I0122 14:03:23.184486 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5fadacbbc95f6390529ade25c2b9a467bd3e20104605eca4c560bfc03a44548" Jan 22 14:03:23 crc kubenswrapper[4743]: I0122 14:03:23.184557 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-dbz7j" Jan 22 14:03:23 crc kubenswrapper[4743]: I0122 14:03:23.193657 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-pls6v" Jan 22 14:03:23 crc kubenswrapper[4743]: I0122 14:03:23.193686 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-pls6v" event={"ID":"a0cba5b6-ee0a-46d1-8a11-d3d841aa820c","Type":"ContainerDied","Data":"9dc9a61da3ba506fe53814cad3f9e90b22046971114d7c87d9fd822954d93431"} Jan 22 14:03:23 crc kubenswrapper[4743]: I0122 14:03:23.193727 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9dc9a61da3ba506fe53814cad3f9e90b22046971114d7c87d9fd822954d93431" Jan 22 14:03:23 crc kubenswrapper[4743]: I0122 14:03:23.195047 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-5zstz" event={"ID":"e2f9f2d7-4dc3-45e2-98a0-1058cb22cf46","Type":"ContainerDied","Data":"d01e30e5b932496584647926a27be342ce47c22ae1675c0e875cbf8a9b7866aa"} Jan 22 14:03:23 crc kubenswrapper[4743]: I0122 14:03:23.195080 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d01e30e5b932496584647926a27be342ce47c22ae1675c0e875cbf8a9b7866aa" Jan 22 14:03:23 crc kubenswrapper[4743]: I0122 14:03:23.195124 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-5zstz" Jan 22 14:03:23 crc kubenswrapper[4743]: I0122 14:03:23.197556 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-8xtkr" event={"ID":"c806e874-b77d-4da6-8608-cbfac8bde50a","Type":"ContainerDied","Data":"034095a808fb2017a9898f1fd7b1821a336fcf8c998700201de64b0f77abeb45"} Jan 22 14:03:23 crc kubenswrapper[4743]: I0122 14:03:23.197830 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="034095a808fb2017a9898f1fd7b1821a336fcf8c998700201de64b0f77abeb45" Jan 22 14:03:23 crc kubenswrapper[4743]: I0122 14:03:23.197905 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-8xtkr" Jan 22 14:03:23 crc kubenswrapper[4743]: I0122 14:03:23.199326 4743 generic.go:334] "Generic (PLEG): container finished" podID="43283afa-8819-4f03-90e3-d4aa575dec5a" containerID="321cbe6ceb2075eae0b98bbf659f9b3c57f058cc02bfbe5e97506a5756b810b7" exitCode=0 Jan 22 14:03:23 crc kubenswrapper[4743]: I0122 14:03:23.199420 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-e41f-account-create-update-4rrgn" event={"ID":"43283afa-8819-4f03-90e3-d4aa575dec5a","Type":"ContainerDied","Data":"321cbe6ceb2075eae0b98bbf659f9b3c57f058cc02bfbe5e97506a5756b810b7"} Jan 22 14:03:23 crc kubenswrapper[4743]: I0122 14:03:23.199455 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-e41f-account-create-update-4rrgn" event={"ID":"43283afa-8819-4f03-90e3-d4aa575dec5a","Type":"ContainerStarted","Data":"0a2faf51d5009babd2b2c41f3dc93af618c1e1c30383a8f9025709e94cae89d5"} Jan 22 14:03:23 crc kubenswrapper[4743]: I0122 14:03:23.200846 4743 generic.go:334] "Generic (PLEG): container finished" podID="a1d5430b-aee6-4ecd-a1fa-0f487a63b1bd" containerID="75dd059d200de49e276d1e6a2c754dcbc258ee37a2b0c93be0fd84aaa798fcf0" exitCode=0 Jan 22 14:03:23 crc kubenswrapper[4743]: I0122 14:03:23.201062 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-27mjs" event={"ID":"a1d5430b-aee6-4ecd-a1fa-0f487a63b1bd","Type":"ContainerDied","Data":"75dd059d200de49e276d1e6a2c754dcbc258ee37a2b0c93be0fd84aaa798fcf0"} Jan 22 14:03:23 crc kubenswrapper[4743]: E0122 14:03:23.428356 4743 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.53:53530->38.102.83.53:40709: write tcp 38.102.83.53:53530->38.102.83.53:40709: write: broken pipe Jan 22 14:03:23 crc kubenswrapper[4743]: I0122 14:03:23.615218 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-a292-account-create-update-zfn7t" Jan 22 14:03:23 crc kubenswrapper[4743]: I0122 14:03:23.659340 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/668500b3-5335-4fd9-992b-0b0111284379-operator-scripts\") pod \"668500b3-5335-4fd9-992b-0b0111284379\" (UID: \"668500b3-5335-4fd9-992b-0b0111284379\") " Jan 22 14:03:23 crc kubenswrapper[4743]: I0122 14:03:23.659495 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrhx4\" (UniqueName: \"kubernetes.io/projected/668500b3-5335-4fd9-992b-0b0111284379-kube-api-access-qrhx4\") pod \"668500b3-5335-4fd9-992b-0b0111284379\" (UID: \"668500b3-5335-4fd9-992b-0b0111284379\") " Jan 22 14:03:23 crc kubenswrapper[4743]: I0122 14:03:23.659947 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/668500b3-5335-4fd9-992b-0b0111284379-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "668500b3-5335-4fd9-992b-0b0111284379" (UID: "668500b3-5335-4fd9-992b-0b0111284379"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:03:23 crc kubenswrapper[4743]: I0122 14:03:23.665968 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/668500b3-5335-4fd9-992b-0b0111284379-kube-api-access-qrhx4" (OuterVolumeSpecName: "kube-api-access-qrhx4") pod "668500b3-5335-4fd9-992b-0b0111284379" (UID: "668500b3-5335-4fd9-992b-0b0111284379"). 
InnerVolumeSpecName "kube-api-access-qrhx4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:03:23 crc kubenswrapper[4743]: I0122 14:03:23.724168 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-12cb-account-create-update-c67vm" Jan 22 14:03:23 crc kubenswrapper[4743]: I0122 14:03:23.729745 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-1f94-account-create-update-2br2j" Jan 22 14:03:23 crc kubenswrapper[4743]: I0122 14:03:23.760175 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a0a522c-1c7c-4b39-94bf-d741ea969082-operator-scripts\") pod \"7a0a522c-1c7c-4b39-94bf-d741ea969082\" (UID: \"7a0a522c-1c7c-4b39-94bf-d741ea969082\") " Jan 22 14:03:23 crc kubenswrapper[4743]: I0122 14:03:23.760241 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkqfz\" (UniqueName: \"kubernetes.io/projected/dd89c7d8-3ba1-4fc1-95cd-0b0faffa30a9-kube-api-access-vkqfz\") pod \"dd89c7d8-3ba1-4fc1-95cd-0b0faffa30a9\" (UID: \"dd89c7d8-3ba1-4fc1-95cd-0b0faffa30a9\") " Jan 22 14:03:23 crc kubenswrapper[4743]: I0122 14:03:23.760312 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz5m9\" (UniqueName: \"kubernetes.io/projected/7a0a522c-1c7c-4b39-94bf-d741ea969082-kube-api-access-lz5m9\") pod \"7a0a522c-1c7c-4b39-94bf-d741ea969082\" (UID: \"7a0a522c-1c7c-4b39-94bf-d741ea969082\") " Jan 22 14:03:23 crc kubenswrapper[4743]: I0122 14:03:23.760355 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd89c7d8-3ba1-4fc1-95cd-0b0faffa30a9-operator-scripts\") pod \"dd89c7d8-3ba1-4fc1-95cd-0b0faffa30a9\" (UID: \"dd89c7d8-3ba1-4fc1-95cd-0b0faffa30a9\") " Jan 22 14:03:23 crc kubenswrapper[4743]: I0122 14:03:23.760635 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/668500b3-5335-4fd9-992b-0b0111284379-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:23 crc kubenswrapper[4743]: I0122 14:03:23.760650 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrhx4\" (UniqueName: \"kubernetes.io/projected/668500b3-5335-4fd9-992b-0b0111284379-kube-api-access-qrhx4\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:23 crc kubenswrapper[4743]: I0122 14:03:23.762686 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a0a522c-1c7c-4b39-94bf-d741ea969082-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7a0a522c-1c7c-4b39-94bf-d741ea969082" (UID: "7a0a522c-1c7c-4b39-94bf-d741ea969082"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:03:23 crc kubenswrapper[4743]: I0122 14:03:23.765397 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd89c7d8-3ba1-4fc1-95cd-0b0faffa30a9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dd89c7d8-3ba1-4fc1-95cd-0b0faffa30a9" (UID: "dd89c7d8-3ba1-4fc1-95cd-0b0faffa30a9"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:03:23 crc kubenswrapper[4743]: I0122 14:03:23.766547 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a0a522c-1c7c-4b39-94bf-d741ea969082-kube-api-access-lz5m9" (OuterVolumeSpecName: "kube-api-access-lz5m9") pod "7a0a522c-1c7c-4b39-94bf-d741ea969082" (UID: "7a0a522c-1c7c-4b39-94bf-d741ea969082"). InnerVolumeSpecName "kube-api-access-lz5m9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:03:23 crc kubenswrapper[4743]: I0122 14:03:23.772171 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd89c7d8-3ba1-4fc1-95cd-0b0faffa30a9-kube-api-access-vkqfz" (OuterVolumeSpecName: "kube-api-access-vkqfz") pod "dd89c7d8-3ba1-4fc1-95cd-0b0faffa30a9" (UID: "dd89c7d8-3ba1-4fc1-95cd-0b0faffa30a9"). InnerVolumeSpecName "kube-api-access-vkqfz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:03:23 crc kubenswrapper[4743]: I0122 14:03:23.862847 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz5m9\" (UniqueName: \"kubernetes.io/projected/7a0a522c-1c7c-4b39-94bf-d741ea969082-kube-api-access-lz5m9\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:23 crc kubenswrapper[4743]: I0122 14:03:23.862881 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd89c7d8-3ba1-4fc1-95cd-0b0faffa30a9-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:23 crc kubenswrapper[4743]: I0122 14:03:23.862891 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a0a522c-1c7c-4b39-94bf-d741ea969082-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:23 crc kubenswrapper[4743]: I0122 14:03:23.862900 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vkqfz\" (UniqueName: \"kubernetes.io/projected/dd89c7d8-3ba1-4fc1-95cd-0b0faffa30a9-kube-api-access-vkqfz\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:24 crc kubenswrapper[4743]: I0122 14:03:24.215237 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-a292-account-create-update-zfn7t" event={"ID":"668500b3-5335-4fd9-992b-0b0111284379","Type":"ContainerDied","Data":"6c0872e4ba16df77acb8685b1155c9de7ef4bf9af8daf12ba131fb940010f440"} Jan 22 14:03:24 crc kubenswrapper[4743]: I0122 14:03:24.215552 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c0872e4ba16df77acb8685b1155c9de7ef4bf9af8daf12ba131fb940010f440" Jan 22 14:03:24 crc kubenswrapper[4743]: I0122 14:03:24.215279 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-a292-account-create-update-zfn7t" Jan 22 14:03:24 crc kubenswrapper[4743]: I0122 14:03:24.216835 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-12cb-account-create-update-c67vm" event={"ID":"dd89c7d8-3ba1-4fc1-95cd-0b0faffa30a9","Type":"ContainerDied","Data":"54aefc434dd69c4eee97f4e25a1c1b16b77611dbfe3cffd3f257e6cdb298dc2b"} Jan 22 14:03:24 crc kubenswrapper[4743]: I0122 14:03:24.216860 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54aefc434dd69c4eee97f4e25a1c1b16b77611dbfe3cffd3f257e6cdb298dc2b" Jan 22 14:03:24 crc kubenswrapper[4743]: I0122 14:03:24.216878 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-12cb-account-create-update-c67vm" Jan 22 14:03:24 crc kubenswrapper[4743]: I0122 14:03:24.218900 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-1f94-account-create-update-2br2j" event={"ID":"7a0a522c-1c7c-4b39-94bf-d741ea969082","Type":"ContainerDied","Data":"4b43d837ec918a565c002fde7d4c157eb2fd6ea6cc20bf99a1f380ae7baa9657"} Jan 22 14:03:24 crc kubenswrapper[4743]: I0122 14:03:24.218926 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b43d837ec918a565c002fde7d4c157eb2fd6ea6cc20bf99a1f380ae7baa9657" Jan 22 14:03:24 crc kubenswrapper[4743]: I0122 14:03:24.219134 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-1f94-account-create-update-2br2j" Jan 22 14:03:24 crc kubenswrapper[4743]: I0122 14:03:24.615320 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-e41f-account-create-update-4rrgn" Jan 22 14:03:24 crc kubenswrapper[4743]: I0122 14:03:24.630116 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-27mjs" Jan 22 14:03:24 crc kubenswrapper[4743]: I0122 14:03:24.776511 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8b64g\" (UniqueName: \"kubernetes.io/projected/a1d5430b-aee6-4ecd-a1fa-0f487a63b1bd-kube-api-access-8b64g\") pod \"a1d5430b-aee6-4ecd-a1fa-0f487a63b1bd\" (UID: \"a1d5430b-aee6-4ecd-a1fa-0f487a63b1bd\") " Jan 22 14:03:24 crc kubenswrapper[4743]: I0122 14:03:24.776613 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43283afa-8819-4f03-90e3-d4aa575dec5a-operator-scripts\") pod \"43283afa-8819-4f03-90e3-d4aa575dec5a\" (UID: \"43283afa-8819-4f03-90e3-d4aa575dec5a\") " Jan 22 14:03:24 crc kubenswrapper[4743]: I0122 14:03:24.776648 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjwrt\" (UniqueName: \"kubernetes.io/projected/43283afa-8819-4f03-90e3-d4aa575dec5a-kube-api-access-gjwrt\") pod \"43283afa-8819-4f03-90e3-d4aa575dec5a\" (UID: \"43283afa-8819-4f03-90e3-d4aa575dec5a\") " Jan 22 14:03:24 crc kubenswrapper[4743]: I0122 14:03:24.776724 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1d5430b-aee6-4ecd-a1fa-0f487a63b1bd-operator-scripts\") pod \"a1d5430b-aee6-4ecd-a1fa-0f487a63b1bd\" (UID: \"a1d5430b-aee6-4ecd-a1fa-0f487a63b1bd\") " Jan 22 14:03:24 crc kubenswrapper[4743]: I0122 14:03:24.778142 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1d5430b-aee6-4ecd-a1fa-0f487a63b1bd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a1d5430b-aee6-4ecd-a1fa-0f487a63b1bd" (UID: "a1d5430b-aee6-4ecd-a1fa-0f487a63b1bd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:03:24 crc kubenswrapper[4743]: I0122 14:03:24.778228 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43283afa-8819-4f03-90e3-d4aa575dec5a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "43283afa-8819-4f03-90e3-d4aa575dec5a" (UID: "43283afa-8819-4f03-90e3-d4aa575dec5a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:03:24 crc kubenswrapper[4743]: I0122 14:03:24.800417 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1d5430b-aee6-4ecd-a1fa-0f487a63b1bd-kube-api-access-8b64g" (OuterVolumeSpecName: "kube-api-access-8b64g") pod "a1d5430b-aee6-4ecd-a1fa-0f487a63b1bd" (UID: "a1d5430b-aee6-4ecd-a1fa-0f487a63b1bd"). InnerVolumeSpecName "kube-api-access-8b64g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:03:24 crc kubenswrapper[4743]: I0122 14:03:24.800549 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43283afa-8819-4f03-90e3-d4aa575dec5a-kube-api-access-gjwrt" (OuterVolumeSpecName: "kube-api-access-gjwrt") pod "43283afa-8819-4f03-90e3-d4aa575dec5a" (UID: "43283afa-8819-4f03-90e3-d4aa575dec5a"). InnerVolumeSpecName "kube-api-access-gjwrt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:03:24 crc kubenswrapper[4743]: I0122 14:03:24.879145 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8b64g\" (UniqueName: \"kubernetes.io/projected/a1d5430b-aee6-4ecd-a1fa-0f487a63b1bd-kube-api-access-8b64g\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:24 crc kubenswrapper[4743]: I0122 14:03:24.879485 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43283afa-8819-4f03-90e3-d4aa575dec5a-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:24 crc kubenswrapper[4743]: I0122 14:03:24.879500 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjwrt\" (UniqueName: \"kubernetes.io/projected/43283afa-8819-4f03-90e3-d4aa575dec5a-kube-api-access-gjwrt\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:24 crc kubenswrapper[4743]: I0122 14:03:24.879513 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1d5430b-aee6-4ecd-a1fa-0f487a63b1bd-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:25 crc kubenswrapper[4743]: I0122 14:03:25.231132 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-e41f-account-create-update-4rrgn" Jan 22 14:03:25 crc kubenswrapper[4743]: I0122 14:03:25.231108 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-e41f-account-create-update-4rrgn" event={"ID":"43283afa-8819-4f03-90e3-d4aa575dec5a","Type":"ContainerDied","Data":"0a2faf51d5009babd2b2c41f3dc93af618c1e1c30383a8f9025709e94cae89d5"} Jan 22 14:03:25 crc kubenswrapper[4743]: I0122 14:03:25.231283 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a2faf51d5009babd2b2c41f3dc93af618c1e1c30383a8f9025709e94cae89d5" Jan 22 14:03:25 crc kubenswrapper[4743]: I0122 14:03:25.234024 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-27mjs" event={"ID":"a1d5430b-aee6-4ecd-a1fa-0f487a63b1bd","Type":"ContainerDied","Data":"91b4a2711c567af8944b47b7c6917113d1ee2ca59f4185d765ad034e49114081"} Jan 22 14:03:25 crc kubenswrapper[4743]: I0122 14:03:25.234058 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91b4a2711c567af8944b47b7c6917113d1ee2ca59f4185d765ad034e49114081" Jan 22 14:03:25 crc kubenswrapper[4743]: I0122 14:03:25.234133 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-27mjs" Jan 22 14:03:26 crc kubenswrapper[4743]: I0122 14:03:26.649122 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-hvtzl"] Jan 22 14:03:26 crc kubenswrapper[4743]: E0122 14:03:26.649454 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94d531c9-e092-442a-8c4b-044fcf12ac9e" containerName="mariadb-database-create" Jan 22 14:03:26 crc kubenswrapper[4743]: I0122 14:03:26.649466 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="94d531c9-e092-442a-8c4b-044fcf12ac9e" containerName="mariadb-database-create" Jan 22 14:03:26 crc kubenswrapper[4743]: E0122 14:03:26.649479 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2f9f2d7-4dc3-45e2-98a0-1058cb22cf46" containerName="mariadb-database-create" Jan 22 14:03:26 crc kubenswrapper[4743]: I0122 14:03:26.649486 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2f9f2d7-4dc3-45e2-98a0-1058cb22cf46" containerName="mariadb-database-create" Jan 22 14:03:26 crc kubenswrapper[4743]: E0122 14:03:26.649495 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a0a522c-1c7c-4b39-94bf-d741ea969082" containerName="mariadb-account-create-update" Jan 22 14:03:26 crc kubenswrapper[4743]: I0122 14:03:26.649501 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a0a522c-1c7c-4b39-94bf-d741ea969082" containerName="mariadb-account-create-update" Jan 22 14:03:26 crc kubenswrapper[4743]: E0122 14:03:26.649508 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd89c7d8-3ba1-4fc1-95cd-0b0faffa30a9" containerName="mariadb-account-create-update" Jan 22 14:03:26 crc kubenswrapper[4743]: I0122 14:03:26.649514 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd89c7d8-3ba1-4fc1-95cd-0b0faffa30a9" containerName="mariadb-account-create-update" Jan 22 14:03:26 crc kubenswrapper[4743]: E0122 14:03:26.649523 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c806e874-b77d-4da6-8608-cbfac8bde50a" containerName="mariadb-database-create" Jan 22 14:03:26 crc kubenswrapper[4743]: I0122 14:03:26.649529 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="c806e874-b77d-4da6-8608-cbfac8bde50a" containerName="mariadb-database-create" Jan 22 14:03:26 crc kubenswrapper[4743]: E0122 14:03:26.649540 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43283afa-8819-4f03-90e3-d4aa575dec5a" containerName="mariadb-account-create-update" Jan 22 14:03:26 crc kubenswrapper[4743]: I0122 14:03:26.649545 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="43283afa-8819-4f03-90e3-d4aa575dec5a" containerName="mariadb-account-create-update" Jan 22 14:03:26 crc kubenswrapper[4743]: E0122 14:03:26.649554 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="668500b3-5335-4fd9-992b-0b0111284379" containerName="mariadb-account-create-update" Jan 22 14:03:26 crc kubenswrapper[4743]: I0122 14:03:26.649560 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="668500b3-5335-4fd9-992b-0b0111284379" containerName="mariadb-account-create-update" Jan 22 14:03:26 crc kubenswrapper[4743]: E0122 14:03:26.649569 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1d5430b-aee6-4ecd-a1fa-0f487a63b1bd" containerName="mariadb-database-create" Jan 22 14:03:26 crc kubenswrapper[4743]: I0122 14:03:26.649574 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1d5430b-aee6-4ecd-a1fa-0f487a63b1bd" 
containerName="mariadb-database-create" Jan 22 14:03:26 crc kubenswrapper[4743]: E0122 14:03:26.649590 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0cba5b6-ee0a-46d1-8a11-d3d841aa820c" containerName="mariadb-account-create-update" Jan 22 14:03:26 crc kubenswrapper[4743]: I0122 14:03:26.649596 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0cba5b6-ee0a-46d1-8a11-d3d841aa820c" containerName="mariadb-account-create-update" Jan 22 14:03:26 crc kubenswrapper[4743]: I0122 14:03:26.649858 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="c806e874-b77d-4da6-8608-cbfac8bde50a" containerName="mariadb-database-create" Jan 22 14:03:26 crc kubenswrapper[4743]: I0122 14:03:26.649875 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2f9f2d7-4dc3-45e2-98a0-1058cb22cf46" containerName="mariadb-database-create" Jan 22 14:03:26 crc kubenswrapper[4743]: I0122 14:03:26.649882 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="94d531c9-e092-442a-8c4b-044fcf12ac9e" containerName="mariadb-database-create" Jan 22 14:03:26 crc kubenswrapper[4743]: I0122 14:03:26.649892 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd89c7d8-3ba1-4fc1-95cd-0b0faffa30a9" containerName="mariadb-account-create-update" Jan 22 14:03:26 crc kubenswrapper[4743]: I0122 14:03:26.649902 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1d5430b-aee6-4ecd-a1fa-0f487a63b1bd" containerName="mariadb-database-create" Jan 22 14:03:26 crc kubenswrapper[4743]: I0122 14:03:26.649911 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="43283afa-8819-4f03-90e3-d4aa575dec5a" containerName="mariadb-account-create-update" Jan 22 14:03:26 crc kubenswrapper[4743]: I0122 14:03:26.649918 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a0a522c-1c7c-4b39-94bf-d741ea969082" containerName="mariadb-account-create-update" Jan 22 14:03:26 crc kubenswrapper[4743]: I0122 14:03:26.649927 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0cba5b6-ee0a-46d1-8a11-d3d841aa820c" containerName="mariadb-account-create-update" Jan 22 14:03:26 crc kubenswrapper[4743]: I0122 14:03:26.649936 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="668500b3-5335-4fd9-992b-0b0111284379" containerName="mariadb-account-create-update" Jan 22 14:03:26 crc kubenswrapper[4743]: I0122 14:03:26.650430 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-hvtzl" Jan 22 14:03:26 crc kubenswrapper[4743]: I0122 14:03:26.655357 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 22 14:03:26 crc kubenswrapper[4743]: I0122 14:03:26.655437 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 22 14:03:26 crc kubenswrapper[4743]: I0122 14:03:26.655489 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-48lf7" Jan 22 14:03:26 crc kubenswrapper[4743]: I0122 14:03:26.655510 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 22 14:03:26 crc kubenswrapper[4743]: I0122 14:03:26.665628 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-hvtzl"] Jan 22 14:03:26 crc kubenswrapper[4743]: I0122 14:03:26.811187 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6nd9\" (UniqueName: \"kubernetes.io/projected/c13edc29-cd06-4113-8366-75a41988c89f-kube-api-access-d6nd9\") pod \"keystone-db-sync-hvtzl\" (UID: \"c13edc29-cd06-4113-8366-75a41988c89f\") " pod="openstack/keystone-db-sync-hvtzl" Jan 22 14:03:26 crc kubenswrapper[4743]: I0122 14:03:26.811556 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c13edc29-cd06-4113-8366-75a41988c89f-config-data\") pod \"keystone-db-sync-hvtzl\" (UID: \"c13edc29-cd06-4113-8366-75a41988c89f\") " pod="openstack/keystone-db-sync-hvtzl" Jan 22 14:03:26 crc kubenswrapper[4743]: I0122 14:03:26.811595 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c13edc29-cd06-4113-8366-75a41988c89f-combined-ca-bundle\") pod \"keystone-db-sync-hvtzl\" (UID: \"c13edc29-cd06-4113-8366-75a41988c89f\") " pod="openstack/keystone-db-sync-hvtzl" Jan 22 14:03:26 crc kubenswrapper[4743]: I0122 14:03:26.913396 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c13edc29-cd06-4113-8366-75a41988c89f-config-data\") pod \"keystone-db-sync-hvtzl\" (UID: \"c13edc29-cd06-4113-8366-75a41988c89f\") " pod="openstack/keystone-db-sync-hvtzl" Jan 22 14:03:26 crc kubenswrapper[4743]: I0122 14:03:26.913447 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c13edc29-cd06-4113-8366-75a41988c89f-combined-ca-bundle\") pod \"keystone-db-sync-hvtzl\" (UID: \"c13edc29-cd06-4113-8366-75a41988c89f\") " pod="openstack/keystone-db-sync-hvtzl" Jan 22 14:03:26 crc kubenswrapper[4743]: I0122 14:03:26.913576 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6nd9\" (UniqueName: \"kubernetes.io/projected/c13edc29-cd06-4113-8366-75a41988c89f-kube-api-access-d6nd9\") pod \"keystone-db-sync-hvtzl\" (UID: \"c13edc29-cd06-4113-8366-75a41988c89f\") " pod="openstack/keystone-db-sync-hvtzl" Jan 22 14:03:26 crc kubenswrapper[4743]: I0122 14:03:26.919049 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c13edc29-cd06-4113-8366-75a41988c89f-combined-ca-bundle\") pod \"keystone-db-sync-hvtzl\" (UID: \"c13edc29-cd06-4113-8366-75a41988c89f\") " 
pod="openstack/keystone-db-sync-hvtzl" Jan 22 14:03:26 crc kubenswrapper[4743]: I0122 14:03:26.921419 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c13edc29-cd06-4113-8366-75a41988c89f-config-data\") pod \"keystone-db-sync-hvtzl\" (UID: \"c13edc29-cd06-4113-8366-75a41988c89f\") " pod="openstack/keystone-db-sync-hvtzl" Jan 22 14:03:26 crc kubenswrapper[4743]: I0122 14:03:26.929858 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6nd9\" (UniqueName: \"kubernetes.io/projected/c13edc29-cd06-4113-8366-75a41988c89f-kube-api-access-d6nd9\") pod \"keystone-db-sync-hvtzl\" (UID: \"c13edc29-cd06-4113-8366-75a41988c89f\") " pod="openstack/keystone-db-sync-hvtzl" Jan 22 14:03:26 crc kubenswrapper[4743]: I0122 14:03:26.982627 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-hvtzl" Jan 22 14:03:27 crc kubenswrapper[4743]: I0122 14:03:27.055954 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-m22h5" podUID="f3551792-b862-492e-8c36-e0a63cd4468f" containerName="ovn-controller" probeResult="failure" output=< Jan 22 14:03:27 crc kubenswrapper[4743]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Jan 22 14:03:27 crc kubenswrapper[4743]: > Jan 22 14:03:27 crc kubenswrapper[4743]: I0122 14:03:27.173287 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-rmfgh" Jan 22 14:03:27 crc kubenswrapper[4743]: I0122 14:03:27.179986 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-rmfgh" Jan 22 14:03:27 crc kubenswrapper[4743]: I0122 14:03:27.415200 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-m22h5-config-mzvrv"] Jan 22 14:03:27 crc kubenswrapper[4743]: I0122 14:03:27.416318 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-m22h5-config-mzvrv" Jan 22 14:03:27 crc kubenswrapper[4743]: I0122 14:03:27.433646 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-m22h5-config-mzvrv"] Jan 22 14:03:27 crc kubenswrapper[4743]: I0122 14:03:27.433994 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Jan 22 14:03:27 crc kubenswrapper[4743]: I0122 14:03:27.455738 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/088a2be6-5ba4-4104-b734-30b0931fd1b8-scripts\") pod \"ovn-controller-m22h5-config-mzvrv\" (UID: \"088a2be6-5ba4-4104-b734-30b0931fd1b8\") " pod="openstack/ovn-controller-m22h5-config-mzvrv" Jan 22 14:03:27 crc kubenswrapper[4743]: I0122 14:03:27.455798 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zf9x8\" (UniqueName: \"kubernetes.io/projected/088a2be6-5ba4-4104-b734-30b0931fd1b8-kube-api-access-zf9x8\") pod \"ovn-controller-m22h5-config-mzvrv\" (UID: \"088a2be6-5ba4-4104-b734-30b0931fd1b8\") " pod="openstack/ovn-controller-m22h5-config-mzvrv" Jan 22 14:03:27 crc kubenswrapper[4743]: I0122 14:03:27.455834 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/088a2be6-5ba4-4104-b734-30b0931fd1b8-var-run\") pod \"ovn-controller-m22h5-config-mzvrv\" (UID: \"088a2be6-5ba4-4104-b734-30b0931fd1b8\") " pod="openstack/ovn-controller-m22h5-config-mzvrv" Jan 22 14:03:27 crc kubenswrapper[4743]: I0122 14:03:27.455876 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/088a2be6-5ba4-4104-b734-30b0931fd1b8-var-run-ovn\") pod \"ovn-controller-m22h5-config-mzvrv\" (UID: \"088a2be6-5ba4-4104-b734-30b0931fd1b8\") " pod="openstack/ovn-controller-m22h5-config-mzvrv" Jan 22 14:03:27 crc kubenswrapper[4743]: I0122 14:03:27.455910 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/088a2be6-5ba4-4104-b734-30b0931fd1b8-additional-scripts\") pod \"ovn-controller-m22h5-config-mzvrv\" (UID: \"088a2be6-5ba4-4104-b734-30b0931fd1b8\") " pod="openstack/ovn-controller-m22h5-config-mzvrv" Jan 22 14:03:27 crc kubenswrapper[4743]: I0122 14:03:27.455987 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/088a2be6-5ba4-4104-b734-30b0931fd1b8-var-log-ovn\") pod \"ovn-controller-m22h5-config-mzvrv\" (UID: \"088a2be6-5ba4-4104-b734-30b0931fd1b8\") " pod="openstack/ovn-controller-m22h5-config-mzvrv" Jan 22 14:03:27 crc kubenswrapper[4743]: I0122 14:03:27.536167 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-hvtzl"] Jan 22 14:03:27 crc kubenswrapper[4743]: I0122 14:03:27.558675 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zf9x8\" (UniqueName: \"kubernetes.io/projected/088a2be6-5ba4-4104-b734-30b0931fd1b8-kube-api-access-zf9x8\") pod \"ovn-controller-m22h5-config-mzvrv\" (UID: \"088a2be6-5ba4-4104-b734-30b0931fd1b8\") " pod="openstack/ovn-controller-m22h5-config-mzvrv" Jan 22 14:03:27 crc kubenswrapper[4743]: I0122 14:03:27.558857 4743 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/088a2be6-5ba4-4104-b734-30b0931fd1b8-var-run\") pod \"ovn-controller-m22h5-config-mzvrv\" (UID: \"088a2be6-5ba4-4104-b734-30b0931fd1b8\") " pod="openstack/ovn-controller-m22h5-config-mzvrv" Jan 22 14:03:27 crc kubenswrapper[4743]: I0122 14:03:27.558991 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/088a2be6-5ba4-4104-b734-30b0931fd1b8-var-run-ovn\") pod \"ovn-controller-m22h5-config-mzvrv\" (UID: \"088a2be6-5ba4-4104-b734-30b0931fd1b8\") " pod="openstack/ovn-controller-m22h5-config-mzvrv" Jan 22 14:03:27 crc kubenswrapper[4743]: I0122 14:03:27.559110 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/088a2be6-5ba4-4104-b734-30b0931fd1b8-additional-scripts\") pod \"ovn-controller-m22h5-config-mzvrv\" (UID: \"088a2be6-5ba4-4104-b734-30b0931fd1b8\") " pod="openstack/ovn-controller-m22h5-config-mzvrv" Jan 22 14:03:27 crc kubenswrapper[4743]: I0122 14:03:27.559275 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/088a2be6-5ba4-4104-b734-30b0931fd1b8-var-log-ovn\") pod \"ovn-controller-m22h5-config-mzvrv\" (UID: \"088a2be6-5ba4-4104-b734-30b0931fd1b8\") " pod="openstack/ovn-controller-m22h5-config-mzvrv" Jan 22 14:03:27 crc kubenswrapper[4743]: I0122 14:03:27.559445 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/088a2be6-5ba4-4104-b734-30b0931fd1b8-scripts\") pod \"ovn-controller-m22h5-config-mzvrv\" (UID: \"088a2be6-5ba4-4104-b734-30b0931fd1b8\") " pod="openstack/ovn-controller-m22h5-config-mzvrv" Jan 22 14:03:27 crc kubenswrapper[4743]: I0122 14:03:27.559501 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/088a2be6-5ba4-4104-b734-30b0931fd1b8-var-run\") pod \"ovn-controller-m22h5-config-mzvrv\" (UID: \"088a2be6-5ba4-4104-b734-30b0931fd1b8\") " pod="openstack/ovn-controller-m22h5-config-mzvrv" Jan 22 14:03:27 crc kubenswrapper[4743]: I0122 14:03:27.559735 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/088a2be6-5ba4-4104-b734-30b0931fd1b8-var-log-ovn\") pod \"ovn-controller-m22h5-config-mzvrv\" (UID: \"088a2be6-5ba4-4104-b734-30b0931fd1b8\") " pod="openstack/ovn-controller-m22h5-config-mzvrv" Jan 22 14:03:27 crc kubenswrapper[4743]: I0122 14:03:27.559748 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/088a2be6-5ba4-4104-b734-30b0931fd1b8-var-run-ovn\") pod \"ovn-controller-m22h5-config-mzvrv\" (UID: \"088a2be6-5ba4-4104-b734-30b0931fd1b8\") " pod="openstack/ovn-controller-m22h5-config-mzvrv" Jan 22 14:03:27 crc kubenswrapper[4743]: I0122 14:03:27.562454 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/088a2be6-5ba4-4104-b734-30b0931fd1b8-additional-scripts\") pod \"ovn-controller-m22h5-config-mzvrv\" (UID: \"088a2be6-5ba4-4104-b734-30b0931fd1b8\") " pod="openstack/ovn-controller-m22h5-config-mzvrv" Jan 22 14:03:27 crc kubenswrapper[4743]: I0122 14:03:27.563840 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/088a2be6-5ba4-4104-b734-30b0931fd1b8-scripts\") pod \"ovn-controller-m22h5-config-mzvrv\" (UID: \"088a2be6-5ba4-4104-b734-30b0931fd1b8\") " pod="openstack/ovn-controller-m22h5-config-mzvrv" Jan 22 14:03:27 crc kubenswrapper[4743]: I0122 14:03:27.581950 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zf9x8\" (UniqueName: \"kubernetes.io/projected/088a2be6-5ba4-4104-b734-30b0931fd1b8-kube-api-access-zf9x8\") pod \"ovn-controller-m22h5-config-mzvrv\" (UID: \"088a2be6-5ba4-4104-b734-30b0931fd1b8\") " pod="openstack/ovn-controller-m22h5-config-mzvrv" Jan 22 14:03:27 crc kubenswrapper[4743]: I0122 14:03:27.750522 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-m22h5-config-mzvrv" Jan 22 14:03:28 crc kubenswrapper[4743]: I0122 14:03:28.206233 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-m22h5-config-mzvrv"] Jan 22 14:03:28 crc kubenswrapper[4743]: W0122 14:03:28.211450 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod088a2be6_5ba4_4104_b734_30b0931fd1b8.slice/crio-ed1f593f1d314cc8be1d95332a3f5aa96d39c2bc761648c3ca54e313be188bda WatchSource:0}: Error finding container ed1f593f1d314cc8be1d95332a3f5aa96d39c2bc761648c3ca54e313be188bda: Status 404 returned error can't find the container with id ed1f593f1d314cc8be1d95332a3f5aa96d39c2bc761648c3ca54e313be188bda Jan 22 14:03:28 crc kubenswrapper[4743]: I0122 14:03:28.274730 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-hvtzl" event={"ID":"c13edc29-cd06-4113-8366-75a41988c89f","Type":"ContainerStarted","Data":"0dc8614117f45e1c332eba11a05b8ecc9733c878eb7e1673c518cd9ec551c169"} Jan 22 14:03:28 crc kubenswrapper[4743]: I0122 14:03:28.276543 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-m22h5-config-mzvrv" event={"ID":"088a2be6-5ba4-4104-b734-30b0931fd1b8","Type":"ContainerStarted","Data":"ed1f593f1d314cc8be1d95332a3f5aa96d39c2bc761648c3ca54e313be188bda"} Jan 22 14:03:28 crc kubenswrapper[4743]: I0122 14:03:28.554475 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Jan 22 14:03:29 crc kubenswrapper[4743]: I0122 14:03:29.287982 4743 generic.go:334] "Generic (PLEG): container finished" podID="56dff5fb-e22c-4045-b3c4-c75e018df046" containerID="3a8590dd76f4486817876fbdd0d018ab916f47d8d9fa6c9e0ea2631339a07c71" exitCode=0 Jan 22 14:03:29 crc kubenswrapper[4743]: I0122 14:03:29.288065 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-kg2fn" event={"ID":"56dff5fb-e22c-4045-b3c4-c75e018df046","Type":"ContainerDied","Data":"3a8590dd76f4486817876fbdd0d018ab916f47d8d9fa6c9e0ea2631339a07c71"} Jan 22 14:03:29 crc kubenswrapper[4743]: I0122 14:03:29.290906 4743 generic.go:334] "Generic (PLEG): container finished" podID="088a2be6-5ba4-4104-b734-30b0931fd1b8" containerID="acbdd7bbabf7d69152e3949120befe9f3d15d60dd174301ea887d2e38915619f" exitCode=0 Jan 22 14:03:29 crc kubenswrapper[4743]: I0122 14:03:29.290953 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-m22h5-config-mzvrv" event={"ID":"088a2be6-5ba4-4104-b734-30b0931fd1b8","Type":"ContainerDied","Data":"acbdd7bbabf7d69152e3949120befe9f3d15d60dd174301ea887d2e38915619f"} Jan 22 14:03:30 crc 
kubenswrapper[4743]: I0122 14:03:30.049510 4743 patch_prober.go:28] interesting pod/machine-config-daemon-hqgk7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 14:03:30 crc kubenswrapper[4743]: I0122 14:03:30.049938 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 14:03:32 crc kubenswrapper[4743]: I0122 14:03:32.046640 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-m22h5" Jan 22 14:03:37 crc kubenswrapper[4743]: I0122 14:03:37.052662 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/338e196f-7c64-4cbd-b058-768ccb4c5df9-etc-swift\") pod \"swift-storage-0\" (UID: \"338e196f-7c64-4cbd-b058-768ccb4c5df9\") " pod="openstack/swift-storage-0" Jan 22 14:03:37 crc kubenswrapper[4743]: I0122 14:03:37.060826 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/338e196f-7c64-4cbd-b058-768ccb4c5df9-etc-swift\") pod \"swift-storage-0\" (UID: \"338e196f-7c64-4cbd-b058-768ccb4c5df9\") " pod="openstack/swift-storage-0" Jan 22 14:03:37 crc kubenswrapper[4743]: I0122 14:03:37.341381 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Jan 22 14:03:37 crc kubenswrapper[4743]: I0122 14:03:37.971943 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-kg2fn" Jan 22 14:03:37 crc kubenswrapper[4743]: I0122 14:03:37.985561 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-m22h5-config-mzvrv" Jan 22 14:03:38 crc kubenswrapper[4743]: I0122 14:03:38.067651 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/56dff5fb-e22c-4045-b3c4-c75e018df046-etc-swift\") pod \"56dff5fb-e22c-4045-b3c4-c75e018df046\" (UID: \"56dff5fb-e22c-4045-b3c4-c75e018df046\") " Jan 22 14:03:38 crc kubenswrapper[4743]: I0122 14:03:38.067745 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7gf2\" (UniqueName: \"kubernetes.io/projected/56dff5fb-e22c-4045-b3c4-c75e018df046-kube-api-access-q7gf2\") pod \"56dff5fb-e22c-4045-b3c4-c75e018df046\" (UID: \"56dff5fb-e22c-4045-b3c4-c75e018df046\") " Jan 22 14:03:38 crc kubenswrapper[4743]: I0122 14:03:38.067809 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/56dff5fb-e22c-4045-b3c4-c75e018df046-ring-data-devices\") pod \"56dff5fb-e22c-4045-b3c4-c75e018df046\" (UID: \"56dff5fb-e22c-4045-b3c4-c75e018df046\") " Jan 22 14:03:38 crc kubenswrapper[4743]: I0122 14:03:38.067908 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/56dff5fb-e22c-4045-b3c4-c75e018df046-dispersionconf\") pod \"56dff5fb-e22c-4045-b3c4-c75e018df046\" (UID: \"56dff5fb-e22c-4045-b3c4-c75e018df046\") " Jan 22 14:03:38 crc kubenswrapper[4743]: I0122 14:03:38.067976 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/088a2be6-5ba4-4104-b734-30b0931fd1b8-var-log-ovn\") pod \"088a2be6-5ba4-4104-b734-30b0931fd1b8\" (UID: \"088a2be6-5ba4-4104-b734-30b0931fd1b8\") " Jan 22 14:03:38 crc kubenswrapper[4743]: I0122 14:03:38.068000 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/088a2be6-5ba4-4104-b734-30b0931fd1b8-var-run-ovn\") pod \"088a2be6-5ba4-4104-b734-30b0931fd1b8\" (UID: \"088a2be6-5ba4-4104-b734-30b0931fd1b8\") " Jan 22 14:03:38 crc kubenswrapper[4743]: I0122 14:03:38.068056 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/088a2be6-5ba4-4104-b734-30b0931fd1b8-scripts\") pod \"088a2be6-5ba4-4104-b734-30b0931fd1b8\" (UID: \"088a2be6-5ba4-4104-b734-30b0931fd1b8\") " Jan 22 14:03:38 crc kubenswrapper[4743]: I0122 14:03:38.068086 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/088a2be6-5ba4-4104-b734-30b0931fd1b8-var-run\") pod \"088a2be6-5ba4-4104-b734-30b0931fd1b8\" (UID: \"088a2be6-5ba4-4104-b734-30b0931fd1b8\") " Jan 22 14:03:38 crc kubenswrapper[4743]: I0122 14:03:38.068150 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/56dff5fb-e22c-4045-b3c4-c75e018df046-swiftconf\") pod \"56dff5fb-e22c-4045-b3c4-c75e018df046\" (UID: \"56dff5fb-e22c-4045-b3c4-c75e018df046\") " Jan 22 14:03:38 crc kubenswrapper[4743]: I0122 14:03:38.068179 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zf9x8\" (UniqueName: \"kubernetes.io/projected/088a2be6-5ba4-4104-b734-30b0931fd1b8-kube-api-access-zf9x8\") pod \"088a2be6-5ba4-4104-b734-30b0931fd1b8\" (UID: 
\"088a2be6-5ba4-4104-b734-30b0931fd1b8\") " Jan 22 14:03:38 crc kubenswrapper[4743]: I0122 14:03:38.068237 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/088a2be6-5ba4-4104-b734-30b0931fd1b8-additional-scripts\") pod \"088a2be6-5ba4-4104-b734-30b0931fd1b8\" (UID: \"088a2be6-5ba4-4104-b734-30b0931fd1b8\") " Jan 22 14:03:38 crc kubenswrapper[4743]: I0122 14:03:38.068269 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/56dff5fb-e22c-4045-b3c4-c75e018df046-scripts\") pod \"56dff5fb-e22c-4045-b3c4-c75e018df046\" (UID: \"56dff5fb-e22c-4045-b3c4-c75e018df046\") " Jan 22 14:03:38 crc kubenswrapper[4743]: I0122 14:03:38.068309 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56dff5fb-e22c-4045-b3c4-c75e018df046-combined-ca-bundle\") pod \"56dff5fb-e22c-4045-b3c4-c75e018df046\" (UID: \"56dff5fb-e22c-4045-b3c4-c75e018df046\") " Jan 22 14:03:38 crc kubenswrapper[4743]: I0122 14:03:38.073324 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/088a2be6-5ba4-4104-b734-30b0931fd1b8-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "088a2be6-5ba4-4104-b734-30b0931fd1b8" (UID: "088a2be6-5ba4-4104-b734-30b0931fd1b8"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 14:03:38 crc kubenswrapper[4743]: I0122 14:03:38.073553 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/088a2be6-5ba4-4104-b734-30b0931fd1b8-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "088a2be6-5ba4-4104-b734-30b0931fd1b8" (UID: "088a2be6-5ba4-4104-b734-30b0931fd1b8"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 14:03:38 crc kubenswrapper[4743]: I0122 14:03:38.074299 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/088a2be6-5ba4-4104-b734-30b0931fd1b8-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "088a2be6-5ba4-4104-b734-30b0931fd1b8" (UID: "088a2be6-5ba4-4104-b734-30b0931fd1b8"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:03:38 crc kubenswrapper[4743]: I0122 14:03:38.074411 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56dff5fb-e22c-4045-b3c4-c75e018df046-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "56dff5fb-e22c-4045-b3c4-c75e018df046" (UID: "56dff5fb-e22c-4045-b3c4-c75e018df046"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:03:38 crc kubenswrapper[4743]: I0122 14:03:38.074877 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/088a2be6-5ba4-4104-b734-30b0931fd1b8-var-run" (OuterVolumeSpecName: "var-run") pod "088a2be6-5ba4-4104-b734-30b0931fd1b8" (UID: "088a2be6-5ba4-4104-b734-30b0931fd1b8"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 14:03:38 crc kubenswrapper[4743]: I0122 14:03:38.075143 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56dff5fb-e22c-4045-b3c4-c75e018df046-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "56dff5fb-e22c-4045-b3c4-c75e018df046" (UID: "56dff5fb-e22c-4045-b3c4-c75e018df046"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:03:38 crc kubenswrapper[4743]: I0122 14:03:38.075252 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/088a2be6-5ba4-4104-b734-30b0931fd1b8-scripts" (OuterVolumeSpecName: "scripts") pod "088a2be6-5ba4-4104-b734-30b0931fd1b8" (UID: "088a2be6-5ba4-4104-b734-30b0931fd1b8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:03:38 crc kubenswrapper[4743]: I0122 14:03:38.077107 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/088a2be6-5ba4-4104-b734-30b0931fd1b8-kube-api-access-zf9x8" (OuterVolumeSpecName: "kube-api-access-zf9x8") pod "088a2be6-5ba4-4104-b734-30b0931fd1b8" (UID: "088a2be6-5ba4-4104-b734-30b0931fd1b8"). InnerVolumeSpecName "kube-api-access-zf9x8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:03:38 crc kubenswrapper[4743]: I0122 14:03:38.079702 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56dff5fb-e22c-4045-b3c4-c75e018df046-kube-api-access-q7gf2" (OuterVolumeSpecName: "kube-api-access-q7gf2") pod "56dff5fb-e22c-4045-b3c4-c75e018df046" (UID: "56dff5fb-e22c-4045-b3c4-c75e018df046"). InnerVolumeSpecName "kube-api-access-q7gf2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:03:38 crc kubenswrapper[4743]: I0122 14:03:38.094882 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56dff5fb-e22c-4045-b3c4-c75e018df046-scripts" (OuterVolumeSpecName: "scripts") pod "56dff5fb-e22c-4045-b3c4-c75e018df046" (UID: "56dff5fb-e22c-4045-b3c4-c75e018df046"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:03:38 crc kubenswrapper[4743]: I0122 14:03:38.096018 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56dff5fb-e22c-4045-b3c4-c75e018df046-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "56dff5fb-e22c-4045-b3c4-c75e018df046" (UID: "56dff5fb-e22c-4045-b3c4-c75e018df046"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:03:38 crc kubenswrapper[4743]: I0122 14:03:38.096940 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56dff5fb-e22c-4045-b3c4-c75e018df046-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "56dff5fb-e22c-4045-b3c4-c75e018df046" (UID: "56dff5fb-e22c-4045-b3c4-c75e018df046"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:03:38 crc kubenswrapper[4743]: I0122 14:03:38.097469 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56dff5fb-e22c-4045-b3c4-c75e018df046-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "56dff5fb-e22c-4045-b3c4-c75e018df046" (UID: "56dff5fb-e22c-4045-b3c4-c75e018df046"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:03:38 crc kubenswrapper[4743]: I0122 14:03:38.170344 4743 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/088a2be6-5ba4-4104-b734-30b0931fd1b8-var-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:38 crc kubenswrapper[4743]: I0122 14:03:38.170374 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/088a2be6-5ba4-4104-b734-30b0931fd1b8-scripts\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:38 crc kubenswrapper[4743]: I0122 14:03:38.170385 4743 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/088a2be6-5ba4-4104-b734-30b0931fd1b8-var-run\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:38 crc kubenswrapper[4743]: I0122 14:03:38.170393 4743 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/56dff5fb-e22c-4045-b3c4-c75e018df046-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:38 crc kubenswrapper[4743]: I0122 14:03:38.170403 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zf9x8\" (UniqueName: \"kubernetes.io/projected/088a2be6-5ba4-4104-b734-30b0931fd1b8-kube-api-access-zf9x8\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:38 crc kubenswrapper[4743]: I0122 14:03:38.170412 4743 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/088a2be6-5ba4-4104-b734-30b0931fd1b8-additional-scripts\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:38 crc kubenswrapper[4743]: I0122 14:03:38.170423 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/56dff5fb-e22c-4045-b3c4-c75e018df046-scripts\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:38 crc kubenswrapper[4743]: I0122 14:03:38.170431 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56dff5fb-e22c-4045-b3c4-c75e018df046-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:38 crc kubenswrapper[4743]: I0122 14:03:38.170440 4743 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/56dff5fb-e22c-4045-b3c4-c75e018df046-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:38 crc kubenswrapper[4743]: I0122 14:03:38.170448 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7gf2\" (UniqueName: \"kubernetes.io/projected/56dff5fb-e22c-4045-b3c4-c75e018df046-kube-api-access-q7gf2\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:38 crc kubenswrapper[4743]: I0122 14:03:38.170456 4743 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/56dff5fb-e22c-4045-b3c4-c75e018df046-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:38 crc kubenswrapper[4743]: I0122 14:03:38.170464 4743 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/56dff5fb-e22c-4045-b3c4-c75e018df046-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:38 crc kubenswrapper[4743]: I0122 14:03:38.170471 4743 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/088a2be6-5ba4-4104-b734-30b0931fd1b8-var-log-ovn\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:38 crc kubenswrapper[4743]: I0122 
14:03:38.266003 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Jan 22 14:03:38 crc kubenswrapper[4743]: W0122 14:03:38.281988 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod338e196f_7c64_4cbd_b058_768ccb4c5df9.slice/crio-cfdf6390a8bb0761f62ed032c053bfffd9aae16961e82d9d3d9cb94a760f56c7 WatchSource:0}: Error finding container cfdf6390a8bb0761f62ed032c053bfffd9aae16961e82d9d3d9cb94a760f56c7: Status 404 returned error can't find the container with id cfdf6390a8bb0761f62ed032c053bfffd9aae16961e82d9d3d9cb94a760f56c7 Jan 22 14:03:38 crc kubenswrapper[4743]: I0122 14:03:38.368291 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"338e196f-7c64-4cbd-b058-768ccb4c5df9","Type":"ContainerStarted","Data":"cfdf6390a8bb0761f62ed032c053bfffd9aae16961e82d9d3d9cb94a760f56c7"} Jan 22 14:03:38 crc kubenswrapper[4743]: I0122 14:03:38.370374 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-kg2fn" event={"ID":"56dff5fb-e22c-4045-b3c4-c75e018df046","Type":"ContainerDied","Data":"355253da609a1e81d50b12369dfdb48c9a6a5f1bb77eb3c966e33192cb80b4b4"} Jan 22 14:03:38 crc kubenswrapper[4743]: I0122 14:03:38.370421 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="355253da609a1e81d50b12369dfdb48c9a6a5f1bb77eb3c966e33192cb80b4b4" Jan 22 14:03:38 crc kubenswrapper[4743]: I0122 14:03:38.370484 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-kg2fn" Jan 22 14:03:38 crc kubenswrapper[4743]: I0122 14:03:38.378694 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-hvtzl" event={"ID":"c13edc29-cd06-4113-8366-75a41988c89f","Type":"ContainerStarted","Data":"b43faa67d40c573610da27f85b959e7e2eeb51117c4b36258fbbc29858855c62"} Jan 22 14:03:38 crc kubenswrapper[4743]: I0122 14:03:38.382453 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-m22h5-config-mzvrv" event={"ID":"088a2be6-5ba4-4104-b734-30b0931fd1b8","Type":"ContainerDied","Data":"ed1f593f1d314cc8be1d95332a3f5aa96d39c2bc761648c3ca54e313be188bda"} Jan 22 14:03:38 crc kubenswrapper[4743]: I0122 14:03:38.382498 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-m22h5-config-mzvrv" Jan 22 14:03:38 crc kubenswrapper[4743]: I0122 14:03:38.382501 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed1f593f1d314cc8be1d95332a3f5aa96d39c2bc761648c3ca54e313be188bda" Jan 22 14:03:38 crc kubenswrapper[4743]: I0122 14:03:38.404571 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-hvtzl" podStartSLOduration=2.184598704 podStartE2EDuration="12.404553661s" podCreationTimestamp="2026-01-22 14:03:26 +0000 UTC" firstStartedPulling="2026-01-22 14:03:27.554770376 +0000 UTC m=+1044.109813539" lastFinishedPulling="2026-01-22 14:03:37.774725323 +0000 UTC m=+1054.329768496" observedRunningTime="2026-01-22 14:03:38.396178224 +0000 UTC m=+1054.951221397" watchObservedRunningTime="2026-01-22 14:03:38.404553661 +0000 UTC m=+1054.959596824" Jan 22 14:03:39 crc kubenswrapper[4743]: I0122 14:03:39.108852 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-m22h5-config-mzvrv"] Jan 22 14:03:39 crc kubenswrapper[4743]: I0122 14:03:39.117700 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-m22h5-config-mzvrv"] Jan 22 14:03:39 crc kubenswrapper[4743]: I0122 14:03:39.206019 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-m22h5-config-wcp8d"] Jan 22 14:03:39 crc kubenswrapper[4743]: E0122 14:03:39.206484 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56dff5fb-e22c-4045-b3c4-c75e018df046" containerName="swift-ring-rebalance" Jan 22 14:03:39 crc kubenswrapper[4743]: I0122 14:03:39.206512 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="56dff5fb-e22c-4045-b3c4-c75e018df046" containerName="swift-ring-rebalance" Jan 22 14:03:39 crc kubenswrapper[4743]: E0122 14:03:39.206529 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="088a2be6-5ba4-4104-b734-30b0931fd1b8" containerName="ovn-config" Jan 22 14:03:39 crc kubenswrapper[4743]: I0122 14:03:39.206537 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="088a2be6-5ba4-4104-b734-30b0931fd1b8" containerName="ovn-config" Jan 22 14:03:39 crc kubenswrapper[4743]: I0122 14:03:39.206744 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="56dff5fb-e22c-4045-b3c4-c75e018df046" containerName="swift-ring-rebalance" Jan 22 14:03:39 crc kubenswrapper[4743]: I0122 14:03:39.206806 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="088a2be6-5ba4-4104-b734-30b0931fd1b8" containerName="ovn-config" Jan 22 14:03:39 crc kubenswrapper[4743]: I0122 14:03:39.207502 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-m22h5-config-wcp8d" Jan 22 14:03:39 crc kubenswrapper[4743]: I0122 14:03:39.212058 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Jan 22 14:03:39 crc kubenswrapper[4743]: I0122 14:03:39.223491 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-m22h5-config-wcp8d"] Jan 22 14:03:39 crc kubenswrapper[4743]: I0122 14:03:39.304999 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/337aee61-0e0b-4934-ba37-1d48a94633da-scripts\") pod \"ovn-controller-m22h5-config-wcp8d\" (UID: \"337aee61-0e0b-4934-ba37-1d48a94633da\") " pod="openstack/ovn-controller-m22h5-config-wcp8d" Jan 22 14:03:39 crc kubenswrapper[4743]: I0122 14:03:39.305092 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/337aee61-0e0b-4934-ba37-1d48a94633da-var-run\") pod \"ovn-controller-m22h5-config-wcp8d\" (UID: \"337aee61-0e0b-4934-ba37-1d48a94633da\") " pod="openstack/ovn-controller-m22h5-config-wcp8d" Jan 22 14:03:39 crc kubenswrapper[4743]: I0122 14:03:39.305122 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kltnt\" (UniqueName: \"kubernetes.io/projected/337aee61-0e0b-4934-ba37-1d48a94633da-kube-api-access-kltnt\") pod \"ovn-controller-m22h5-config-wcp8d\" (UID: \"337aee61-0e0b-4934-ba37-1d48a94633da\") " pod="openstack/ovn-controller-m22h5-config-wcp8d" Jan 22 14:03:39 crc kubenswrapper[4743]: I0122 14:03:39.305142 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/337aee61-0e0b-4934-ba37-1d48a94633da-var-log-ovn\") pod \"ovn-controller-m22h5-config-wcp8d\" (UID: \"337aee61-0e0b-4934-ba37-1d48a94633da\") " pod="openstack/ovn-controller-m22h5-config-wcp8d" Jan 22 14:03:39 crc kubenswrapper[4743]: I0122 14:03:39.305189 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/337aee61-0e0b-4934-ba37-1d48a94633da-additional-scripts\") pod \"ovn-controller-m22h5-config-wcp8d\" (UID: \"337aee61-0e0b-4934-ba37-1d48a94633da\") " pod="openstack/ovn-controller-m22h5-config-wcp8d" Jan 22 14:03:39 crc kubenswrapper[4743]: I0122 14:03:39.305221 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/337aee61-0e0b-4934-ba37-1d48a94633da-var-run-ovn\") pod \"ovn-controller-m22h5-config-wcp8d\" (UID: \"337aee61-0e0b-4934-ba37-1d48a94633da\") " pod="openstack/ovn-controller-m22h5-config-wcp8d" Jan 22 14:03:39 crc kubenswrapper[4743]: I0122 14:03:39.398763 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-nm9l6" event={"ID":"d9518ef3-f251-4bf9-b45d-0f93876b2e7c","Type":"ContainerStarted","Data":"2c1755ff78d5f28f816a75363ac12b8522205710fd499d6b66a4f7c2a53a0f2c"} Jan 22 14:03:39 crc kubenswrapper[4743]: I0122 14:03:39.406909 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/337aee61-0e0b-4934-ba37-1d48a94633da-var-run\") pod \"ovn-controller-m22h5-config-wcp8d\" (UID: \"337aee61-0e0b-4934-ba37-1d48a94633da\") " 
pod="openstack/ovn-controller-m22h5-config-wcp8d" Jan 22 14:03:39 crc kubenswrapper[4743]: I0122 14:03:39.407004 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kltnt\" (UniqueName: \"kubernetes.io/projected/337aee61-0e0b-4934-ba37-1d48a94633da-kube-api-access-kltnt\") pod \"ovn-controller-m22h5-config-wcp8d\" (UID: \"337aee61-0e0b-4934-ba37-1d48a94633da\") " pod="openstack/ovn-controller-m22h5-config-wcp8d" Jan 22 14:03:39 crc kubenswrapper[4743]: I0122 14:03:39.407054 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/337aee61-0e0b-4934-ba37-1d48a94633da-var-log-ovn\") pod \"ovn-controller-m22h5-config-wcp8d\" (UID: \"337aee61-0e0b-4934-ba37-1d48a94633da\") " pod="openstack/ovn-controller-m22h5-config-wcp8d" Jan 22 14:03:39 crc kubenswrapper[4743]: I0122 14:03:39.407127 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/337aee61-0e0b-4934-ba37-1d48a94633da-additional-scripts\") pod \"ovn-controller-m22h5-config-wcp8d\" (UID: \"337aee61-0e0b-4934-ba37-1d48a94633da\") " pod="openstack/ovn-controller-m22h5-config-wcp8d" Jan 22 14:03:39 crc kubenswrapper[4743]: I0122 14:03:39.407161 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/337aee61-0e0b-4934-ba37-1d48a94633da-var-run-ovn\") pod \"ovn-controller-m22h5-config-wcp8d\" (UID: \"337aee61-0e0b-4934-ba37-1d48a94633da\") " pod="openstack/ovn-controller-m22h5-config-wcp8d" Jan 22 14:03:39 crc kubenswrapper[4743]: I0122 14:03:39.407217 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/337aee61-0e0b-4934-ba37-1d48a94633da-scripts\") pod \"ovn-controller-m22h5-config-wcp8d\" (UID: \"337aee61-0e0b-4934-ba37-1d48a94633da\") " pod="openstack/ovn-controller-m22h5-config-wcp8d" Jan 22 14:03:39 crc kubenswrapper[4743]: I0122 14:03:39.407220 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/337aee61-0e0b-4934-ba37-1d48a94633da-var-log-ovn\") pod \"ovn-controller-m22h5-config-wcp8d\" (UID: \"337aee61-0e0b-4934-ba37-1d48a94633da\") " pod="openstack/ovn-controller-m22h5-config-wcp8d" Jan 22 14:03:39 crc kubenswrapper[4743]: I0122 14:03:39.407305 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/337aee61-0e0b-4934-ba37-1d48a94633da-var-run\") pod \"ovn-controller-m22h5-config-wcp8d\" (UID: \"337aee61-0e0b-4934-ba37-1d48a94633da\") " pod="openstack/ovn-controller-m22h5-config-wcp8d" Jan 22 14:03:39 crc kubenswrapper[4743]: I0122 14:03:39.407388 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/337aee61-0e0b-4934-ba37-1d48a94633da-var-run-ovn\") pod \"ovn-controller-m22h5-config-wcp8d\" (UID: \"337aee61-0e0b-4934-ba37-1d48a94633da\") " pod="openstack/ovn-controller-m22h5-config-wcp8d" Jan 22 14:03:39 crc kubenswrapper[4743]: I0122 14:03:39.408346 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/337aee61-0e0b-4934-ba37-1d48a94633da-additional-scripts\") pod \"ovn-controller-m22h5-config-wcp8d\" (UID: \"337aee61-0e0b-4934-ba37-1d48a94633da\") " 
pod="openstack/ovn-controller-m22h5-config-wcp8d" Jan 22 14:03:39 crc kubenswrapper[4743]: I0122 14:03:39.409395 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/337aee61-0e0b-4934-ba37-1d48a94633da-scripts\") pod \"ovn-controller-m22h5-config-wcp8d\" (UID: \"337aee61-0e0b-4934-ba37-1d48a94633da\") " pod="openstack/ovn-controller-m22h5-config-wcp8d" Jan 22 14:03:39 crc kubenswrapper[4743]: I0122 14:03:39.426748 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-nm9l6" podStartSLOduration=3.628439198 podStartE2EDuration="18.426727481s" podCreationTimestamp="2026-01-22 14:03:21 +0000 UTC" firstStartedPulling="2026-01-22 14:03:23.019232967 +0000 UTC m=+1039.574276130" lastFinishedPulling="2026-01-22 14:03:37.81752125 +0000 UTC m=+1054.372564413" observedRunningTime="2026-01-22 14:03:39.41965475 +0000 UTC m=+1055.974697923" watchObservedRunningTime="2026-01-22 14:03:39.426727481 +0000 UTC m=+1055.981770644" Jan 22 14:03:39 crc kubenswrapper[4743]: I0122 14:03:39.444718 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kltnt\" (UniqueName: \"kubernetes.io/projected/337aee61-0e0b-4934-ba37-1d48a94633da-kube-api-access-kltnt\") pod \"ovn-controller-m22h5-config-wcp8d\" (UID: \"337aee61-0e0b-4934-ba37-1d48a94633da\") " pod="openstack/ovn-controller-m22h5-config-wcp8d" Jan 22 14:03:39 crc kubenswrapper[4743]: I0122 14:03:39.530608 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-m22h5-config-wcp8d" Jan 22 14:03:39 crc kubenswrapper[4743]: I0122 14:03:39.762895 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="088a2be6-5ba4-4104-b734-30b0931fd1b8" path="/var/lib/kubelet/pods/088a2be6-5ba4-4104-b734-30b0931fd1b8/volumes" Jan 22 14:03:40 crc kubenswrapper[4743]: I0122 14:03:40.176807 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-m22h5-config-wcp8d"] Jan 22 14:03:40 crc kubenswrapper[4743]: W0122 14:03:40.180612 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod337aee61_0e0b_4934_ba37_1d48a94633da.slice/crio-76e93847b50de25aa4b2fafbde8366ad25c579e8c400b79918e25232675c22fc WatchSource:0}: Error finding container 76e93847b50de25aa4b2fafbde8366ad25c579e8c400b79918e25232675c22fc: Status 404 returned error can't find the container with id 76e93847b50de25aa4b2fafbde8366ad25c579e8c400b79918e25232675c22fc Jan 22 14:03:40 crc kubenswrapper[4743]: I0122 14:03:40.416712 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-m22h5-config-wcp8d" event={"ID":"337aee61-0e0b-4934-ba37-1d48a94633da","Type":"ContainerStarted","Data":"76e93847b50de25aa4b2fafbde8366ad25c579e8c400b79918e25232675c22fc"} Jan 22 14:03:40 crc kubenswrapper[4743]: I0122 14:03:40.418717 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"338e196f-7c64-4cbd-b058-768ccb4c5df9","Type":"ContainerStarted","Data":"a6ae09ce0db325b9a7a792eb4171da31c15f6a771c06566776aaa1a2c7bcba10"} Jan 22 14:03:40 crc kubenswrapper[4743]: I0122 14:03:40.418828 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"338e196f-7c64-4cbd-b058-768ccb4c5df9","Type":"ContainerStarted","Data":"c40b5a572e07f2a1c7d53b2735b3f64d5cdf2ed6c657585059e0c097e32941af"} Jan 22 14:03:41 crc 
kubenswrapper[4743]: I0122 14:03:41.428236 4743 generic.go:334] "Generic (PLEG): container finished" podID="337aee61-0e0b-4934-ba37-1d48a94633da" containerID="9ab45a420601c4e026d9738a82fe91d9d9e16553b71d6ce00c3644df613bbad9" exitCode=0 Jan 22 14:03:41 crc kubenswrapper[4743]: I0122 14:03:41.428302 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-m22h5-config-wcp8d" event={"ID":"337aee61-0e0b-4934-ba37-1d48a94633da","Type":"ContainerDied","Data":"9ab45a420601c4e026d9738a82fe91d9d9e16553b71d6ce00c3644df613bbad9"} Jan 22 14:03:41 crc kubenswrapper[4743]: I0122 14:03:41.432138 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"338e196f-7c64-4cbd-b058-768ccb4c5df9","Type":"ContainerStarted","Data":"a42a1372c6dd195731803374f2bb2afb7c99d9b67dd2770d2cac1ce454c122de"} Jan 22 14:03:41 crc kubenswrapper[4743]: I0122 14:03:41.432203 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"338e196f-7c64-4cbd-b058-768ccb4c5df9","Type":"ContainerStarted","Data":"e68bc9f88d65a150543f162cd6bcf212408831bfaec7b6187d7cb5201f1e782d"} Jan 22 14:03:43 crc kubenswrapper[4743]: I0122 14:03:42.928001 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-m22h5-config-wcp8d" Jan 22 14:03:43 crc kubenswrapper[4743]: I0122 14:03:42.981988 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/337aee61-0e0b-4934-ba37-1d48a94633da-additional-scripts\") pod \"337aee61-0e0b-4934-ba37-1d48a94633da\" (UID: \"337aee61-0e0b-4934-ba37-1d48a94633da\") " Jan 22 14:03:43 crc kubenswrapper[4743]: I0122 14:03:42.982049 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/337aee61-0e0b-4934-ba37-1d48a94633da-var-log-ovn\") pod \"337aee61-0e0b-4934-ba37-1d48a94633da\" (UID: \"337aee61-0e0b-4934-ba37-1d48a94633da\") " Jan 22 14:03:43 crc kubenswrapper[4743]: I0122 14:03:42.982086 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/337aee61-0e0b-4934-ba37-1d48a94633da-scripts\") pod \"337aee61-0e0b-4934-ba37-1d48a94633da\" (UID: \"337aee61-0e0b-4934-ba37-1d48a94633da\") " Jan 22 14:03:43 crc kubenswrapper[4743]: I0122 14:03:42.982140 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kltnt\" (UniqueName: \"kubernetes.io/projected/337aee61-0e0b-4934-ba37-1d48a94633da-kube-api-access-kltnt\") pod \"337aee61-0e0b-4934-ba37-1d48a94633da\" (UID: \"337aee61-0e0b-4934-ba37-1d48a94633da\") " Jan 22 14:03:43 crc kubenswrapper[4743]: I0122 14:03:42.982147 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/337aee61-0e0b-4934-ba37-1d48a94633da-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "337aee61-0e0b-4934-ba37-1d48a94633da" (UID: "337aee61-0e0b-4934-ba37-1d48a94633da"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 14:03:43 crc kubenswrapper[4743]: I0122 14:03:42.982184 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/337aee61-0e0b-4934-ba37-1d48a94633da-var-run\") pod \"337aee61-0e0b-4934-ba37-1d48a94633da\" (UID: \"337aee61-0e0b-4934-ba37-1d48a94633da\") " Jan 22 14:03:43 crc kubenswrapper[4743]: I0122 14:03:42.982252 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/337aee61-0e0b-4934-ba37-1d48a94633da-var-run-ovn\") pod \"337aee61-0e0b-4934-ba37-1d48a94633da\" (UID: \"337aee61-0e0b-4934-ba37-1d48a94633da\") " Jan 22 14:03:43 crc kubenswrapper[4743]: I0122 14:03:42.982279 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/337aee61-0e0b-4934-ba37-1d48a94633da-var-run" (OuterVolumeSpecName: "var-run") pod "337aee61-0e0b-4934-ba37-1d48a94633da" (UID: "337aee61-0e0b-4934-ba37-1d48a94633da"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 14:03:43 crc kubenswrapper[4743]: I0122 14:03:42.982545 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/337aee61-0e0b-4934-ba37-1d48a94633da-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "337aee61-0e0b-4934-ba37-1d48a94633da" (UID: "337aee61-0e0b-4934-ba37-1d48a94633da"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 14:03:43 crc kubenswrapper[4743]: I0122 14:03:42.982908 4743 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/337aee61-0e0b-4934-ba37-1d48a94633da-var-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:43 crc kubenswrapper[4743]: I0122 14:03:42.982925 4743 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/337aee61-0e0b-4934-ba37-1d48a94633da-var-log-ovn\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:43 crc kubenswrapper[4743]: I0122 14:03:42.982935 4743 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/337aee61-0e0b-4934-ba37-1d48a94633da-var-run\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:43 crc kubenswrapper[4743]: I0122 14:03:42.983270 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/337aee61-0e0b-4934-ba37-1d48a94633da-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "337aee61-0e0b-4934-ba37-1d48a94633da" (UID: "337aee61-0e0b-4934-ba37-1d48a94633da"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:03:43 crc kubenswrapper[4743]: I0122 14:03:42.983431 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/337aee61-0e0b-4934-ba37-1d48a94633da-scripts" (OuterVolumeSpecName: "scripts") pod "337aee61-0e0b-4934-ba37-1d48a94633da" (UID: "337aee61-0e0b-4934-ba37-1d48a94633da"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:03:43 crc kubenswrapper[4743]: I0122 14:03:42.995155 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/337aee61-0e0b-4934-ba37-1d48a94633da-kube-api-access-kltnt" (OuterVolumeSpecName: "kube-api-access-kltnt") pod "337aee61-0e0b-4934-ba37-1d48a94633da" (UID: "337aee61-0e0b-4934-ba37-1d48a94633da"). InnerVolumeSpecName "kube-api-access-kltnt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:03:43 crc kubenswrapper[4743]: I0122 14:03:43.084683 4743 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/337aee61-0e0b-4934-ba37-1d48a94633da-additional-scripts\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:43 crc kubenswrapper[4743]: I0122 14:03:43.084710 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/337aee61-0e0b-4934-ba37-1d48a94633da-scripts\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:43 crc kubenswrapper[4743]: I0122 14:03:43.084723 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kltnt\" (UniqueName: \"kubernetes.io/projected/337aee61-0e0b-4934-ba37-1d48a94633da-kube-api-access-kltnt\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:43 crc kubenswrapper[4743]: I0122 14:03:43.450352 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-m22h5-config-wcp8d" event={"ID":"337aee61-0e0b-4934-ba37-1d48a94633da","Type":"ContainerDied","Data":"76e93847b50de25aa4b2fafbde8366ad25c579e8c400b79918e25232675c22fc"} Jan 22 14:03:43 crc kubenswrapper[4743]: I0122 14:03:43.450388 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76e93847b50de25aa4b2fafbde8366ad25c579e8c400b79918e25232675c22fc" Jan 22 14:03:43 crc kubenswrapper[4743]: I0122 14:03:43.450443 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-m22h5-config-wcp8d" Jan 22 14:03:44 crc kubenswrapper[4743]: I0122 14:03:44.014250 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-m22h5-config-wcp8d"] Jan 22 14:03:44 crc kubenswrapper[4743]: I0122 14:03:44.022601 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-m22h5-config-wcp8d"] Jan 22 14:03:44 crc kubenswrapper[4743]: I0122 14:03:44.462590 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"338e196f-7c64-4cbd-b058-768ccb4c5df9","Type":"ContainerStarted","Data":"b3a27ac6762c702425930dc95047ae90984e3bbe66481e30ce9244b14d8664ab"} Jan 22 14:03:44 crc kubenswrapper[4743]: I0122 14:03:44.462640 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"338e196f-7c64-4cbd-b058-768ccb4c5df9","Type":"ContainerStarted","Data":"05e89092033aafe1cf9205680837ad1ca358f584219be67513b1df2e621a967f"} Jan 22 14:03:45 crc kubenswrapper[4743]: I0122 14:03:45.476222 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"338e196f-7c64-4cbd-b058-768ccb4c5df9","Type":"ContainerStarted","Data":"8549dbecce0d62280cfa9c2bdc03818ed4947ee2e80eb3117e532fbc7b11d16d"} Jan 22 14:03:45 crc kubenswrapper[4743]: I0122 14:03:45.764658 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="337aee61-0e0b-4934-ba37-1d48a94633da" path="/var/lib/kubelet/pods/337aee61-0e0b-4934-ba37-1d48a94633da/volumes" Jan 22 14:03:46 crc kubenswrapper[4743]: I0122 14:03:46.486126 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"338e196f-7c64-4cbd-b058-768ccb4c5df9","Type":"ContainerStarted","Data":"dea792abb1343a7d23689ee8648ad71ac28da7754dbf46389f2508e8e81ad302"} Jan 22 14:03:48 crc kubenswrapper[4743]: I0122 14:03:48.514150 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"338e196f-7c64-4cbd-b058-768ccb4c5df9","Type":"ContainerStarted","Data":"dd84db33d3723415daaff3c5c1615e94c27b2ab204268a76afb030d1eff5312d"} Jan 22 14:03:48 crc kubenswrapper[4743]: I0122 14:03:48.518130 4743 generic.go:334] "Generic (PLEG): container finished" podID="c13edc29-cd06-4113-8366-75a41988c89f" containerID="b43faa67d40c573610da27f85b959e7e2eeb51117c4b36258fbbc29858855c62" exitCode=0 Jan 22 14:03:48 crc kubenswrapper[4743]: I0122 14:03:48.518175 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-hvtzl" event={"ID":"c13edc29-cd06-4113-8366-75a41988c89f","Type":"ContainerDied","Data":"b43faa67d40c573610da27f85b959e7e2eeb51117c4b36258fbbc29858855c62"} Jan 22 14:03:49 crc kubenswrapper[4743]: I0122 14:03:49.532336 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"338e196f-7c64-4cbd-b058-768ccb4c5df9","Type":"ContainerStarted","Data":"672c985f88cec348955a9341ff2ce44fb5e1ee5f03c8df97cca595dc0cd34785"} Jan 22 14:03:49 crc kubenswrapper[4743]: I0122 14:03:49.532759 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"338e196f-7c64-4cbd-b058-768ccb4c5df9","Type":"ContainerStarted","Data":"100eebe594f366ced5c87996c8cebaf94aab9494002f9855613f0838f376d416"} Jan 22 14:03:49 crc kubenswrapper[4743]: I0122 14:03:49.532771 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"338e196f-7c64-4cbd-b058-768ccb4c5df9","Type":"ContainerStarted","Data":"fd6458d1fd4c2aae7d622c5a886b57979e04db6d7b33762d6d4fa74eedbb9d8f"} Jan 22 14:03:49 crc kubenswrapper[4743]: I0122 14:03:49.822949 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-hvtzl" Jan 22 14:03:49 crc kubenswrapper[4743]: I0122 14:03:49.907677 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c13edc29-cd06-4113-8366-75a41988c89f-config-data\") pod \"c13edc29-cd06-4113-8366-75a41988c89f\" (UID: \"c13edc29-cd06-4113-8366-75a41988c89f\") " Jan 22 14:03:49 crc kubenswrapper[4743]: I0122 14:03:49.907742 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6nd9\" (UniqueName: \"kubernetes.io/projected/c13edc29-cd06-4113-8366-75a41988c89f-kube-api-access-d6nd9\") pod \"c13edc29-cd06-4113-8366-75a41988c89f\" (UID: \"c13edc29-cd06-4113-8366-75a41988c89f\") " Jan 22 14:03:49 crc kubenswrapper[4743]: I0122 14:03:49.907847 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c13edc29-cd06-4113-8366-75a41988c89f-combined-ca-bundle\") pod \"c13edc29-cd06-4113-8366-75a41988c89f\" (UID: \"c13edc29-cd06-4113-8366-75a41988c89f\") " Jan 22 14:03:49 crc kubenswrapper[4743]: I0122 14:03:49.924031 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c13edc29-cd06-4113-8366-75a41988c89f-kube-api-access-d6nd9" (OuterVolumeSpecName: "kube-api-access-d6nd9") pod "c13edc29-cd06-4113-8366-75a41988c89f" (UID: "c13edc29-cd06-4113-8366-75a41988c89f"). InnerVolumeSpecName "kube-api-access-d6nd9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:03:49 crc kubenswrapper[4743]: I0122 14:03:49.931919 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c13edc29-cd06-4113-8366-75a41988c89f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c13edc29-cd06-4113-8366-75a41988c89f" (UID: "c13edc29-cd06-4113-8366-75a41988c89f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:03:49 crc kubenswrapper[4743]: I0122 14:03:49.961112 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c13edc29-cd06-4113-8366-75a41988c89f-config-data" (OuterVolumeSpecName: "config-data") pod "c13edc29-cd06-4113-8366-75a41988c89f" (UID: "c13edc29-cd06-4113-8366-75a41988c89f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:03:50 crc kubenswrapper[4743]: I0122 14:03:50.009641 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c13edc29-cd06-4113-8366-75a41988c89f-config-data\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:50 crc kubenswrapper[4743]: I0122 14:03:50.009684 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6nd9\" (UniqueName: \"kubernetes.io/projected/c13edc29-cd06-4113-8366-75a41988c89f-kube-api-access-d6nd9\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:50 crc kubenswrapper[4743]: I0122 14:03:50.009699 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c13edc29-cd06-4113-8366-75a41988c89f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:50 crc kubenswrapper[4743]: I0122 14:03:50.547987 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"338e196f-7c64-4cbd-b058-768ccb4c5df9","Type":"ContainerStarted","Data":"020f1e189d0f51c2921e2318990f95751b02c740913f00899da7fe8327b23d59"} Jan 22 14:03:50 crc kubenswrapper[4743]: I0122 14:03:50.551131 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-hvtzl" event={"ID":"c13edc29-cd06-4113-8366-75a41988c89f","Type":"ContainerDied","Data":"0dc8614117f45e1c332eba11a05b8ecc9733c878eb7e1673c518cd9ec551c169"} Jan 22 14:03:50 crc kubenswrapper[4743]: I0122 14:03:50.551178 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-hvtzl" Jan 22 14:03:50 crc kubenswrapper[4743]: I0122 14:03:50.551180 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0dc8614117f45e1c332eba11a05b8ecc9733c878eb7e1673c518cd9ec551c169" Jan 22 14:03:50 crc kubenswrapper[4743]: I0122 14:03:50.841771 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-gv54v"] Jan 22 14:03:50 crc kubenswrapper[4743]: E0122 14:03:50.842199 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c13edc29-cd06-4113-8366-75a41988c89f" containerName="keystone-db-sync" Jan 22 14:03:50 crc kubenswrapper[4743]: I0122 14:03:50.842214 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="c13edc29-cd06-4113-8366-75a41988c89f" containerName="keystone-db-sync" Jan 22 14:03:50 crc kubenswrapper[4743]: E0122 14:03:50.842235 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="337aee61-0e0b-4934-ba37-1d48a94633da" containerName="ovn-config" Jan 22 14:03:50 crc kubenswrapper[4743]: I0122 14:03:50.842242 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="337aee61-0e0b-4934-ba37-1d48a94633da" containerName="ovn-config" Jan 22 14:03:50 crc kubenswrapper[4743]: I0122 14:03:50.842398 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="c13edc29-cd06-4113-8366-75a41988c89f" containerName="keystone-db-sync" Jan 22 14:03:50 crc kubenswrapper[4743]: I0122 14:03:50.842415 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="337aee61-0e0b-4934-ba37-1d48a94633da" containerName="ovn-config" Jan 22 14:03:50 crc kubenswrapper[4743]: I0122 14:03:50.842958 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-gv54v" Jan 22 14:03:50 crc kubenswrapper[4743]: I0122 14:03:50.847201 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 22 14:03:50 crc kubenswrapper[4743]: I0122 14:03:50.847436 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-48lf7" Jan 22 14:03:50 crc kubenswrapper[4743]: I0122 14:03:50.847568 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 22 14:03:50 crc kubenswrapper[4743]: I0122 14:03:50.847821 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 22 14:03:50 crc kubenswrapper[4743]: I0122 14:03:50.848022 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Jan 22 14:03:50 crc kubenswrapper[4743]: I0122 14:03:50.858193 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-gv54v"] Jan 22 14:03:50 crc kubenswrapper[4743]: I0122 14:03:50.874845 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f877ddd87-kd44g"] Jan 22 14:03:50 crc kubenswrapper[4743]: I0122 14:03:50.876144 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f877ddd87-kd44g" Jan 22 14:03:50 crc kubenswrapper[4743]: I0122 14:03:50.928681 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f877ddd87-kd44g"] Jan 22 14:03:50 crc kubenswrapper[4743]: I0122 14:03:50.936066 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b18fb1ec-4b3a-4369-824c-56d5512b2cb4-config-data\") pod \"keystone-bootstrap-gv54v\" (UID: \"b18fb1ec-4b3a-4369-824c-56d5512b2cb4\") " pod="openstack/keystone-bootstrap-gv54v" Jan 22 14:03:50 crc kubenswrapper[4743]: I0122 14:03:50.936182 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/962e062e-6f8a-4dcc-95b5-bebc83f2fdc5-ovsdbserver-nb\") pod \"dnsmasq-dns-f877ddd87-kd44g\" (UID: \"962e062e-6f8a-4dcc-95b5-bebc83f2fdc5\") " pod="openstack/dnsmasq-dns-f877ddd87-kd44g" Jan 22 14:03:50 crc kubenswrapper[4743]: I0122 14:03:50.936218 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jrqr\" (UniqueName: \"kubernetes.io/projected/962e062e-6f8a-4dcc-95b5-bebc83f2fdc5-kube-api-access-8jrqr\") pod \"dnsmasq-dns-f877ddd87-kd44g\" (UID: \"962e062e-6f8a-4dcc-95b5-bebc83f2fdc5\") " pod="openstack/dnsmasq-dns-f877ddd87-kd44g" Jan 22 14:03:50 crc kubenswrapper[4743]: I0122 14:03:50.936254 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b18fb1ec-4b3a-4369-824c-56d5512b2cb4-combined-ca-bundle\") pod \"keystone-bootstrap-gv54v\" (UID: \"b18fb1ec-4b3a-4369-824c-56d5512b2cb4\") " pod="openstack/keystone-bootstrap-gv54v" Jan 22 14:03:50 crc kubenswrapper[4743]: I0122 14:03:50.936287 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b18fb1ec-4b3a-4369-824c-56d5512b2cb4-fernet-keys\") pod \"keystone-bootstrap-gv54v\" (UID: \"b18fb1ec-4b3a-4369-824c-56d5512b2cb4\") " 
pod="openstack/keystone-bootstrap-gv54v" Jan 22 14:03:50 crc kubenswrapper[4743]: I0122 14:03:50.936321 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/962e062e-6f8a-4dcc-95b5-bebc83f2fdc5-config\") pod \"dnsmasq-dns-f877ddd87-kd44g\" (UID: \"962e062e-6f8a-4dcc-95b5-bebc83f2fdc5\") " pod="openstack/dnsmasq-dns-f877ddd87-kd44g" Jan 22 14:03:50 crc kubenswrapper[4743]: I0122 14:03:50.936346 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b18fb1ec-4b3a-4369-824c-56d5512b2cb4-credential-keys\") pod \"keystone-bootstrap-gv54v\" (UID: \"b18fb1ec-4b3a-4369-824c-56d5512b2cb4\") " pod="openstack/keystone-bootstrap-gv54v" Jan 22 14:03:50 crc kubenswrapper[4743]: I0122 14:03:50.936383 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/962e062e-6f8a-4dcc-95b5-bebc83f2fdc5-ovsdbserver-sb\") pod \"dnsmasq-dns-f877ddd87-kd44g\" (UID: \"962e062e-6f8a-4dcc-95b5-bebc83f2fdc5\") " pod="openstack/dnsmasq-dns-f877ddd87-kd44g" Jan 22 14:03:50 crc kubenswrapper[4743]: I0122 14:03:50.936414 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b18fb1ec-4b3a-4369-824c-56d5512b2cb4-scripts\") pod \"keystone-bootstrap-gv54v\" (UID: \"b18fb1ec-4b3a-4369-824c-56d5512b2cb4\") " pod="openstack/keystone-bootstrap-gv54v" Jan 22 14:03:50 crc kubenswrapper[4743]: I0122 14:03:50.936441 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxns4\" (UniqueName: \"kubernetes.io/projected/b18fb1ec-4b3a-4369-824c-56d5512b2cb4-kube-api-access-kxns4\") pod \"keystone-bootstrap-gv54v\" (UID: \"b18fb1ec-4b3a-4369-824c-56d5512b2cb4\") " pod="openstack/keystone-bootstrap-gv54v" Jan 22 14:03:50 crc kubenswrapper[4743]: I0122 14:03:50.936486 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/962e062e-6f8a-4dcc-95b5-bebc83f2fdc5-dns-svc\") pod \"dnsmasq-dns-f877ddd87-kd44g\" (UID: \"962e062e-6f8a-4dcc-95b5-bebc83f2fdc5\") " pod="openstack/dnsmasq-dns-f877ddd87-kd44g" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.031641 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-8d75d5cbf-4kt4n"] Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.033536 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-8d75d5cbf-4kt4n" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.037059 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-l92hz" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.037436 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.037701 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.038114 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.041900 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/962e062e-6f8a-4dcc-95b5-bebc83f2fdc5-ovsdbserver-nb\") pod \"dnsmasq-dns-f877ddd87-kd44g\" (UID: \"962e062e-6f8a-4dcc-95b5-bebc83f2fdc5\") " pod="openstack/dnsmasq-dns-f877ddd87-kd44g" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.041941 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jrqr\" (UniqueName: \"kubernetes.io/projected/962e062e-6f8a-4dcc-95b5-bebc83f2fdc5-kube-api-access-8jrqr\") pod \"dnsmasq-dns-f877ddd87-kd44g\" (UID: \"962e062e-6f8a-4dcc-95b5-bebc83f2fdc5\") " pod="openstack/dnsmasq-dns-f877ddd87-kd44g" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.041982 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b18fb1ec-4b3a-4369-824c-56d5512b2cb4-combined-ca-bundle\") pod \"keystone-bootstrap-gv54v\" (UID: \"b18fb1ec-4b3a-4369-824c-56d5512b2cb4\") " pod="openstack/keystone-bootstrap-gv54v" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.042008 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b18fb1ec-4b3a-4369-824c-56d5512b2cb4-fernet-keys\") pod \"keystone-bootstrap-gv54v\" (UID: \"b18fb1ec-4b3a-4369-824c-56d5512b2cb4\") " pod="openstack/keystone-bootstrap-gv54v" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.042030 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/962e062e-6f8a-4dcc-95b5-bebc83f2fdc5-config\") pod \"dnsmasq-dns-f877ddd87-kd44g\" (UID: \"962e062e-6f8a-4dcc-95b5-bebc83f2fdc5\") " pod="openstack/dnsmasq-dns-f877ddd87-kd44g" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.042054 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b18fb1ec-4b3a-4369-824c-56d5512b2cb4-credential-keys\") pod \"keystone-bootstrap-gv54v\" (UID: \"b18fb1ec-4b3a-4369-824c-56d5512b2cb4\") " pod="openstack/keystone-bootstrap-gv54v" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.042084 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/962e062e-6f8a-4dcc-95b5-bebc83f2fdc5-ovsdbserver-sb\") pod \"dnsmasq-dns-f877ddd87-kd44g\" (UID: \"962e062e-6f8a-4dcc-95b5-bebc83f2fdc5\") " pod="openstack/dnsmasq-dns-f877ddd87-kd44g" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.042101 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/b18fb1ec-4b3a-4369-824c-56d5512b2cb4-scripts\") pod \"keystone-bootstrap-gv54v\" (UID: \"b18fb1ec-4b3a-4369-824c-56d5512b2cb4\") " pod="openstack/keystone-bootstrap-gv54v" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.042118 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxns4\" (UniqueName: \"kubernetes.io/projected/b18fb1ec-4b3a-4369-824c-56d5512b2cb4-kube-api-access-kxns4\") pod \"keystone-bootstrap-gv54v\" (UID: \"b18fb1ec-4b3a-4369-824c-56d5512b2cb4\") " pod="openstack/keystone-bootstrap-gv54v" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.042147 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/962e062e-6f8a-4dcc-95b5-bebc83f2fdc5-dns-svc\") pod \"dnsmasq-dns-f877ddd87-kd44g\" (UID: \"962e062e-6f8a-4dcc-95b5-bebc83f2fdc5\") " pod="openstack/dnsmasq-dns-f877ddd87-kd44g" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.042185 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b18fb1ec-4b3a-4369-824c-56d5512b2cb4-config-data\") pod \"keystone-bootstrap-gv54v\" (UID: \"b18fb1ec-4b3a-4369-824c-56d5512b2cb4\") " pod="openstack/keystone-bootstrap-gv54v" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.043529 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/962e062e-6f8a-4dcc-95b5-bebc83f2fdc5-ovsdbserver-sb\") pod \"dnsmasq-dns-f877ddd87-kd44g\" (UID: \"962e062e-6f8a-4dcc-95b5-bebc83f2fdc5\") " pod="openstack/dnsmasq-dns-f877ddd87-kd44g" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.043555 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/962e062e-6f8a-4dcc-95b5-bebc83f2fdc5-config\") pod \"dnsmasq-dns-f877ddd87-kd44g\" (UID: \"962e062e-6f8a-4dcc-95b5-bebc83f2fdc5\") " pod="openstack/dnsmasq-dns-f877ddd87-kd44g" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.044006 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/962e062e-6f8a-4dcc-95b5-bebc83f2fdc5-dns-svc\") pod \"dnsmasq-dns-f877ddd87-kd44g\" (UID: \"962e062e-6f8a-4dcc-95b5-bebc83f2fdc5\") " pod="openstack/dnsmasq-dns-f877ddd87-kd44g" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.044324 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/962e062e-6f8a-4dcc-95b5-bebc83f2fdc5-ovsdbserver-nb\") pod \"dnsmasq-dns-f877ddd87-kd44g\" (UID: \"962e062e-6f8a-4dcc-95b5-bebc83f2fdc5\") " pod="openstack/dnsmasq-dns-f877ddd87-kd44g" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.059280 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b18fb1ec-4b3a-4369-824c-56d5512b2cb4-combined-ca-bundle\") pod \"keystone-bootstrap-gv54v\" (UID: \"b18fb1ec-4b3a-4369-824c-56d5512b2cb4\") " pod="openstack/keystone-bootstrap-gv54v" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.061978 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b18fb1ec-4b3a-4369-824c-56d5512b2cb4-scripts\") pod \"keystone-bootstrap-gv54v\" (UID: \"b18fb1ec-4b3a-4369-824c-56d5512b2cb4\") " 
pod="openstack/keystone-bootstrap-gv54v" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.066710 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b18fb1ec-4b3a-4369-824c-56d5512b2cb4-credential-keys\") pod \"keystone-bootstrap-gv54v\" (UID: \"b18fb1ec-4b3a-4369-824c-56d5512b2cb4\") " pod="openstack/keystone-bootstrap-gv54v" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.072112 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-8d75d5cbf-4kt4n"] Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.073368 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b18fb1ec-4b3a-4369-824c-56d5512b2cb4-fernet-keys\") pod \"keystone-bootstrap-gv54v\" (UID: \"b18fb1ec-4b3a-4369-824c-56d5512b2cb4\") " pod="openstack/keystone-bootstrap-gv54v" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.075337 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxns4\" (UniqueName: \"kubernetes.io/projected/b18fb1ec-4b3a-4369-824c-56d5512b2cb4-kube-api-access-kxns4\") pod \"keystone-bootstrap-gv54v\" (UID: \"b18fb1ec-4b3a-4369-824c-56d5512b2cb4\") " pod="openstack/keystone-bootstrap-gv54v" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.082089 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.085507 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.094455 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.095490 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b18fb1ec-4b3a-4369-824c-56d5512b2cb4-config-data\") pod \"keystone-bootstrap-gv54v\" (UID: \"b18fb1ec-4b3a-4369-824c-56d5512b2cb4\") " pod="openstack/keystone-bootstrap-gv54v" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.095820 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.104806 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jrqr\" (UniqueName: \"kubernetes.io/projected/962e062e-6f8a-4dcc-95b5-bebc83f2fdc5-kube-api-access-8jrqr\") pod \"dnsmasq-dns-f877ddd87-kd44g\" (UID: \"962e062e-6f8a-4dcc-95b5-bebc83f2fdc5\") " pod="openstack/dnsmasq-dns-f877ddd87-kd44g" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.145249 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.146142 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/316dc631-a7ed-49db-9dad-305d246bf91a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"316dc631-a7ed-49db-9dad-305d246bf91a\") " pod="openstack/ceilometer-0" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.146184 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c713d1a6-b56c-4179-8052-619946111c93-scripts\") pod \"horizon-8d75d5cbf-4kt4n\" (UID: 
\"c713d1a6-b56c-4179-8052-619946111c93\") " pod="openstack/horizon-8d75d5cbf-4kt4n" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.146200 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/316dc631-a7ed-49db-9dad-305d246bf91a-run-httpd\") pod \"ceilometer-0\" (UID: \"316dc631-a7ed-49db-9dad-305d246bf91a\") " pod="openstack/ceilometer-0" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.146231 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/316dc631-a7ed-49db-9dad-305d246bf91a-config-data\") pod \"ceilometer-0\" (UID: \"316dc631-a7ed-49db-9dad-305d246bf91a\") " pod="openstack/ceilometer-0" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.146276 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c713d1a6-b56c-4179-8052-619946111c93-horizon-secret-key\") pod \"horizon-8d75d5cbf-4kt4n\" (UID: \"c713d1a6-b56c-4179-8052-619946111c93\") " pod="openstack/horizon-8d75d5cbf-4kt4n" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.146297 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vm42k\" (UniqueName: \"kubernetes.io/projected/316dc631-a7ed-49db-9dad-305d246bf91a-kube-api-access-vm42k\") pod \"ceilometer-0\" (UID: \"316dc631-a7ed-49db-9dad-305d246bf91a\") " pod="openstack/ceilometer-0" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.146312 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lh27f\" (UniqueName: \"kubernetes.io/projected/c713d1a6-b56c-4179-8052-619946111c93-kube-api-access-lh27f\") pod \"horizon-8d75d5cbf-4kt4n\" (UID: \"c713d1a6-b56c-4179-8052-619946111c93\") " pod="openstack/horizon-8d75d5cbf-4kt4n" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.146347 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c713d1a6-b56c-4179-8052-619946111c93-logs\") pod \"horizon-8d75d5cbf-4kt4n\" (UID: \"c713d1a6-b56c-4179-8052-619946111c93\") " pod="openstack/horizon-8d75d5cbf-4kt4n" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.146367 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c713d1a6-b56c-4179-8052-619946111c93-config-data\") pod \"horizon-8d75d5cbf-4kt4n\" (UID: \"c713d1a6-b56c-4179-8052-619946111c93\") " pod="openstack/horizon-8d75d5cbf-4kt4n" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.146385 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/316dc631-a7ed-49db-9dad-305d246bf91a-log-httpd\") pod \"ceilometer-0\" (UID: \"316dc631-a7ed-49db-9dad-305d246bf91a\") " pod="openstack/ceilometer-0" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.146406 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/316dc631-a7ed-49db-9dad-305d246bf91a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"316dc631-a7ed-49db-9dad-305d246bf91a\") " pod="openstack/ceilometer-0" Jan 22 14:03:51 
crc kubenswrapper[4743]: I0122 14:03:51.146438 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/316dc631-a7ed-49db-9dad-305d246bf91a-scripts\") pod \"ceilometer-0\" (UID: \"316dc631-a7ed-49db-9dad-305d246bf91a\") " pod="openstack/ceilometer-0" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.161347 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-fqwwj"] Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.171444 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-fqwwj" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.171688 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-gv54v" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.177858 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-k87rb" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.177966 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.178068 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.181932 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-srjxw"] Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.183050 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-srjxw" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.186544 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.186938 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-wzr2p" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.187041 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.211618 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-fqwwj"] Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.211974 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f877ddd87-kd44g" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.249863 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/316dc631-a7ed-49db-9dad-305d246bf91a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"316dc631-a7ed-49db-9dad-305d246bf91a\") " pod="openstack/ceilometer-0" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.249930 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b8bd2850-37f2-40c9-aeb5-365158ca9716-config\") pod \"neutron-db-sync-fqwwj\" (UID: \"b8bd2850-37f2-40c9-aeb5-365158ca9716\") " pod="openstack/neutron-db-sync-fqwwj" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.249970 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c713d1a6-b56c-4179-8052-619946111c93-scripts\") pod \"horizon-8d75d5cbf-4kt4n\" (UID: \"c713d1a6-b56c-4179-8052-619946111c93\") " pod="openstack/horizon-8d75d5cbf-4kt4n" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.249989 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/316dc631-a7ed-49db-9dad-305d246bf91a-run-httpd\") pod \"ceilometer-0\" (UID: \"316dc631-a7ed-49db-9dad-305d246bf91a\") " pod="openstack/ceilometer-0" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.250011 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5l2p\" (UniqueName: \"kubernetes.io/projected/b8bd2850-37f2-40c9-aeb5-365158ca9716-kube-api-access-s5l2p\") pod \"neutron-db-sync-fqwwj\" (UID: \"b8bd2850-37f2-40c9-aeb5-365158ca9716\") " pod="openstack/neutron-db-sync-fqwwj" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.250054 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/316dc631-a7ed-49db-9dad-305d246bf91a-config-data\") pod \"ceilometer-0\" (UID: \"316dc631-a7ed-49db-9dad-305d246bf91a\") " pod="openstack/ceilometer-0" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.250074 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb22345c-594c-46a3-b362-e34baa8f271c-config-data\") pod \"cinder-db-sync-srjxw\" (UID: \"eb22345c-594c-46a3-b362-e34baa8f271c\") " pod="openstack/cinder-db-sync-srjxw" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.250091 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgm5q\" (UniqueName: \"kubernetes.io/projected/eb22345c-594c-46a3-b362-e34baa8f271c-kube-api-access-kgm5q\") pod \"cinder-db-sync-srjxw\" (UID: \"eb22345c-594c-46a3-b362-e34baa8f271c\") " pod="openstack/cinder-db-sync-srjxw" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.250141 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb22345c-594c-46a3-b362-e34baa8f271c-scripts\") pod \"cinder-db-sync-srjxw\" (UID: \"eb22345c-594c-46a3-b362-e34baa8f271c\") " pod="openstack/cinder-db-sync-srjxw" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.250158 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb22345c-594c-46a3-b362-e34baa8f271c-combined-ca-bundle\") pod \"cinder-db-sync-srjxw\" (UID: \"eb22345c-594c-46a3-b362-e34baa8f271c\") " pod="openstack/cinder-db-sync-srjxw" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.250200 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c713d1a6-b56c-4179-8052-619946111c93-horizon-secret-key\") pod \"horizon-8d75d5cbf-4kt4n\" (UID: \"c713d1a6-b56c-4179-8052-619946111c93\") " pod="openstack/horizon-8d75d5cbf-4kt4n" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.250222 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vm42k\" (UniqueName: \"kubernetes.io/projected/316dc631-a7ed-49db-9dad-305d246bf91a-kube-api-access-vm42k\") pod \"ceilometer-0\" (UID: \"316dc631-a7ed-49db-9dad-305d246bf91a\") " pod="openstack/ceilometer-0" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.250238 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/eb22345c-594c-46a3-b362-e34baa8f271c-etc-machine-id\") pod \"cinder-db-sync-srjxw\" (UID: \"eb22345c-594c-46a3-b362-e34baa8f271c\") " pod="openstack/cinder-db-sync-srjxw" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.250272 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lh27f\" (UniqueName: \"kubernetes.io/projected/c713d1a6-b56c-4179-8052-619946111c93-kube-api-access-lh27f\") pod \"horizon-8d75d5cbf-4kt4n\" (UID: \"c713d1a6-b56c-4179-8052-619946111c93\") " pod="openstack/horizon-8d75d5cbf-4kt4n" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.250291 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8bd2850-37f2-40c9-aeb5-365158ca9716-combined-ca-bundle\") pod \"neutron-db-sync-fqwwj\" (UID: \"b8bd2850-37f2-40c9-aeb5-365158ca9716\") " pod="openstack/neutron-db-sync-fqwwj" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.250314 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/eb22345c-594c-46a3-b362-e34baa8f271c-db-sync-config-data\") pod \"cinder-db-sync-srjxw\" (UID: \"eb22345c-594c-46a3-b362-e34baa8f271c\") " pod="openstack/cinder-db-sync-srjxw" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.250333 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c713d1a6-b56c-4179-8052-619946111c93-logs\") pod \"horizon-8d75d5cbf-4kt4n\" (UID: \"c713d1a6-b56c-4179-8052-619946111c93\") " pod="openstack/horizon-8d75d5cbf-4kt4n" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.250380 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c713d1a6-b56c-4179-8052-619946111c93-config-data\") pod \"horizon-8d75d5cbf-4kt4n\" (UID: \"c713d1a6-b56c-4179-8052-619946111c93\") " pod="openstack/horizon-8d75d5cbf-4kt4n" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.250398 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/316dc631-a7ed-49db-9dad-305d246bf91a-log-httpd\") pod \"ceilometer-0\" (UID: \"316dc631-a7ed-49db-9dad-305d246bf91a\") " pod="openstack/ceilometer-0" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.250438 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/316dc631-a7ed-49db-9dad-305d246bf91a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"316dc631-a7ed-49db-9dad-305d246bf91a\") " pod="openstack/ceilometer-0" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.250476 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/316dc631-a7ed-49db-9dad-305d246bf91a-scripts\") pod \"ceilometer-0\" (UID: \"316dc631-a7ed-49db-9dad-305d246bf91a\") " pod="openstack/ceilometer-0" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.250753 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/316dc631-a7ed-49db-9dad-305d246bf91a-run-httpd\") pod \"ceilometer-0\" (UID: \"316dc631-a7ed-49db-9dad-305d246bf91a\") " pod="openstack/ceilometer-0" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.251377 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c713d1a6-b56c-4179-8052-619946111c93-logs\") pod \"horizon-8d75d5cbf-4kt4n\" (UID: \"c713d1a6-b56c-4179-8052-619946111c93\") " pod="openstack/horizon-8d75d5cbf-4kt4n" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.251423 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/316dc631-a7ed-49db-9dad-305d246bf91a-log-httpd\") pod \"ceilometer-0\" (UID: \"316dc631-a7ed-49db-9dad-305d246bf91a\") " pod="openstack/ceilometer-0" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.252147 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c713d1a6-b56c-4179-8052-619946111c93-scripts\") pod \"horizon-8d75d5cbf-4kt4n\" (UID: \"c713d1a6-b56c-4179-8052-619946111c93\") " pod="openstack/horizon-8d75d5cbf-4kt4n" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.258280 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c713d1a6-b56c-4179-8052-619946111c93-horizon-secret-key\") pod \"horizon-8d75d5cbf-4kt4n\" (UID: \"c713d1a6-b56c-4179-8052-619946111c93\") " pod="openstack/horizon-8d75d5cbf-4kt4n" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.259433 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c713d1a6-b56c-4179-8052-619946111c93-config-data\") pod \"horizon-8d75d5cbf-4kt4n\" (UID: \"c713d1a6-b56c-4179-8052-619946111c93\") " pod="openstack/horizon-8d75d5cbf-4kt4n" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.266688 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/316dc631-a7ed-49db-9dad-305d246bf91a-scripts\") pod \"ceilometer-0\" (UID: \"316dc631-a7ed-49db-9dad-305d246bf91a\") " pod="openstack/ceilometer-0" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.266863 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/316dc631-a7ed-49db-9dad-305d246bf91a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"316dc631-a7ed-49db-9dad-305d246bf91a\") " pod="openstack/ceilometer-0" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.280295 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/316dc631-a7ed-49db-9dad-305d246bf91a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"316dc631-a7ed-49db-9dad-305d246bf91a\") " pod="openstack/ceilometer-0" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.284902 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/316dc631-a7ed-49db-9dad-305d246bf91a-config-data\") pod \"ceilometer-0\" (UID: \"316dc631-a7ed-49db-9dad-305d246bf91a\") " pod="openstack/ceilometer-0" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.285497 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vm42k\" (UniqueName: \"kubernetes.io/projected/316dc631-a7ed-49db-9dad-305d246bf91a-kube-api-access-vm42k\") pod \"ceilometer-0\" (UID: \"316dc631-a7ed-49db-9dad-305d246bf91a\") " pod="openstack/ceilometer-0" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.300570 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lh27f\" (UniqueName: \"kubernetes.io/projected/c713d1a6-b56c-4179-8052-619946111c93-kube-api-access-lh27f\") pod \"horizon-8d75d5cbf-4kt4n\" (UID: \"c713d1a6-b56c-4179-8052-619946111c93\") " pod="openstack/horizon-8d75d5cbf-4kt4n" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.304312 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-srjxw"] Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.356838 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b8bd2850-37f2-40c9-aeb5-365158ca9716-config\") pod \"neutron-db-sync-fqwwj\" (UID: \"b8bd2850-37f2-40c9-aeb5-365158ca9716\") " pod="openstack/neutron-db-sync-fqwwj" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.357058 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5l2p\" (UniqueName: \"kubernetes.io/projected/b8bd2850-37f2-40c9-aeb5-365158ca9716-kube-api-access-s5l2p\") pod \"neutron-db-sync-fqwwj\" (UID: \"b8bd2850-37f2-40c9-aeb5-365158ca9716\") " pod="openstack/neutron-db-sync-fqwwj" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.357157 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb22345c-594c-46a3-b362-e34baa8f271c-config-data\") pod \"cinder-db-sync-srjxw\" (UID: \"eb22345c-594c-46a3-b362-e34baa8f271c\") " pod="openstack/cinder-db-sync-srjxw" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.357231 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgm5q\" (UniqueName: \"kubernetes.io/projected/eb22345c-594c-46a3-b362-e34baa8f271c-kube-api-access-kgm5q\") pod \"cinder-db-sync-srjxw\" (UID: \"eb22345c-594c-46a3-b362-e34baa8f271c\") " pod="openstack/cinder-db-sync-srjxw" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.357308 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb22345c-594c-46a3-b362-e34baa8f271c-combined-ca-bundle\") pod 
\"cinder-db-sync-srjxw\" (UID: \"eb22345c-594c-46a3-b362-e34baa8f271c\") " pod="openstack/cinder-db-sync-srjxw" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.357402 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb22345c-594c-46a3-b362-e34baa8f271c-scripts\") pod \"cinder-db-sync-srjxw\" (UID: \"eb22345c-594c-46a3-b362-e34baa8f271c\") " pod="openstack/cinder-db-sync-srjxw" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.357496 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/eb22345c-594c-46a3-b362-e34baa8f271c-etc-machine-id\") pod \"cinder-db-sync-srjxw\" (UID: \"eb22345c-594c-46a3-b362-e34baa8f271c\") " pod="openstack/cinder-db-sync-srjxw" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.357566 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8bd2850-37f2-40c9-aeb5-365158ca9716-combined-ca-bundle\") pod \"neutron-db-sync-fqwwj\" (UID: \"b8bd2850-37f2-40c9-aeb5-365158ca9716\") " pod="openstack/neutron-db-sync-fqwwj" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.357635 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/eb22345c-594c-46a3-b362-e34baa8f271c-db-sync-config-data\") pod \"cinder-db-sync-srjxw\" (UID: \"eb22345c-594c-46a3-b362-e34baa8f271c\") " pod="openstack/cinder-db-sync-srjxw" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.366550 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b8bd2850-37f2-40c9-aeb5-365158ca9716-config\") pod \"neutron-db-sync-fqwwj\" (UID: \"b8bd2850-37f2-40c9-aeb5-365158ca9716\") " pod="openstack/neutron-db-sync-fqwwj" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.373180 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/eb22345c-594c-46a3-b362-e34baa8f271c-etc-machine-id\") pod \"cinder-db-sync-srjxw\" (UID: \"eb22345c-594c-46a3-b362-e34baa8f271c\") " pod="openstack/cinder-db-sync-srjxw" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.374006 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-8d75d5cbf-4kt4n" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.376570 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb22345c-594c-46a3-b362-e34baa8f271c-config-data\") pod \"cinder-db-sync-srjxw\" (UID: \"eb22345c-594c-46a3-b362-e34baa8f271c\") " pod="openstack/cinder-db-sync-srjxw" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.393405 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/eb22345c-594c-46a3-b362-e34baa8f271c-db-sync-config-data\") pod \"cinder-db-sync-srjxw\" (UID: \"eb22345c-594c-46a3-b362-e34baa8f271c\") " pod="openstack/cinder-db-sync-srjxw" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.398601 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb22345c-594c-46a3-b362-e34baa8f271c-scripts\") pod \"cinder-db-sync-srjxw\" (UID: \"eb22345c-594c-46a3-b362-e34baa8f271c\") " pod="openstack/cinder-db-sync-srjxw" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.401480 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8bd2850-37f2-40c9-aeb5-365158ca9716-combined-ca-bundle\") pod \"neutron-db-sync-fqwwj\" (UID: \"b8bd2850-37f2-40c9-aeb5-365158ca9716\") " pod="openstack/neutron-db-sync-fqwwj" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.402126 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgm5q\" (UniqueName: \"kubernetes.io/projected/eb22345c-594c-46a3-b362-e34baa8f271c-kube-api-access-kgm5q\") pod \"cinder-db-sync-srjxw\" (UID: \"eb22345c-594c-46a3-b362-e34baa8f271c\") " pod="openstack/cinder-db-sync-srjxw" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.402610 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5l2p\" (UniqueName: \"kubernetes.io/projected/b8bd2850-37f2-40c9-aeb5-365158ca9716-kube-api-access-s5l2p\") pod \"neutron-db-sync-fqwwj\" (UID: \"b8bd2850-37f2-40c9-aeb5-365158ca9716\") " pod="openstack/neutron-db-sync-fqwwj" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.412934 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f877ddd87-kd44g"] Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.414680 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb22345c-594c-46a3-b362-e34baa8f271c-combined-ca-bundle\") pod \"cinder-db-sync-srjxw\" (UID: \"eb22345c-594c-46a3-b362-e34baa8f271c\") " pod="openstack/cinder-db-sync-srjxw" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.427553 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-tcdjz"] Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.428891 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-tcdjz" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.431500 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.431700 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-tcdjz"] Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.431914 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-vkgkb" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.440443 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-fqwwj" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.440996 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-9t996"] Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.442236 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-9t996" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.443707 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.444119 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.444225 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-pm9nn" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.460199 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-68dcc9cf6f-qhp4l"] Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.462946 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68dcc9cf6f-qhp4l" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.494771 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-9t996"] Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.507944 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-srjxw" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.516225 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68dcc9cf6f-qhp4l"] Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.518887 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.570183 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7557f5c46c-q4pfj"] Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.573445 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ebac6d9-df0f-41fe-bc73-8236847ff237-scripts\") pod \"placement-db-sync-9t996\" (UID: \"4ebac6d9-df0f-41fe-bc73-8236847ff237\") " pod="openstack/placement-db-sync-9t996" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.573499 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssfqb\" (UniqueName: \"kubernetes.io/projected/846c118f-23c1-402f-8747-633485e743c9-kube-api-access-ssfqb\") pod \"barbican-db-sync-tcdjz\" (UID: \"846c118f-23c1-402f-8747-633485e743c9\") " pod="openstack/barbican-db-sync-tcdjz" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.573541 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ebac6d9-df0f-41fe-bc73-8236847ff237-logs\") pod \"placement-db-sync-9t996\" (UID: \"4ebac6d9-df0f-41fe-bc73-8236847ff237\") " pod="openstack/placement-db-sync-9t996" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.573568 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ebac6d9-df0f-41fe-bc73-8236847ff237-config-data\") pod \"placement-db-sync-9t996\" (UID: \"4ebac6d9-df0f-41fe-bc73-8236847ff237\") " pod="openstack/placement-db-sync-9t996" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.573638 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f7ee9164-0b2f-41d4-81f4-117acae13511-ovsdbserver-sb\") pod \"dnsmasq-dns-68dcc9cf6f-qhp4l\" (UID: \"f7ee9164-0b2f-41d4-81f4-117acae13511\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-qhp4l" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.573696 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f7ee9164-0b2f-41d4-81f4-117acae13511-ovsdbserver-nb\") pod \"dnsmasq-dns-68dcc9cf6f-qhp4l\" (UID: \"f7ee9164-0b2f-41d4-81f4-117acae13511\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-qhp4l" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.573720 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n98km\" (UniqueName: \"kubernetes.io/projected/f7ee9164-0b2f-41d4-81f4-117acae13511-kube-api-access-n98km\") pod \"dnsmasq-dns-68dcc9cf6f-qhp4l\" (UID: \"f7ee9164-0b2f-41d4-81f4-117acae13511\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-qhp4l" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.573746 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ebac6d9-df0f-41fe-bc73-8236847ff237-combined-ca-bundle\") pod \"placement-db-sync-9t996\" (UID: \"4ebac6d9-df0f-41fe-bc73-8236847ff237\") " pod="openstack/placement-db-sync-9t996" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.573775 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/846c118f-23c1-402f-8747-633485e743c9-combined-ca-bundle\") pod \"barbican-db-sync-tcdjz\" (UID: \"846c118f-23c1-402f-8747-633485e743c9\") " pod="openstack/barbican-db-sync-tcdjz" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.573823 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7ee9164-0b2f-41d4-81f4-117acae13511-config\") pod \"dnsmasq-dns-68dcc9cf6f-qhp4l\" (UID: \"f7ee9164-0b2f-41d4-81f4-117acae13511\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-qhp4l" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.573856 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/846c118f-23c1-402f-8747-633485e743c9-db-sync-config-data\") pod \"barbican-db-sync-tcdjz\" (UID: \"846c118f-23c1-402f-8747-633485e743c9\") " pod="openstack/barbican-db-sync-tcdjz" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.573917 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7ee9164-0b2f-41d4-81f4-117acae13511-dns-svc\") pod \"dnsmasq-dns-68dcc9cf6f-qhp4l\" (UID: \"f7ee9164-0b2f-41d4-81f4-117acae13511\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-qhp4l" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.573955 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fw4t2\" (UniqueName: \"kubernetes.io/projected/4ebac6d9-df0f-41fe-bc73-8236847ff237-kube-api-access-fw4t2\") pod \"placement-db-sync-9t996\" (UID: \"4ebac6d9-df0f-41fe-bc73-8236847ff237\") " pod="openstack/placement-db-sync-9t996" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.576216 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7557f5c46c-q4pfj" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.609224 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7557f5c46c-q4pfj"] Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.676009 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1a50ac3e-2e28-4edf-85a0-0a18c00d6dd5-horizon-secret-key\") pod \"horizon-7557f5c46c-q4pfj\" (UID: \"1a50ac3e-2e28-4edf-85a0-0a18c00d6dd5\") " pod="openstack/horizon-7557f5c46c-q4pfj" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.676060 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1a50ac3e-2e28-4edf-85a0-0a18c00d6dd5-config-data\") pod \"horizon-7557f5c46c-q4pfj\" (UID: \"1a50ac3e-2e28-4edf-85a0-0a18c00d6dd5\") " pod="openstack/horizon-7557f5c46c-q4pfj" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.676093 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7ee9164-0b2f-41d4-81f4-117acae13511-dns-svc\") pod \"dnsmasq-dns-68dcc9cf6f-qhp4l\" (UID: \"f7ee9164-0b2f-41d4-81f4-117acae13511\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-qhp4l" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.676126 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fw4t2\" (UniqueName: \"kubernetes.io/projected/4ebac6d9-df0f-41fe-bc73-8236847ff237-kube-api-access-fw4t2\") pod \"placement-db-sync-9t996\" (UID: \"4ebac6d9-df0f-41fe-bc73-8236847ff237\") " pod="openstack/placement-db-sync-9t996" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.676177 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssfqb\" (UniqueName: \"kubernetes.io/projected/846c118f-23c1-402f-8747-633485e743c9-kube-api-access-ssfqb\") pod \"barbican-db-sync-tcdjz\" (UID: \"846c118f-23c1-402f-8747-633485e743c9\") " pod="openstack/barbican-db-sync-tcdjz" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.676193 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ebac6d9-df0f-41fe-bc73-8236847ff237-scripts\") pod \"placement-db-sync-9t996\" (UID: \"4ebac6d9-df0f-41fe-bc73-8236847ff237\") " pod="openstack/placement-db-sync-9t996" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.676213 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ebac6d9-df0f-41fe-bc73-8236847ff237-logs\") pod \"placement-db-sync-9t996\" (UID: \"4ebac6d9-df0f-41fe-bc73-8236847ff237\") " pod="openstack/placement-db-sync-9t996" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.676242 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ebac6d9-df0f-41fe-bc73-8236847ff237-config-data\") pod \"placement-db-sync-9t996\" (UID: \"4ebac6d9-df0f-41fe-bc73-8236847ff237\") " pod="openstack/placement-db-sync-9t996" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.676261 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1a50ac3e-2e28-4edf-85a0-0a18c00d6dd5-scripts\") pod 
\"horizon-7557f5c46c-q4pfj\" (UID: \"1a50ac3e-2e28-4edf-85a0-0a18c00d6dd5\") " pod="openstack/horizon-7557f5c46c-q4pfj" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.676286 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnjlb\" (UniqueName: \"kubernetes.io/projected/1a50ac3e-2e28-4edf-85a0-0a18c00d6dd5-kube-api-access-jnjlb\") pod \"horizon-7557f5c46c-q4pfj\" (UID: \"1a50ac3e-2e28-4edf-85a0-0a18c00d6dd5\") " pod="openstack/horizon-7557f5c46c-q4pfj" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.676304 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f7ee9164-0b2f-41d4-81f4-117acae13511-ovsdbserver-sb\") pod \"dnsmasq-dns-68dcc9cf6f-qhp4l\" (UID: \"f7ee9164-0b2f-41d4-81f4-117acae13511\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-qhp4l" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.676333 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f7ee9164-0b2f-41d4-81f4-117acae13511-ovsdbserver-nb\") pod \"dnsmasq-dns-68dcc9cf6f-qhp4l\" (UID: \"f7ee9164-0b2f-41d4-81f4-117acae13511\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-qhp4l" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.677069 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a50ac3e-2e28-4edf-85a0-0a18c00d6dd5-logs\") pod \"horizon-7557f5c46c-q4pfj\" (UID: \"1a50ac3e-2e28-4edf-85a0-0a18c00d6dd5\") " pod="openstack/horizon-7557f5c46c-q4pfj" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.677102 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n98km\" (UniqueName: \"kubernetes.io/projected/f7ee9164-0b2f-41d4-81f4-117acae13511-kube-api-access-n98km\") pod \"dnsmasq-dns-68dcc9cf6f-qhp4l\" (UID: \"f7ee9164-0b2f-41d4-81f4-117acae13511\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-qhp4l" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.677117 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ebac6d9-df0f-41fe-bc73-8236847ff237-combined-ca-bundle\") pod \"placement-db-sync-9t996\" (UID: \"4ebac6d9-df0f-41fe-bc73-8236847ff237\") " pod="openstack/placement-db-sync-9t996" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.677145 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/846c118f-23c1-402f-8747-633485e743c9-combined-ca-bundle\") pod \"barbican-db-sync-tcdjz\" (UID: \"846c118f-23c1-402f-8747-633485e743c9\") " pod="openstack/barbican-db-sync-tcdjz" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.677169 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7ee9164-0b2f-41d4-81f4-117acae13511-config\") pod \"dnsmasq-dns-68dcc9cf6f-qhp4l\" (UID: \"f7ee9164-0b2f-41d4-81f4-117acae13511\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-qhp4l" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.677185 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/846c118f-23c1-402f-8747-633485e743c9-db-sync-config-data\") pod \"barbican-db-sync-tcdjz\" (UID: 
\"846c118f-23c1-402f-8747-633485e743c9\") " pod="openstack/barbican-db-sync-tcdjz" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.677654 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7ee9164-0b2f-41d4-81f4-117acae13511-dns-svc\") pod \"dnsmasq-dns-68dcc9cf6f-qhp4l\" (UID: \"f7ee9164-0b2f-41d4-81f4-117acae13511\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-qhp4l" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.677937 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ebac6d9-df0f-41fe-bc73-8236847ff237-logs\") pod \"placement-db-sync-9t996\" (UID: \"4ebac6d9-df0f-41fe-bc73-8236847ff237\") " pod="openstack/placement-db-sync-9t996" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.678417 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7ee9164-0b2f-41d4-81f4-117acae13511-config\") pod \"dnsmasq-dns-68dcc9cf6f-qhp4l\" (UID: \"f7ee9164-0b2f-41d4-81f4-117acae13511\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-qhp4l" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.678894 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f7ee9164-0b2f-41d4-81f4-117acae13511-ovsdbserver-sb\") pod \"dnsmasq-dns-68dcc9cf6f-qhp4l\" (UID: \"f7ee9164-0b2f-41d4-81f4-117acae13511\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-qhp4l" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.679430 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f7ee9164-0b2f-41d4-81f4-117acae13511-ovsdbserver-nb\") pod \"dnsmasq-dns-68dcc9cf6f-qhp4l\" (UID: \"f7ee9164-0b2f-41d4-81f4-117acae13511\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-qhp4l" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.681864 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ebac6d9-df0f-41fe-bc73-8236847ff237-config-data\") pod \"placement-db-sync-9t996\" (UID: \"4ebac6d9-df0f-41fe-bc73-8236847ff237\") " pod="openstack/placement-db-sync-9t996" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.691517 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/846c118f-23c1-402f-8747-633485e743c9-combined-ca-bundle\") pod \"barbican-db-sync-tcdjz\" (UID: \"846c118f-23c1-402f-8747-633485e743c9\") " pod="openstack/barbican-db-sync-tcdjz" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.698675 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fw4t2\" (UniqueName: \"kubernetes.io/projected/4ebac6d9-df0f-41fe-bc73-8236847ff237-kube-api-access-fw4t2\") pod \"placement-db-sync-9t996\" (UID: \"4ebac6d9-df0f-41fe-bc73-8236847ff237\") " pod="openstack/placement-db-sync-9t996" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.702471 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/846c118f-23c1-402f-8747-633485e743c9-db-sync-config-data\") pod \"barbican-db-sync-tcdjz\" (UID: \"846c118f-23c1-402f-8747-633485e743c9\") " pod="openstack/barbican-db-sync-tcdjz" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.714345 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-ssfqb\" (UniqueName: \"kubernetes.io/projected/846c118f-23c1-402f-8747-633485e743c9-kube-api-access-ssfqb\") pod \"barbican-db-sync-tcdjz\" (UID: \"846c118f-23c1-402f-8747-633485e743c9\") " pod="openstack/barbican-db-sync-tcdjz" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.714588 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ebac6d9-df0f-41fe-bc73-8236847ff237-scripts\") pod \"placement-db-sync-9t996\" (UID: \"4ebac6d9-df0f-41fe-bc73-8236847ff237\") " pod="openstack/placement-db-sync-9t996" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.715381 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n98km\" (UniqueName: \"kubernetes.io/projected/f7ee9164-0b2f-41d4-81f4-117acae13511-kube-api-access-n98km\") pod \"dnsmasq-dns-68dcc9cf6f-qhp4l\" (UID: \"f7ee9164-0b2f-41d4-81f4-117acae13511\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-qhp4l" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.715902 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ebac6d9-df0f-41fe-bc73-8236847ff237-combined-ca-bundle\") pod \"placement-db-sync-9t996\" (UID: \"4ebac6d9-df0f-41fe-bc73-8236847ff237\") " pod="openstack/placement-db-sync-9t996" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.779272 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnjlb\" (UniqueName: \"kubernetes.io/projected/1a50ac3e-2e28-4edf-85a0-0a18c00d6dd5-kube-api-access-jnjlb\") pod \"horizon-7557f5c46c-q4pfj\" (UID: \"1a50ac3e-2e28-4edf-85a0-0a18c00d6dd5\") " pod="openstack/horizon-7557f5c46c-q4pfj" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.779600 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a50ac3e-2e28-4edf-85a0-0a18c00d6dd5-logs\") pod \"horizon-7557f5c46c-q4pfj\" (UID: \"1a50ac3e-2e28-4edf-85a0-0a18c00d6dd5\") " pod="openstack/horizon-7557f5c46c-q4pfj" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.779665 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1a50ac3e-2e28-4edf-85a0-0a18c00d6dd5-horizon-secret-key\") pod \"horizon-7557f5c46c-q4pfj\" (UID: \"1a50ac3e-2e28-4edf-85a0-0a18c00d6dd5\") " pod="openstack/horizon-7557f5c46c-q4pfj" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.779684 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1a50ac3e-2e28-4edf-85a0-0a18c00d6dd5-config-data\") pod \"horizon-7557f5c46c-q4pfj\" (UID: \"1a50ac3e-2e28-4edf-85a0-0a18c00d6dd5\") " pod="openstack/horizon-7557f5c46c-q4pfj" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.779771 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1a50ac3e-2e28-4edf-85a0-0a18c00d6dd5-scripts\") pod \"horizon-7557f5c46c-q4pfj\" (UID: \"1a50ac3e-2e28-4edf-85a0-0a18c00d6dd5\") " pod="openstack/horizon-7557f5c46c-q4pfj" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.780464 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1a50ac3e-2e28-4edf-85a0-0a18c00d6dd5-scripts\") pod \"horizon-7557f5c46c-q4pfj\" (UID: 
\"1a50ac3e-2e28-4edf-85a0-0a18c00d6dd5\") " pod="openstack/horizon-7557f5c46c-q4pfj" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.781192 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a50ac3e-2e28-4edf-85a0-0a18c00d6dd5-logs\") pod \"horizon-7557f5c46c-q4pfj\" (UID: \"1a50ac3e-2e28-4edf-85a0-0a18c00d6dd5\") " pod="openstack/horizon-7557f5c46c-q4pfj" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.782049 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1a50ac3e-2e28-4edf-85a0-0a18c00d6dd5-config-data\") pod \"horizon-7557f5c46c-q4pfj\" (UID: \"1a50ac3e-2e28-4edf-85a0-0a18c00d6dd5\") " pod="openstack/horizon-7557f5c46c-q4pfj" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.790600 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1a50ac3e-2e28-4edf-85a0-0a18c00d6dd5-horizon-secret-key\") pod \"horizon-7557f5c46c-q4pfj\" (UID: \"1a50ac3e-2e28-4edf-85a0-0a18c00d6dd5\") " pod="openstack/horizon-7557f5c46c-q4pfj" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.807133 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnjlb\" (UniqueName: \"kubernetes.io/projected/1a50ac3e-2e28-4edf-85a0-0a18c00d6dd5-kube-api-access-jnjlb\") pod \"horizon-7557f5c46c-q4pfj\" (UID: \"1a50ac3e-2e28-4edf-85a0-0a18c00d6dd5\") " pod="openstack/horizon-7557f5c46c-q4pfj" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.836339 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-tcdjz" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.897362 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-9t996" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.943178 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f877ddd87-kd44g"] Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.943548 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68dcc9cf6f-qhp4l" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.951007 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7557f5c46c-q4pfj" Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.962983 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-gv54v"] Jan 22 14:03:51 crc kubenswrapper[4743]: W0122 14:03:51.964329 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb18fb1ec_4b3a_4369_824c_56d5512b2cb4.slice/crio-08ee05a4d189744907fd813d6de43e0ff9189cc7828ffb8125112d40260e805c WatchSource:0}: Error finding container 08ee05a4d189744907fd813d6de43e0ff9189cc7828ffb8125112d40260e805c: Status 404 returned error can't find the container with id 08ee05a4d189744907fd813d6de43e0ff9189cc7828ffb8125112d40260e805c Jan 22 14:03:51 crc kubenswrapper[4743]: I0122 14:03:51.970275 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-8d75d5cbf-4kt4n"] Jan 22 14:03:52 crc kubenswrapper[4743]: I0122 14:03:52.242543 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 22 14:03:52 crc kubenswrapper[4743]: I0122 14:03:52.251219 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-srjxw"] Jan 22 14:03:52 crc kubenswrapper[4743]: I0122 14:03:52.355684 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-fqwwj"] Jan 22 14:03:52 crc kubenswrapper[4743]: I0122 14:03:52.537837 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-9t996"] Jan 22 14:03:52 crc kubenswrapper[4743]: I0122 14:03:52.569177 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-tcdjz"] Jan 22 14:03:52 crc kubenswrapper[4743]: I0122 14:03:52.601194 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8d75d5cbf-4kt4n" event={"ID":"c713d1a6-b56c-4179-8052-619946111c93","Type":"ContainerStarted","Data":"4f291bcf7d8465275cfcc25b42255fcf7c6422fe80b93eda1a016592df3cebfe"} Jan 22 14:03:52 crc kubenswrapper[4743]: I0122 14:03:52.613721 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-gv54v" event={"ID":"b18fb1ec-4b3a-4369-824c-56d5512b2cb4","Type":"ContainerStarted","Data":"08ee05a4d189744907fd813d6de43e0ff9189cc7828ffb8125112d40260e805c"} Jan 22 14:03:52 crc kubenswrapper[4743]: I0122 14:03:52.614858 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f877ddd87-kd44g" event={"ID":"962e062e-6f8a-4dcc-95b5-bebc83f2fdc5","Type":"ContainerStarted","Data":"2b745f07cfdd7a0d06d03f8501ab49d0efe48ae888c5e03b58fc1d55701b9f34"} Jan 22 14:03:52 crc kubenswrapper[4743]: I0122 14:03:52.713820 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68dcc9cf6f-qhp4l"] Jan 22 14:03:52 crc kubenswrapper[4743]: I0122 14:03:52.734594 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7557f5c46c-q4pfj"] Jan 22 14:03:52 crc kubenswrapper[4743]: I0122 14:03:52.874317 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-8d75d5cbf-4kt4n"] Jan 22 14:03:52 crc kubenswrapper[4743]: I0122 14:03:52.923865 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-fc55bf6d5-bhvcx"] Jan 22 14:03:52 crc kubenswrapper[4743]: I0122 14:03:52.925264 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-fc55bf6d5-bhvcx" Jan 22 14:03:52 crc kubenswrapper[4743]: I0122 14:03:52.942863 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 22 14:03:52 crc kubenswrapper[4743]: I0122 14:03:52.955844 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-fc55bf6d5-bhvcx"] Jan 22 14:03:53 crc kubenswrapper[4743]: I0122 14:03:53.015147 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9bf9dca7-5e43-4ae7-abf8-36b0392b0700-scripts\") pod \"horizon-fc55bf6d5-bhvcx\" (UID: \"9bf9dca7-5e43-4ae7-abf8-36b0392b0700\") " pod="openstack/horizon-fc55bf6d5-bhvcx" Jan 22 14:03:53 crc kubenswrapper[4743]: I0122 14:03:53.015203 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9bf9dca7-5e43-4ae7-abf8-36b0392b0700-logs\") pod \"horizon-fc55bf6d5-bhvcx\" (UID: \"9bf9dca7-5e43-4ae7-abf8-36b0392b0700\") " pod="openstack/horizon-fc55bf6d5-bhvcx" Jan 22 14:03:53 crc kubenswrapper[4743]: I0122 14:03:53.015251 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bln4\" (UniqueName: \"kubernetes.io/projected/9bf9dca7-5e43-4ae7-abf8-36b0392b0700-kube-api-access-5bln4\") pod \"horizon-fc55bf6d5-bhvcx\" (UID: \"9bf9dca7-5e43-4ae7-abf8-36b0392b0700\") " pod="openstack/horizon-fc55bf6d5-bhvcx" Jan 22 14:03:53 crc kubenswrapper[4743]: I0122 14:03:53.015269 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9bf9dca7-5e43-4ae7-abf8-36b0392b0700-config-data\") pod \"horizon-fc55bf6d5-bhvcx\" (UID: \"9bf9dca7-5e43-4ae7-abf8-36b0392b0700\") " pod="openstack/horizon-fc55bf6d5-bhvcx" Jan 22 14:03:53 crc kubenswrapper[4743]: I0122 14:03:53.015324 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9bf9dca7-5e43-4ae7-abf8-36b0392b0700-horizon-secret-key\") pod \"horizon-fc55bf6d5-bhvcx\" (UID: \"9bf9dca7-5e43-4ae7-abf8-36b0392b0700\") " pod="openstack/horizon-fc55bf6d5-bhvcx" Jan 22 14:03:53 crc kubenswrapper[4743]: I0122 14:03:53.116589 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9bf9dca7-5e43-4ae7-abf8-36b0392b0700-horizon-secret-key\") pod \"horizon-fc55bf6d5-bhvcx\" (UID: \"9bf9dca7-5e43-4ae7-abf8-36b0392b0700\") " pod="openstack/horizon-fc55bf6d5-bhvcx" Jan 22 14:03:53 crc kubenswrapper[4743]: I0122 14:03:53.116712 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9bf9dca7-5e43-4ae7-abf8-36b0392b0700-scripts\") pod \"horizon-fc55bf6d5-bhvcx\" (UID: \"9bf9dca7-5e43-4ae7-abf8-36b0392b0700\") " pod="openstack/horizon-fc55bf6d5-bhvcx" Jan 22 14:03:53 crc kubenswrapper[4743]: I0122 14:03:53.116770 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9bf9dca7-5e43-4ae7-abf8-36b0392b0700-logs\") pod \"horizon-fc55bf6d5-bhvcx\" (UID: \"9bf9dca7-5e43-4ae7-abf8-36b0392b0700\") " pod="openstack/horizon-fc55bf6d5-bhvcx" Jan 22 14:03:53 crc kubenswrapper[4743]: I0122 14:03:53.116932 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-5bln4\" (UniqueName: \"kubernetes.io/projected/9bf9dca7-5e43-4ae7-abf8-36b0392b0700-kube-api-access-5bln4\") pod \"horizon-fc55bf6d5-bhvcx\" (UID: \"9bf9dca7-5e43-4ae7-abf8-36b0392b0700\") " pod="openstack/horizon-fc55bf6d5-bhvcx" Jan 22 14:03:53 crc kubenswrapper[4743]: I0122 14:03:53.116964 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9bf9dca7-5e43-4ae7-abf8-36b0392b0700-config-data\") pod \"horizon-fc55bf6d5-bhvcx\" (UID: \"9bf9dca7-5e43-4ae7-abf8-36b0392b0700\") " pod="openstack/horizon-fc55bf6d5-bhvcx" Jan 22 14:03:53 crc kubenswrapper[4743]: I0122 14:03:53.117937 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9bf9dca7-5e43-4ae7-abf8-36b0392b0700-scripts\") pod \"horizon-fc55bf6d5-bhvcx\" (UID: \"9bf9dca7-5e43-4ae7-abf8-36b0392b0700\") " pod="openstack/horizon-fc55bf6d5-bhvcx" Jan 22 14:03:53 crc kubenswrapper[4743]: I0122 14:03:53.118480 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9bf9dca7-5e43-4ae7-abf8-36b0392b0700-config-data\") pod \"horizon-fc55bf6d5-bhvcx\" (UID: \"9bf9dca7-5e43-4ae7-abf8-36b0392b0700\") " pod="openstack/horizon-fc55bf6d5-bhvcx" Jan 22 14:03:53 crc kubenswrapper[4743]: I0122 14:03:53.120026 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9bf9dca7-5e43-4ae7-abf8-36b0392b0700-logs\") pod \"horizon-fc55bf6d5-bhvcx\" (UID: \"9bf9dca7-5e43-4ae7-abf8-36b0392b0700\") " pod="openstack/horizon-fc55bf6d5-bhvcx" Jan 22 14:03:53 crc kubenswrapper[4743]: I0122 14:03:53.125281 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9bf9dca7-5e43-4ae7-abf8-36b0392b0700-horizon-secret-key\") pod \"horizon-fc55bf6d5-bhvcx\" (UID: \"9bf9dca7-5e43-4ae7-abf8-36b0392b0700\") " pod="openstack/horizon-fc55bf6d5-bhvcx" Jan 22 14:03:53 crc kubenswrapper[4743]: I0122 14:03:53.139605 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bln4\" (UniqueName: \"kubernetes.io/projected/9bf9dca7-5e43-4ae7-abf8-36b0392b0700-kube-api-access-5bln4\") pod \"horizon-fc55bf6d5-bhvcx\" (UID: \"9bf9dca7-5e43-4ae7-abf8-36b0392b0700\") " pod="openstack/horizon-fc55bf6d5-bhvcx" Jan 22 14:03:53 crc kubenswrapper[4743]: I0122 14:03:53.244507 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-fc55bf6d5-bhvcx" Jan 22 14:03:54 crc kubenswrapper[4743]: W0122 14:03:54.904967 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb22345c_594c_46a3_b362_e34baa8f271c.slice/crio-9b88b9a261257482cf463a09386888007c96580a91b2e76a7288c0afe171ff28 WatchSource:0}: Error finding container 9b88b9a261257482cf463a09386888007c96580a91b2e76a7288c0afe171ff28: Status 404 returned error can't find the container with id 9b88b9a261257482cf463a09386888007c96580a91b2e76a7288c0afe171ff28 Jan 22 14:03:54 crc kubenswrapper[4743]: W0122 14:03:54.911549 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8bd2850_37f2_40c9_aeb5_365158ca9716.slice/crio-7339994d44d5afbc0dc266eeba57adf993f0d231c1885d7855c0fc1bcc19ca37 WatchSource:0}: Error finding container 7339994d44d5afbc0dc266eeba57adf993f0d231c1885d7855c0fc1bcc19ca37: Status 404 returned error can't find the container with id 7339994d44d5afbc0dc266eeba57adf993f0d231c1885d7855c0fc1bcc19ca37 Jan 22 14:03:55 crc kubenswrapper[4743]: I0122 14:03:55.402693 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-fc55bf6d5-bhvcx"] Jan 22 14:03:55 crc kubenswrapper[4743]: W0122 14:03:55.410242 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9bf9dca7_5e43_4ae7_abf8_36b0392b0700.slice/crio-57161db4b37969fe16692534f505a364e4ed74e3fa9e40d4f3ca5cc5a3083579 WatchSource:0}: Error finding container 57161db4b37969fe16692534f505a364e4ed74e3fa9e40d4f3ca5cc5a3083579: Status 404 returned error can't find the container with id 57161db4b37969fe16692534f505a364e4ed74e3fa9e40d4f3ca5cc5a3083579 Jan 22 14:03:55 crc kubenswrapper[4743]: I0122 14:03:55.640454 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68dcc9cf6f-qhp4l" event={"ID":"f7ee9164-0b2f-41d4-81f4-117acae13511","Type":"ContainerStarted","Data":"83d23f0111a1345c2c4f9a113d7d1dc637451daa8ceabdf901e9977c6bfe87b6"} Jan 22 14:03:55 crc kubenswrapper[4743]: I0122 14:03:55.642424 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-fc55bf6d5-bhvcx" event={"ID":"9bf9dca7-5e43-4ae7-abf8-36b0392b0700","Type":"ContainerStarted","Data":"57161db4b37969fe16692534f505a364e4ed74e3fa9e40d4f3ca5cc5a3083579"} Jan 22 14:03:55 crc kubenswrapper[4743]: I0122 14:03:55.643710 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-tcdjz" event={"ID":"846c118f-23c1-402f-8747-633485e743c9","Type":"ContainerStarted","Data":"3013ff8fd45403382013effbbc5cbdacc91176ab9a4560fce316634a57528b1f"} Jan 22 14:03:55 crc kubenswrapper[4743]: I0122 14:03:55.644748 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7557f5c46c-q4pfj" event={"ID":"1a50ac3e-2e28-4edf-85a0-0a18c00d6dd5","Type":"ContainerStarted","Data":"4147acfcb775588f97c2afe81156e407bd40f4646bbc28df4301e009b0be2b5d"} Jan 22 14:03:55 crc kubenswrapper[4743]: I0122 14:03:55.646026 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"316dc631-a7ed-49db-9dad-305d246bf91a","Type":"ContainerStarted","Data":"f1a83ffb81d64069d982879b339bbda55df7ee602f62806aba4fd32f65c6f9a3"} Jan 22 14:03:55 crc kubenswrapper[4743]: I0122 14:03:55.647087 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-srjxw" 
event={"ID":"eb22345c-594c-46a3-b362-e34baa8f271c","Type":"ContainerStarted","Data":"9b88b9a261257482cf463a09386888007c96580a91b2e76a7288c0afe171ff28"} Jan 22 14:03:55 crc kubenswrapper[4743]: I0122 14:03:55.648177 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-9t996" event={"ID":"4ebac6d9-df0f-41fe-bc73-8236847ff237","Type":"ContainerStarted","Data":"6f328699c738936f75afcf4eba4fe74fde91cefdb86f53da239444c49f8893a5"} Jan 22 14:03:55 crc kubenswrapper[4743]: I0122 14:03:55.649310 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-fqwwj" event={"ID":"b8bd2850-37f2-40c9-aeb5-365158ca9716","Type":"ContainerStarted","Data":"7339994d44d5afbc0dc266eeba57adf993f0d231c1885d7855c0fc1bcc19ca37"} Jan 22 14:03:56 crc kubenswrapper[4743]: I0122 14:03:56.667127 4743 generic.go:334] "Generic (PLEG): container finished" podID="f7ee9164-0b2f-41d4-81f4-117acae13511" containerID="06a9e9f267a1d3d4292137aced31d575f5bef9a8ea3f5f50e9dab98fc51db541" exitCode=0 Jan 22 14:03:56 crc kubenswrapper[4743]: I0122 14:03:56.667174 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68dcc9cf6f-qhp4l" event={"ID":"f7ee9164-0b2f-41d4-81f4-117acae13511","Type":"ContainerDied","Data":"06a9e9f267a1d3d4292137aced31d575f5bef9a8ea3f5f50e9dab98fc51db541"} Jan 22 14:03:56 crc kubenswrapper[4743]: I0122 14:03:56.682880 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"338e196f-7c64-4cbd-b058-768ccb4c5df9","Type":"ContainerStarted","Data":"d3cc5c2fd60f82ee7ca0b16468d167300c207f12b52ffa16e3457531c035396c"} Jan 22 14:03:56 crc kubenswrapper[4743]: I0122 14:03:56.682962 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"338e196f-7c64-4cbd-b058-768ccb4c5df9","Type":"ContainerStarted","Data":"658fe10baedcafc1c1361da7b6df11ae76c81c2d836a9732d99223e9ad68578a"} Jan 22 14:03:56 crc kubenswrapper[4743]: I0122 14:03:56.696300 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-gv54v" event={"ID":"b18fb1ec-4b3a-4369-824c-56d5512b2cb4","Type":"ContainerStarted","Data":"46eea21368dd54b81c103d7cd5b77b39db2dd3d45d5f81b1cb894a0b3d6ab5ba"} Jan 22 14:03:56 crc kubenswrapper[4743]: I0122 14:03:56.700890 4743 generic.go:334] "Generic (PLEG): container finished" podID="962e062e-6f8a-4dcc-95b5-bebc83f2fdc5" containerID="56cb8b2309a96d4d645a73069305b6df9fddafc8ea016a9387b29ad49449e099" exitCode=0 Jan 22 14:03:56 crc kubenswrapper[4743]: I0122 14:03:56.701171 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f877ddd87-kd44g" event={"ID":"962e062e-6f8a-4dcc-95b5-bebc83f2fdc5","Type":"ContainerDied","Data":"56cb8b2309a96d4d645a73069305b6df9fddafc8ea016a9387b29ad49449e099"} Jan 22 14:03:56 crc kubenswrapper[4743]: I0122 14:03:56.730283 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-fqwwj" event={"ID":"b8bd2850-37f2-40c9-aeb5-365158ca9716","Type":"ContainerStarted","Data":"37b8e4844c7a5a524a68d32dc8c01f0e84365b3babaf0e9d46244f75aa6b4152"} Jan 22 14:03:56 crc kubenswrapper[4743]: I0122 14:03:56.734539 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=42.809440054 podStartE2EDuration="52.734512554s" podCreationTimestamp="2026-01-22 14:03:04 +0000 UTC" firstStartedPulling="2026-01-22 14:03:38.284378084 +0000 UTC m=+1054.839421237" lastFinishedPulling="2026-01-22 14:03:48.209450574 +0000 
UTC m=+1064.764493737" observedRunningTime="2026-01-22 14:03:56.729973101 +0000 UTC m=+1073.285016274" watchObservedRunningTime="2026-01-22 14:03:56.734512554 +0000 UTC m=+1073.289555717" Jan 22 14:03:56 crc kubenswrapper[4743]: I0122 14:03:56.761333 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-gv54v" podStartSLOduration=6.761312388 podStartE2EDuration="6.761312388s" podCreationTimestamp="2026-01-22 14:03:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:03:56.75697307 +0000 UTC m=+1073.312016233" watchObservedRunningTime="2026-01-22 14:03:56.761312388 +0000 UTC m=+1073.316355551" Jan 22 14:03:56 crc kubenswrapper[4743]: I0122 14:03:56.812022 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-fqwwj" podStartSLOduration=5.812002597 podStartE2EDuration="5.812002597s" podCreationTimestamp="2026-01-22 14:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:03:56.804465944 +0000 UTC m=+1073.359509107" watchObservedRunningTime="2026-01-22 14:03:56.812002597 +0000 UTC m=+1073.367045760" Jan 22 14:03:57 crc kubenswrapper[4743]: I0122 14:03:57.010940 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68dcc9cf6f-qhp4l"] Jan 22 14:03:57 crc kubenswrapper[4743]: I0122 14:03:57.051894 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-cc882"] Jan 22 14:03:57 crc kubenswrapper[4743]: I0122 14:03:57.053563 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-cc882" Jan 22 14:03:57 crc kubenswrapper[4743]: I0122 14:03:57.058523 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Jan 22 14:03:57 crc kubenswrapper[4743]: I0122 14:03:57.061568 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-cc882"] Jan 22 14:03:57 crc kubenswrapper[4743]: I0122 14:03:57.087483 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c814013-0bfc-4734-8b4c-bfb1a5b4f54d-config\") pod \"dnsmasq-dns-58dd9ff6bc-cc882\" (UID: \"6c814013-0bfc-4734-8b4c-bfb1a5b4f54d\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-cc882" Jan 22 14:03:57 crc kubenswrapper[4743]: I0122 14:03:57.087538 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6c814013-0bfc-4734-8b4c-bfb1a5b4f54d-dns-swift-storage-0\") pod \"dnsmasq-dns-58dd9ff6bc-cc882\" (UID: \"6c814013-0bfc-4734-8b4c-bfb1a5b4f54d\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-cc882" Jan 22 14:03:57 crc kubenswrapper[4743]: I0122 14:03:57.087590 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c814013-0bfc-4734-8b4c-bfb1a5b4f54d-dns-svc\") pod \"dnsmasq-dns-58dd9ff6bc-cc882\" (UID: \"6c814013-0bfc-4734-8b4c-bfb1a5b4f54d\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-cc882" Jan 22 14:03:57 crc kubenswrapper[4743]: I0122 14:03:57.087650 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsvtr\" 
(UniqueName: \"kubernetes.io/projected/6c814013-0bfc-4734-8b4c-bfb1a5b4f54d-kube-api-access-zsvtr\") pod \"dnsmasq-dns-58dd9ff6bc-cc882\" (UID: \"6c814013-0bfc-4734-8b4c-bfb1a5b4f54d\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-cc882" Jan 22 14:03:57 crc kubenswrapper[4743]: I0122 14:03:57.087694 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6c814013-0bfc-4734-8b4c-bfb1a5b4f54d-ovsdbserver-sb\") pod \"dnsmasq-dns-58dd9ff6bc-cc882\" (UID: \"6c814013-0bfc-4734-8b4c-bfb1a5b4f54d\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-cc882" Jan 22 14:03:57 crc kubenswrapper[4743]: I0122 14:03:57.087757 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c814013-0bfc-4734-8b4c-bfb1a5b4f54d-ovsdbserver-nb\") pod \"dnsmasq-dns-58dd9ff6bc-cc882\" (UID: \"6c814013-0bfc-4734-8b4c-bfb1a5b4f54d\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-cc882" Jan 22 14:03:57 crc kubenswrapper[4743]: I0122 14:03:57.192725 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c814013-0bfc-4734-8b4c-bfb1a5b4f54d-dns-svc\") pod \"dnsmasq-dns-58dd9ff6bc-cc882\" (UID: \"6c814013-0bfc-4734-8b4c-bfb1a5b4f54d\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-cc882" Jan 22 14:03:57 crc kubenswrapper[4743]: I0122 14:03:57.192817 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsvtr\" (UniqueName: \"kubernetes.io/projected/6c814013-0bfc-4734-8b4c-bfb1a5b4f54d-kube-api-access-zsvtr\") pod \"dnsmasq-dns-58dd9ff6bc-cc882\" (UID: \"6c814013-0bfc-4734-8b4c-bfb1a5b4f54d\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-cc882" Jan 22 14:03:57 crc kubenswrapper[4743]: I0122 14:03:57.192860 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6c814013-0bfc-4734-8b4c-bfb1a5b4f54d-ovsdbserver-sb\") pod \"dnsmasq-dns-58dd9ff6bc-cc882\" (UID: \"6c814013-0bfc-4734-8b4c-bfb1a5b4f54d\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-cc882" Jan 22 14:03:57 crc kubenswrapper[4743]: I0122 14:03:57.192910 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c814013-0bfc-4734-8b4c-bfb1a5b4f54d-ovsdbserver-nb\") pod \"dnsmasq-dns-58dd9ff6bc-cc882\" (UID: \"6c814013-0bfc-4734-8b4c-bfb1a5b4f54d\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-cc882" Jan 22 14:03:57 crc kubenswrapper[4743]: I0122 14:03:57.192949 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c814013-0bfc-4734-8b4c-bfb1a5b4f54d-config\") pod \"dnsmasq-dns-58dd9ff6bc-cc882\" (UID: \"6c814013-0bfc-4734-8b4c-bfb1a5b4f54d\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-cc882" Jan 22 14:03:57 crc kubenswrapper[4743]: I0122 14:03:57.192973 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6c814013-0bfc-4734-8b4c-bfb1a5b4f54d-dns-swift-storage-0\") pod \"dnsmasq-dns-58dd9ff6bc-cc882\" (UID: \"6c814013-0bfc-4734-8b4c-bfb1a5b4f54d\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-cc882" Jan 22 14:03:57 crc kubenswrapper[4743]: I0122 14:03:57.193955 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/6c814013-0bfc-4734-8b4c-bfb1a5b4f54d-dns-swift-storage-0\") pod \"dnsmasq-dns-58dd9ff6bc-cc882\" (UID: \"6c814013-0bfc-4734-8b4c-bfb1a5b4f54d\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-cc882" Jan 22 14:03:57 crc kubenswrapper[4743]: I0122 14:03:57.194006 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c814013-0bfc-4734-8b4c-bfb1a5b4f54d-dns-svc\") pod \"dnsmasq-dns-58dd9ff6bc-cc882\" (UID: \"6c814013-0bfc-4734-8b4c-bfb1a5b4f54d\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-cc882" Jan 22 14:03:57 crc kubenswrapper[4743]: I0122 14:03:57.194710 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6c814013-0bfc-4734-8b4c-bfb1a5b4f54d-ovsdbserver-sb\") pod \"dnsmasq-dns-58dd9ff6bc-cc882\" (UID: \"6c814013-0bfc-4734-8b4c-bfb1a5b4f54d\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-cc882" Jan 22 14:03:57 crc kubenswrapper[4743]: I0122 14:03:57.195287 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c814013-0bfc-4734-8b4c-bfb1a5b4f54d-ovsdbserver-nb\") pod \"dnsmasq-dns-58dd9ff6bc-cc882\" (UID: \"6c814013-0bfc-4734-8b4c-bfb1a5b4f54d\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-cc882" Jan 22 14:03:57 crc kubenswrapper[4743]: I0122 14:03:57.196111 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c814013-0bfc-4734-8b4c-bfb1a5b4f54d-config\") pod \"dnsmasq-dns-58dd9ff6bc-cc882\" (UID: \"6c814013-0bfc-4734-8b4c-bfb1a5b4f54d\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-cc882" Jan 22 14:03:57 crc kubenswrapper[4743]: I0122 14:03:57.213699 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsvtr\" (UniqueName: \"kubernetes.io/projected/6c814013-0bfc-4734-8b4c-bfb1a5b4f54d-kube-api-access-zsvtr\") pod \"dnsmasq-dns-58dd9ff6bc-cc882\" (UID: \"6c814013-0bfc-4734-8b4c-bfb1a5b4f54d\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-cc882" Jan 22 14:03:57 crc kubenswrapper[4743]: I0122 14:03:57.296996 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-cc882" Jan 22 14:03:57 crc kubenswrapper[4743]: I0122 14:03:57.304056 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f877ddd87-kd44g" Jan 22 14:03:57 crc kubenswrapper[4743]: I0122 14:03:57.395408 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/962e062e-6f8a-4dcc-95b5-bebc83f2fdc5-config\") pod \"962e062e-6f8a-4dcc-95b5-bebc83f2fdc5\" (UID: \"962e062e-6f8a-4dcc-95b5-bebc83f2fdc5\") " Jan 22 14:03:57 crc kubenswrapper[4743]: I0122 14:03:57.395450 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/962e062e-6f8a-4dcc-95b5-bebc83f2fdc5-dns-svc\") pod \"962e062e-6f8a-4dcc-95b5-bebc83f2fdc5\" (UID: \"962e062e-6f8a-4dcc-95b5-bebc83f2fdc5\") " Jan 22 14:03:57 crc kubenswrapper[4743]: I0122 14:03:57.395473 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/962e062e-6f8a-4dcc-95b5-bebc83f2fdc5-ovsdbserver-nb\") pod \"962e062e-6f8a-4dcc-95b5-bebc83f2fdc5\" (UID: \"962e062e-6f8a-4dcc-95b5-bebc83f2fdc5\") " Jan 22 14:03:57 crc kubenswrapper[4743]: I0122 14:03:57.395601 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jrqr\" (UniqueName: \"kubernetes.io/projected/962e062e-6f8a-4dcc-95b5-bebc83f2fdc5-kube-api-access-8jrqr\") pod \"962e062e-6f8a-4dcc-95b5-bebc83f2fdc5\" (UID: \"962e062e-6f8a-4dcc-95b5-bebc83f2fdc5\") " Jan 22 14:03:57 crc kubenswrapper[4743]: I0122 14:03:57.395878 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/962e062e-6f8a-4dcc-95b5-bebc83f2fdc5-ovsdbserver-sb\") pod \"962e062e-6f8a-4dcc-95b5-bebc83f2fdc5\" (UID: \"962e062e-6f8a-4dcc-95b5-bebc83f2fdc5\") " Jan 22 14:03:57 crc kubenswrapper[4743]: I0122 14:03:57.400840 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/962e062e-6f8a-4dcc-95b5-bebc83f2fdc5-kube-api-access-8jrqr" (OuterVolumeSpecName: "kube-api-access-8jrqr") pod "962e062e-6f8a-4dcc-95b5-bebc83f2fdc5" (UID: "962e062e-6f8a-4dcc-95b5-bebc83f2fdc5"). InnerVolumeSpecName "kube-api-access-8jrqr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:03:57 crc kubenswrapper[4743]: I0122 14:03:57.421202 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/962e062e-6f8a-4dcc-95b5-bebc83f2fdc5-config" (OuterVolumeSpecName: "config") pod "962e062e-6f8a-4dcc-95b5-bebc83f2fdc5" (UID: "962e062e-6f8a-4dcc-95b5-bebc83f2fdc5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:03:57 crc kubenswrapper[4743]: I0122 14:03:57.424617 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/962e062e-6f8a-4dcc-95b5-bebc83f2fdc5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "962e062e-6f8a-4dcc-95b5-bebc83f2fdc5" (UID: "962e062e-6f8a-4dcc-95b5-bebc83f2fdc5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:03:57 crc kubenswrapper[4743]: I0122 14:03:57.425495 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/962e062e-6f8a-4dcc-95b5-bebc83f2fdc5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "962e062e-6f8a-4dcc-95b5-bebc83f2fdc5" (UID: "962e062e-6f8a-4dcc-95b5-bebc83f2fdc5"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:03:57 crc kubenswrapper[4743]: I0122 14:03:57.428223 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/962e062e-6f8a-4dcc-95b5-bebc83f2fdc5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "962e062e-6f8a-4dcc-95b5-bebc83f2fdc5" (UID: "962e062e-6f8a-4dcc-95b5-bebc83f2fdc5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:03:57 crc kubenswrapper[4743]: I0122 14:03:57.497533 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/962e062e-6f8a-4dcc-95b5-bebc83f2fdc5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:57 crc kubenswrapper[4743]: I0122 14:03:57.497573 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/962e062e-6f8a-4dcc-95b5-bebc83f2fdc5-config\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:57 crc kubenswrapper[4743]: I0122 14:03:57.497583 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/962e062e-6f8a-4dcc-95b5-bebc83f2fdc5-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:57 crc kubenswrapper[4743]: I0122 14:03:57.497591 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/962e062e-6f8a-4dcc-95b5-bebc83f2fdc5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:57 crc kubenswrapper[4743]: I0122 14:03:57.497600 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jrqr\" (UniqueName: \"kubernetes.io/projected/962e062e-6f8a-4dcc-95b5-bebc83f2fdc5-kube-api-access-8jrqr\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:57 crc kubenswrapper[4743]: I0122 14:03:57.747245 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-68dcc9cf6f-qhp4l" podUID="f7ee9164-0b2f-41d4-81f4-117acae13511" containerName="dnsmasq-dns" containerID="cri-o://ce3a2fa682940db02f2eb2a4e96f910efeef0578f5a86b47d59b56d54a73f377" gracePeriod=10 Jan 22 14:03:57 crc kubenswrapper[4743]: I0122 14:03:57.749805 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f877ddd87-kd44g" Jan 22 14:03:57 crc kubenswrapper[4743]: I0122 14:03:57.787231 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-68dcc9cf6f-qhp4l" podStartSLOduration=6.787214588 podStartE2EDuration="6.787214588s" podCreationTimestamp="2026-01-22 14:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:03:57.762734487 +0000 UTC m=+1074.317777660" watchObservedRunningTime="2026-01-22 14:03:57.787214588 +0000 UTC m=+1074.342257751" Jan 22 14:03:57 crc kubenswrapper[4743]: I0122 14:03:57.794151 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-68dcc9cf6f-qhp4l" Jan 22 14:03:57 crc kubenswrapper[4743]: I0122 14:03:57.794182 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68dcc9cf6f-qhp4l" event={"ID":"f7ee9164-0b2f-41d4-81f4-117acae13511","Type":"ContainerStarted","Data":"ce3a2fa682940db02f2eb2a4e96f910efeef0578f5a86b47d59b56d54a73f377"} Jan 22 14:03:57 crc kubenswrapper[4743]: I0122 14:03:57.794196 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f877ddd87-kd44g" event={"ID":"962e062e-6f8a-4dcc-95b5-bebc83f2fdc5","Type":"ContainerDied","Data":"2b745f07cfdd7a0d06d03f8501ab49d0efe48ae888c5e03b58fc1d55701b9f34"} Jan 22 14:03:57 crc kubenswrapper[4743]: I0122 14:03:57.794216 4743 scope.go:117] "RemoveContainer" containerID="56cb8b2309a96d4d645a73069305b6df9fddafc8ea016a9387b29ad49449e099" Jan 22 14:03:57 crc kubenswrapper[4743]: I0122 14:03:57.812051 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-cc882"] Jan 22 14:03:57 crc kubenswrapper[4743]: I0122 14:03:57.846710 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f877ddd87-kd44g"] Jan 22 14:03:57 crc kubenswrapper[4743]: W0122 14:03:57.849134 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c814013_0bfc_4734_8b4c_bfb1a5b4f54d.slice/crio-0803172f026f76f1ca5198833b00bc170e2f7c760cf14659ce21e2edb438f143 WatchSource:0}: Error finding container 0803172f026f76f1ca5198833b00bc170e2f7c760cf14659ce21e2edb438f143: Status 404 returned error can't find the container with id 0803172f026f76f1ca5198833b00bc170e2f7c760cf14659ce21e2edb438f143 Jan 22 14:03:57 crc kubenswrapper[4743]: I0122 14:03:57.857921 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-f877ddd87-kd44g"] Jan 22 14:03:58 crc kubenswrapper[4743]: I0122 14:03:58.302448 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68dcc9cf6f-qhp4l" Jan 22 14:03:58 crc kubenswrapper[4743]: I0122 14:03:58.311204 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n98km\" (UniqueName: \"kubernetes.io/projected/f7ee9164-0b2f-41d4-81f4-117acae13511-kube-api-access-n98km\") pod \"f7ee9164-0b2f-41d4-81f4-117acae13511\" (UID: \"f7ee9164-0b2f-41d4-81f4-117acae13511\") " Jan 22 14:03:58 crc kubenswrapper[4743]: I0122 14:03:58.311260 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7ee9164-0b2f-41d4-81f4-117acae13511-dns-svc\") pod \"f7ee9164-0b2f-41d4-81f4-117acae13511\" (UID: \"f7ee9164-0b2f-41d4-81f4-117acae13511\") " Jan 22 14:03:58 crc kubenswrapper[4743]: I0122 14:03:58.311296 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f7ee9164-0b2f-41d4-81f4-117acae13511-ovsdbserver-nb\") pod \"f7ee9164-0b2f-41d4-81f4-117acae13511\" (UID: \"f7ee9164-0b2f-41d4-81f4-117acae13511\") " Jan 22 14:03:58 crc kubenswrapper[4743]: I0122 14:03:58.311352 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f7ee9164-0b2f-41d4-81f4-117acae13511-ovsdbserver-sb\") pod \"f7ee9164-0b2f-41d4-81f4-117acae13511\" (UID: \"f7ee9164-0b2f-41d4-81f4-117acae13511\") " Jan 22 14:03:58 crc kubenswrapper[4743]: I0122 14:03:58.311422 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7ee9164-0b2f-41d4-81f4-117acae13511-config\") pod \"f7ee9164-0b2f-41d4-81f4-117acae13511\" (UID: \"f7ee9164-0b2f-41d4-81f4-117acae13511\") " Jan 22 14:03:58 crc kubenswrapper[4743]: I0122 14:03:58.369465 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7ee9164-0b2f-41d4-81f4-117acae13511-kube-api-access-n98km" (OuterVolumeSpecName: "kube-api-access-n98km") pod "f7ee9164-0b2f-41d4-81f4-117acae13511" (UID: "f7ee9164-0b2f-41d4-81f4-117acae13511"). InnerVolumeSpecName "kube-api-access-n98km". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:03:58 crc kubenswrapper[4743]: I0122 14:03:58.397133 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7ee9164-0b2f-41d4-81f4-117acae13511-config" (OuterVolumeSpecName: "config") pod "f7ee9164-0b2f-41d4-81f4-117acae13511" (UID: "f7ee9164-0b2f-41d4-81f4-117acae13511"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:03:58 crc kubenswrapper[4743]: I0122 14:03:58.399959 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7ee9164-0b2f-41d4-81f4-117acae13511-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f7ee9164-0b2f-41d4-81f4-117acae13511" (UID: "f7ee9164-0b2f-41d4-81f4-117acae13511"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:03:58 crc kubenswrapper[4743]: I0122 14:03:58.401230 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7ee9164-0b2f-41d4-81f4-117acae13511-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f7ee9164-0b2f-41d4-81f4-117acae13511" (UID: "f7ee9164-0b2f-41d4-81f4-117acae13511"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:03:58 crc kubenswrapper[4743]: I0122 14:03:58.412991 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n98km\" (UniqueName: \"kubernetes.io/projected/f7ee9164-0b2f-41d4-81f4-117acae13511-kube-api-access-n98km\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:58 crc kubenswrapper[4743]: I0122 14:03:58.413023 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7ee9164-0b2f-41d4-81f4-117acae13511-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:58 crc kubenswrapper[4743]: I0122 14:03:58.413033 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f7ee9164-0b2f-41d4-81f4-117acae13511-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:58 crc kubenswrapper[4743]: I0122 14:03:58.413043 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7ee9164-0b2f-41d4-81f4-117acae13511-config\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:58 crc kubenswrapper[4743]: I0122 14:03:58.429360 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7ee9164-0b2f-41d4-81f4-117acae13511-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f7ee9164-0b2f-41d4-81f4-117acae13511" (UID: "f7ee9164-0b2f-41d4-81f4-117acae13511"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:03:58 crc kubenswrapper[4743]: I0122 14:03:58.513693 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f7ee9164-0b2f-41d4-81f4-117acae13511-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 22 14:03:58 crc kubenswrapper[4743]: I0122 14:03:58.767296 4743 generic.go:334] "Generic (PLEG): container finished" podID="6c814013-0bfc-4734-8b4c-bfb1a5b4f54d" containerID="63ddeda3d533b73b10968d70c6d9fa7330075e705981727d886e45592a5a221c" exitCode=0 Jan 22 14:03:58 crc kubenswrapper[4743]: I0122 14:03:58.767488 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-cc882" event={"ID":"6c814013-0bfc-4734-8b4c-bfb1a5b4f54d","Type":"ContainerDied","Data":"63ddeda3d533b73b10968d70c6d9fa7330075e705981727d886e45592a5a221c"} Jan 22 14:03:58 crc kubenswrapper[4743]: I0122 14:03:58.767728 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-cc882" event={"ID":"6c814013-0bfc-4734-8b4c-bfb1a5b4f54d","Type":"ContainerStarted","Data":"0803172f026f76f1ca5198833b00bc170e2f7c760cf14659ce21e2edb438f143"} Jan 22 14:03:58 crc kubenswrapper[4743]: I0122 14:03:58.776518 4743 generic.go:334] "Generic (PLEG): container finished" podID="f7ee9164-0b2f-41d4-81f4-117acae13511" containerID="ce3a2fa682940db02f2eb2a4e96f910efeef0578f5a86b47d59b56d54a73f377" exitCode=0 Jan 22 14:03:58 crc kubenswrapper[4743]: I0122 14:03:58.776655 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68dcc9cf6f-qhp4l" Jan 22 14:03:58 crc kubenswrapper[4743]: I0122 14:03:58.777288 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68dcc9cf6f-qhp4l" event={"ID":"f7ee9164-0b2f-41d4-81f4-117acae13511","Type":"ContainerDied","Data":"ce3a2fa682940db02f2eb2a4e96f910efeef0578f5a86b47d59b56d54a73f377"} Jan 22 14:03:58 crc kubenswrapper[4743]: I0122 14:03:58.777373 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68dcc9cf6f-qhp4l" event={"ID":"f7ee9164-0b2f-41d4-81f4-117acae13511","Type":"ContainerDied","Data":"83d23f0111a1345c2c4f9a113d7d1dc637451daa8ceabdf901e9977c6bfe87b6"} Jan 22 14:03:58 crc kubenswrapper[4743]: I0122 14:03:58.777422 4743 scope.go:117] "RemoveContainer" containerID="ce3a2fa682940db02f2eb2a4e96f910efeef0578f5a86b47d59b56d54a73f377" Jan 22 14:03:58 crc kubenswrapper[4743]: I0122 14:03:58.822087 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68dcc9cf6f-qhp4l"] Jan 22 14:03:58 crc kubenswrapper[4743]: I0122 14:03:58.828858 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-68dcc9cf6f-qhp4l"] Jan 22 14:03:59 crc kubenswrapper[4743]: I0122 14:03:59.761690 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="962e062e-6f8a-4dcc-95b5-bebc83f2fdc5" path="/var/lib/kubelet/pods/962e062e-6f8a-4dcc-95b5-bebc83f2fdc5/volumes" Jan 22 14:03:59 crc kubenswrapper[4743]: I0122 14:03:59.762355 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7ee9164-0b2f-41d4-81f4-117acae13511" path="/var/lib/kubelet/pods/f7ee9164-0b2f-41d4-81f4-117acae13511/volumes" Jan 22 14:03:59 crc kubenswrapper[4743]: I0122 14:03:59.983305 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7557f5c46c-q4pfj"] Jan 22 14:04:00 crc kubenswrapper[4743]: I0122 14:04:00.013278 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-999bfcdc8-ldzdp"] Jan 22 14:04:00 crc kubenswrapper[4743]: E0122 14:04:00.013735 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7ee9164-0b2f-41d4-81f4-117acae13511" containerName="dnsmasq-dns" Jan 22 14:04:00 crc kubenswrapper[4743]: I0122 14:04:00.013760 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7ee9164-0b2f-41d4-81f4-117acae13511" containerName="dnsmasq-dns" Jan 22 14:04:00 crc kubenswrapper[4743]: E0122 14:04:00.013803 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7ee9164-0b2f-41d4-81f4-117acae13511" containerName="init" Jan 22 14:04:00 crc kubenswrapper[4743]: I0122 14:04:00.013812 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7ee9164-0b2f-41d4-81f4-117acae13511" containerName="init" Jan 22 14:04:00 crc kubenswrapper[4743]: E0122 14:04:00.013843 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="962e062e-6f8a-4dcc-95b5-bebc83f2fdc5" containerName="init" Jan 22 14:04:00 crc kubenswrapper[4743]: I0122 14:04:00.013851 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="962e062e-6f8a-4dcc-95b5-bebc83f2fdc5" containerName="init" Jan 22 14:04:00 crc kubenswrapper[4743]: I0122 14:04:00.014091 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7ee9164-0b2f-41d4-81f4-117acae13511" containerName="dnsmasq-dns" Jan 22 14:04:00 crc kubenswrapper[4743]: I0122 14:04:00.014109 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="962e062e-6f8a-4dcc-95b5-bebc83f2fdc5" containerName="init" Jan 22 
14:04:00 crc kubenswrapper[4743]: I0122 14:04:00.015196 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-999bfcdc8-ldzdp" Jan 22 14:04:00 crc kubenswrapper[4743]: I0122 14:04:00.018046 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Jan 22 14:04:00 crc kubenswrapper[4743]: I0122 14:04:00.049181 4743 patch_prober.go:28] interesting pod/machine-config-daemon-hqgk7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 14:04:00 crc kubenswrapper[4743]: I0122 14:04:00.049519 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 14:04:00 crc kubenswrapper[4743]: I0122 14:04:00.052059 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-999bfcdc8-ldzdp"] Jan 22 14:04:00 crc kubenswrapper[4743]: I0122 14:04:00.097505 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-fc55bf6d5-bhvcx"] Jan 22 14:04:00 crc kubenswrapper[4743]: I0122 14:04:00.107362 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-b7fb54dc6-5q9jf"] Jan 22 14:04:00 crc kubenswrapper[4743]: I0122 14:04:00.112338 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-b7fb54dc6-5q9jf" Jan 22 14:04:00 crc kubenswrapper[4743]: I0122 14:04:00.121445 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-b7fb54dc6-5q9jf"] Jan 22 14:04:00 crc kubenswrapper[4743]: I0122 14:04:00.186366 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dff52751-78f1-4c39-aa95-5d74a246151e-scripts\") pod \"horizon-999bfcdc8-ldzdp\" (UID: \"dff52751-78f1-4c39-aa95-5d74a246151e\") " pod="openstack/horizon-999bfcdc8-ldzdp" Jan 22 14:04:00 crc kubenswrapper[4743]: I0122 14:04:00.187039 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgjfg\" (UniqueName: \"kubernetes.io/projected/dff52751-78f1-4c39-aa95-5d74a246151e-kube-api-access-tgjfg\") pod \"horizon-999bfcdc8-ldzdp\" (UID: \"dff52751-78f1-4c39-aa95-5d74a246151e\") " pod="openstack/horizon-999bfcdc8-ldzdp" Jan 22 14:04:00 crc kubenswrapper[4743]: I0122 14:04:00.187255 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dff52751-78f1-4c39-aa95-5d74a246151e-logs\") pod \"horizon-999bfcdc8-ldzdp\" (UID: \"dff52751-78f1-4c39-aa95-5d74a246151e\") " pod="openstack/horizon-999bfcdc8-ldzdp" Jan 22 14:04:00 crc kubenswrapper[4743]: I0122 14:04:00.187371 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dff52751-78f1-4c39-aa95-5d74a246151e-horizon-secret-key\") pod \"horizon-999bfcdc8-ldzdp\" (UID: \"dff52751-78f1-4c39-aa95-5d74a246151e\") " pod="openstack/horizon-999bfcdc8-ldzdp" Jan 22 14:04:00 crc kubenswrapper[4743]: I0122 14:04:00.187561 4743 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/dff52751-78f1-4c39-aa95-5d74a246151e-horizon-tls-certs\") pod \"horizon-999bfcdc8-ldzdp\" (UID: \"dff52751-78f1-4c39-aa95-5d74a246151e\") " pod="openstack/horizon-999bfcdc8-ldzdp" Jan 22 14:04:00 crc kubenswrapper[4743]: I0122 14:04:00.187819 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dff52751-78f1-4c39-aa95-5d74a246151e-config-data\") pod \"horizon-999bfcdc8-ldzdp\" (UID: \"dff52751-78f1-4c39-aa95-5d74a246151e\") " pod="openstack/horizon-999bfcdc8-ldzdp" Jan 22 14:04:00 crc kubenswrapper[4743]: I0122 14:04:00.188434 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dff52751-78f1-4c39-aa95-5d74a246151e-combined-ca-bundle\") pod \"horizon-999bfcdc8-ldzdp\" (UID: \"dff52751-78f1-4c39-aa95-5d74a246151e\") " pod="openstack/horizon-999bfcdc8-ldzdp" Jan 22 14:04:00 crc kubenswrapper[4743]: I0122 14:04:00.290147 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e452af10-fc11-4854-bf38-8a90856331d3-combined-ca-bundle\") pod \"horizon-b7fb54dc6-5q9jf\" (UID: \"e452af10-fc11-4854-bf38-8a90856331d3\") " pod="openstack/horizon-b7fb54dc6-5q9jf" Jan 22 14:04:00 crc kubenswrapper[4743]: I0122 14:04:00.290217 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e452af10-fc11-4854-bf38-8a90856331d3-horizon-tls-certs\") pod \"horizon-b7fb54dc6-5q9jf\" (UID: \"e452af10-fc11-4854-bf38-8a90856331d3\") " pod="openstack/horizon-b7fb54dc6-5q9jf" Jan 22 14:04:00 crc kubenswrapper[4743]: I0122 14:04:00.291155 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dff52751-78f1-4c39-aa95-5d74a246151e-config-data\") pod \"horizon-999bfcdc8-ldzdp\" (UID: \"dff52751-78f1-4c39-aa95-5d74a246151e\") " pod="openstack/horizon-999bfcdc8-ldzdp" Jan 22 14:04:00 crc kubenswrapper[4743]: I0122 14:04:00.291590 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dff52751-78f1-4c39-aa95-5d74a246151e-combined-ca-bundle\") pod \"horizon-999bfcdc8-ldzdp\" (UID: \"dff52751-78f1-4c39-aa95-5d74a246151e\") " pod="openstack/horizon-999bfcdc8-ldzdp" Jan 22 14:04:00 crc kubenswrapper[4743]: I0122 14:04:00.291628 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56r6d\" (UniqueName: \"kubernetes.io/projected/e452af10-fc11-4854-bf38-8a90856331d3-kube-api-access-56r6d\") pod \"horizon-b7fb54dc6-5q9jf\" (UID: \"e452af10-fc11-4854-bf38-8a90856331d3\") " pod="openstack/horizon-b7fb54dc6-5q9jf" Jan 22 14:04:00 crc kubenswrapper[4743]: I0122 14:04:00.291667 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dff52751-78f1-4c39-aa95-5d74a246151e-scripts\") pod \"horizon-999bfcdc8-ldzdp\" (UID: \"dff52751-78f1-4c39-aa95-5d74a246151e\") " pod="openstack/horizon-999bfcdc8-ldzdp" Jan 22 14:04:00 crc kubenswrapper[4743]: I0122 14:04:00.291692 4743 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e452af10-fc11-4854-bf38-8a90856331d3-scripts\") pod \"horizon-b7fb54dc6-5q9jf\" (UID: \"e452af10-fc11-4854-bf38-8a90856331d3\") " pod="openstack/horizon-b7fb54dc6-5q9jf" Jan 22 14:04:00 crc kubenswrapper[4743]: I0122 14:04:00.291724 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgjfg\" (UniqueName: \"kubernetes.io/projected/dff52751-78f1-4c39-aa95-5d74a246151e-kube-api-access-tgjfg\") pod \"horizon-999bfcdc8-ldzdp\" (UID: \"dff52751-78f1-4c39-aa95-5d74a246151e\") " pod="openstack/horizon-999bfcdc8-ldzdp" Jan 22 14:04:00 crc kubenswrapper[4743]: I0122 14:04:00.291761 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e452af10-fc11-4854-bf38-8a90856331d3-horizon-secret-key\") pod \"horizon-b7fb54dc6-5q9jf\" (UID: \"e452af10-fc11-4854-bf38-8a90856331d3\") " pod="openstack/horizon-b7fb54dc6-5q9jf" Jan 22 14:04:00 crc kubenswrapper[4743]: I0122 14:04:00.291806 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dff52751-78f1-4c39-aa95-5d74a246151e-logs\") pod \"horizon-999bfcdc8-ldzdp\" (UID: \"dff52751-78f1-4c39-aa95-5d74a246151e\") " pod="openstack/horizon-999bfcdc8-ldzdp" Jan 22 14:04:00 crc kubenswrapper[4743]: I0122 14:04:00.291830 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dff52751-78f1-4c39-aa95-5d74a246151e-horizon-secret-key\") pod \"horizon-999bfcdc8-ldzdp\" (UID: \"dff52751-78f1-4c39-aa95-5d74a246151e\") " pod="openstack/horizon-999bfcdc8-ldzdp" Jan 22 14:04:00 crc kubenswrapper[4743]: I0122 14:04:00.291886 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/dff52751-78f1-4c39-aa95-5d74a246151e-horizon-tls-certs\") pod \"horizon-999bfcdc8-ldzdp\" (UID: \"dff52751-78f1-4c39-aa95-5d74a246151e\") " pod="openstack/horizon-999bfcdc8-ldzdp" Jan 22 14:04:00 crc kubenswrapper[4743]: I0122 14:04:00.291926 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e452af10-fc11-4854-bf38-8a90856331d3-logs\") pod \"horizon-b7fb54dc6-5q9jf\" (UID: \"e452af10-fc11-4854-bf38-8a90856331d3\") " pod="openstack/horizon-b7fb54dc6-5q9jf" Jan 22 14:04:00 crc kubenswrapper[4743]: I0122 14:04:00.291960 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e452af10-fc11-4854-bf38-8a90856331d3-config-data\") pod \"horizon-b7fb54dc6-5q9jf\" (UID: \"e452af10-fc11-4854-bf38-8a90856331d3\") " pod="openstack/horizon-b7fb54dc6-5q9jf" Jan 22 14:04:00 crc kubenswrapper[4743]: I0122 14:04:00.292397 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dff52751-78f1-4c39-aa95-5d74a246151e-logs\") pod \"horizon-999bfcdc8-ldzdp\" (UID: \"dff52751-78f1-4c39-aa95-5d74a246151e\") " pod="openstack/horizon-999bfcdc8-ldzdp" Jan 22 14:04:00 crc kubenswrapper[4743]: I0122 14:04:00.293117 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/dff52751-78f1-4c39-aa95-5d74a246151e-scripts\") pod \"horizon-999bfcdc8-ldzdp\" (UID: \"dff52751-78f1-4c39-aa95-5d74a246151e\") " pod="openstack/horizon-999bfcdc8-ldzdp" Jan 22 14:04:00 crc kubenswrapper[4743]: I0122 14:04:00.295657 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dff52751-78f1-4c39-aa95-5d74a246151e-config-data\") pod \"horizon-999bfcdc8-ldzdp\" (UID: \"dff52751-78f1-4c39-aa95-5d74a246151e\") " pod="openstack/horizon-999bfcdc8-ldzdp" Jan 22 14:04:00 crc kubenswrapper[4743]: I0122 14:04:00.298940 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dff52751-78f1-4c39-aa95-5d74a246151e-horizon-secret-key\") pod \"horizon-999bfcdc8-ldzdp\" (UID: \"dff52751-78f1-4c39-aa95-5d74a246151e\") " pod="openstack/horizon-999bfcdc8-ldzdp" Jan 22 14:04:00 crc kubenswrapper[4743]: I0122 14:04:00.299199 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dff52751-78f1-4c39-aa95-5d74a246151e-combined-ca-bundle\") pod \"horizon-999bfcdc8-ldzdp\" (UID: \"dff52751-78f1-4c39-aa95-5d74a246151e\") " pod="openstack/horizon-999bfcdc8-ldzdp" Jan 22 14:04:00 crc kubenswrapper[4743]: I0122 14:04:00.300698 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/dff52751-78f1-4c39-aa95-5d74a246151e-horizon-tls-certs\") pod \"horizon-999bfcdc8-ldzdp\" (UID: \"dff52751-78f1-4c39-aa95-5d74a246151e\") " pod="openstack/horizon-999bfcdc8-ldzdp" Jan 22 14:04:00 crc kubenswrapper[4743]: I0122 14:04:00.309730 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgjfg\" (UniqueName: \"kubernetes.io/projected/dff52751-78f1-4c39-aa95-5d74a246151e-kube-api-access-tgjfg\") pod \"horizon-999bfcdc8-ldzdp\" (UID: \"dff52751-78f1-4c39-aa95-5d74a246151e\") " pod="openstack/horizon-999bfcdc8-ldzdp" Jan 22 14:04:00 crc kubenswrapper[4743]: I0122 14:04:00.341887 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-999bfcdc8-ldzdp" Jan 22 14:04:00 crc kubenswrapper[4743]: I0122 14:04:00.393282 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e452af10-fc11-4854-bf38-8a90856331d3-logs\") pod \"horizon-b7fb54dc6-5q9jf\" (UID: \"e452af10-fc11-4854-bf38-8a90856331d3\") " pod="openstack/horizon-b7fb54dc6-5q9jf" Jan 22 14:04:00 crc kubenswrapper[4743]: I0122 14:04:00.393343 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e452af10-fc11-4854-bf38-8a90856331d3-config-data\") pod \"horizon-b7fb54dc6-5q9jf\" (UID: \"e452af10-fc11-4854-bf38-8a90856331d3\") " pod="openstack/horizon-b7fb54dc6-5q9jf" Jan 22 14:04:00 crc kubenswrapper[4743]: I0122 14:04:00.393369 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e452af10-fc11-4854-bf38-8a90856331d3-combined-ca-bundle\") pod \"horizon-b7fb54dc6-5q9jf\" (UID: \"e452af10-fc11-4854-bf38-8a90856331d3\") " pod="openstack/horizon-b7fb54dc6-5q9jf" Jan 22 14:04:00 crc kubenswrapper[4743]: I0122 14:04:00.393388 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e452af10-fc11-4854-bf38-8a90856331d3-horizon-tls-certs\") pod \"horizon-b7fb54dc6-5q9jf\" (UID: \"e452af10-fc11-4854-bf38-8a90856331d3\") " pod="openstack/horizon-b7fb54dc6-5q9jf" Jan 22 14:04:00 crc kubenswrapper[4743]: I0122 14:04:00.393452 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56r6d\" (UniqueName: \"kubernetes.io/projected/e452af10-fc11-4854-bf38-8a90856331d3-kube-api-access-56r6d\") pod \"horizon-b7fb54dc6-5q9jf\" (UID: \"e452af10-fc11-4854-bf38-8a90856331d3\") " pod="openstack/horizon-b7fb54dc6-5q9jf" Jan 22 14:04:00 crc kubenswrapper[4743]: I0122 14:04:00.393475 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e452af10-fc11-4854-bf38-8a90856331d3-scripts\") pod \"horizon-b7fb54dc6-5q9jf\" (UID: \"e452af10-fc11-4854-bf38-8a90856331d3\") " pod="openstack/horizon-b7fb54dc6-5q9jf" Jan 22 14:04:00 crc kubenswrapper[4743]: I0122 14:04:00.393506 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e452af10-fc11-4854-bf38-8a90856331d3-horizon-secret-key\") pod \"horizon-b7fb54dc6-5q9jf\" (UID: \"e452af10-fc11-4854-bf38-8a90856331d3\") " pod="openstack/horizon-b7fb54dc6-5q9jf" Jan 22 14:04:00 crc kubenswrapper[4743]: I0122 14:04:00.393953 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e452af10-fc11-4854-bf38-8a90856331d3-logs\") pod \"horizon-b7fb54dc6-5q9jf\" (UID: \"e452af10-fc11-4854-bf38-8a90856331d3\") " pod="openstack/horizon-b7fb54dc6-5q9jf" Jan 22 14:04:00 crc kubenswrapper[4743]: I0122 14:04:00.395144 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e452af10-fc11-4854-bf38-8a90856331d3-scripts\") pod \"horizon-b7fb54dc6-5q9jf\" (UID: \"e452af10-fc11-4854-bf38-8a90856331d3\") " pod="openstack/horizon-b7fb54dc6-5q9jf" Jan 22 14:04:00 crc kubenswrapper[4743]: I0122 14:04:00.395438 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/e452af10-fc11-4854-bf38-8a90856331d3-config-data\") pod \"horizon-b7fb54dc6-5q9jf\" (UID: \"e452af10-fc11-4854-bf38-8a90856331d3\") " pod="openstack/horizon-b7fb54dc6-5q9jf" Jan 22 14:04:00 crc kubenswrapper[4743]: I0122 14:04:00.397313 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e452af10-fc11-4854-bf38-8a90856331d3-horizon-tls-certs\") pod \"horizon-b7fb54dc6-5q9jf\" (UID: \"e452af10-fc11-4854-bf38-8a90856331d3\") " pod="openstack/horizon-b7fb54dc6-5q9jf" Jan 22 14:04:00 crc kubenswrapper[4743]: I0122 14:04:00.398299 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e452af10-fc11-4854-bf38-8a90856331d3-combined-ca-bundle\") pod \"horizon-b7fb54dc6-5q9jf\" (UID: \"e452af10-fc11-4854-bf38-8a90856331d3\") " pod="openstack/horizon-b7fb54dc6-5q9jf" Jan 22 14:04:00 crc kubenswrapper[4743]: I0122 14:04:00.398933 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e452af10-fc11-4854-bf38-8a90856331d3-horizon-secret-key\") pod \"horizon-b7fb54dc6-5q9jf\" (UID: \"e452af10-fc11-4854-bf38-8a90856331d3\") " pod="openstack/horizon-b7fb54dc6-5q9jf" Jan 22 14:04:00 crc kubenswrapper[4743]: I0122 14:04:00.411685 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56r6d\" (UniqueName: \"kubernetes.io/projected/e452af10-fc11-4854-bf38-8a90856331d3-kube-api-access-56r6d\") pod \"horizon-b7fb54dc6-5q9jf\" (UID: \"e452af10-fc11-4854-bf38-8a90856331d3\") " pod="openstack/horizon-b7fb54dc6-5q9jf" Jan 22 14:04:00 crc kubenswrapper[4743]: I0122 14:04:00.444902 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-b7fb54dc6-5q9jf" Jan 22 14:04:01 crc kubenswrapper[4743]: I0122 14:04:01.811105 4743 generic.go:334] "Generic (PLEG): container finished" podID="b18fb1ec-4b3a-4369-824c-56d5512b2cb4" containerID="46eea21368dd54b81c103d7cd5b77b39db2dd3d45d5f81b1cb894a0b3d6ab5ba" exitCode=0 Jan 22 14:04:01 crc kubenswrapper[4743]: I0122 14:04:01.811179 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-gv54v" event={"ID":"b18fb1ec-4b3a-4369-824c-56d5512b2cb4","Type":"ContainerDied","Data":"46eea21368dd54b81c103d7cd5b77b39db2dd3d45d5f81b1cb894a0b3d6ab5ba"} Jan 22 14:04:02 crc kubenswrapper[4743]: I0122 14:04:02.821126 4743 generic.go:334] "Generic (PLEG): container finished" podID="d9518ef3-f251-4bf9-b45d-0f93876b2e7c" containerID="2c1755ff78d5f28f816a75363ac12b8522205710fd499d6b66a4f7c2a53a0f2c" exitCode=0 Jan 22 14:04:02 crc kubenswrapper[4743]: I0122 14:04:02.821581 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-nm9l6" event={"ID":"d9518ef3-f251-4bf9-b45d-0f93876b2e7c","Type":"ContainerDied","Data":"2c1755ff78d5f28f816a75363ac12b8522205710fd499d6b66a4f7c2a53a0f2c"} Jan 22 14:04:08 crc kubenswrapper[4743]: E0122 14:04:08.098075 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Jan 22 14:04:08 crc kubenswrapper[4743]: E0122 14:04:08.098671 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ncbh56ch59ch667h57dh545h679h5fdh65hc7h5b7h5cdh5f4hfdh678h59hcch68ch5c7h557hcch5c6h57dh5cfh5b7h677h577h9fhfdhfch66bh545q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vm42k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(316dc631-a7ed-49db-9dad-305d246bf91a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 22 14:04:13 crc kubenswrapper[4743]: E0122 14:04:13.387506 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Jan 22 14:04:13 crc kubenswrapper[4743]: E0122 14:04:13.388137 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n654h568h577hc7h5fbh7fh5d5h87h56bh688h68bh65fhb7h5f4h658h5b4h59fh655h655h68ch5b4h5f4h5c6h5cfh8fh6ch566hffhb7hdfh675h568q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5bln4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-fc55bf6d5-bhvcx_openstack(9bf9dca7-5e43-4ae7-abf8-36b0392b0700): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 22 14:04:13 crc kubenswrapper[4743]: E0122 14:04:13.393048 4743 
pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-fc55bf6d5-bhvcx" podUID="9bf9dca7-5e43-4ae7-abf8-36b0392b0700" Jan 22 14:04:13 crc kubenswrapper[4743]: E0122 14:04:13.410757 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Jan 22 14:04:13 crc kubenswrapper[4743]: E0122 14:04:13.410970 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n548h86h566h7fh5c5h5d8h596h5f6h58bh5dfhd4h75h544h7h668h74hdfh6fhfdh5c6h554hffh69h74h684h5bh564h5dbh4h56h647h677q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jnjlb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-7557f5c46c-q4pfj_openstack(1a50ac3e-2e28-4edf-85a0-0a18c00d6dd5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 22 14:04:13 crc kubenswrapper[4743]: E0122 14:04:13.414882 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-7557f5c46c-q4pfj" podUID="1a50ac3e-2e28-4edf-85a0-0a18c00d6dd5" Jan 22 14:04:13 crc kubenswrapper[4743]: E0122 14:04:13.446086 4743 log.go:32] "PullImage 
from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Jan 22 14:04:13 crc kubenswrapper[4743]: E0122 14:04:13.446363 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nf9h674h55bh647h677h65fhbdh56fh96h64bh57fh4h9fhb9h5cdh645hf6h58dh567h56h89h6hf5h567h76h66dh5c7h65chfh68fhb5h5f8q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lh27f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-8d75d5cbf-4kt4n_openstack(c713d1a6-b56c-4179-8052-619946111c93): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 22 14:04:13 crc kubenswrapper[4743]: E0122 14:04:13.449149 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-8d75d5cbf-4kt4n" podUID="c713d1a6-b56c-4179-8052-619946111c93" Jan 22 14:04:13 crc kubenswrapper[4743]: I0122 14:04:13.850038 4743 scope.go:117] "RemoveContainer" containerID="06a9e9f267a1d3d4292137aced31d575f5bef9a8ea3f5f50e9dab98fc51db541" Jan 22 14:04:13 crc kubenswrapper[4743]: E0122 14:04:13.861951 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Jan 22 14:04:13 crc kubenswrapper[4743]: E0122 14:04:13.862084 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ssfqb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-tcdjz_openstack(846c118f-23c1-402f-8747-633485e743c9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 22 14:04:13 crc kubenswrapper[4743]: E0122 14:04:13.863311 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-tcdjz" podUID="846c118f-23c1-402f-8747-633485e743c9" Jan 22 14:04:13 crc kubenswrapper[4743]: I0122 14:04:13.920370 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-nm9l6" event={"ID":"d9518ef3-f251-4bf9-b45d-0f93876b2e7c","Type":"ContainerDied","Data":"840e475c0c9d2fcd80ba9b00b2865b4f6066f217271e3249f6cac6616af9ff98"} Jan 22 14:04:13 crc kubenswrapper[4743]: I0122 14:04:13.920419 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="840e475c0c9d2fcd80ba9b00b2865b4f6066f217271e3249f6cac6616af9ff98" Jan 22 14:04:13 crc kubenswrapper[4743]: I0122 14:04:13.922167 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-gv54v" event={"ID":"b18fb1ec-4b3a-4369-824c-56d5512b2cb4","Type":"ContainerDied","Data":"08ee05a4d189744907fd813d6de43e0ff9189cc7828ffb8125112d40260e805c"} Jan 22 14:04:13 crc kubenswrapper[4743]: I0122 14:04:13.922195 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="08ee05a4d189744907fd813d6de43e0ff9189cc7828ffb8125112d40260e805c" Jan 22 14:04:13 crc kubenswrapper[4743]: E0122 14:04:13.966108 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" 
pod="openstack/barbican-db-sync-tcdjz" podUID="846c118f-23c1-402f-8747-633485e743c9" Jan 22 14:04:13 crc kubenswrapper[4743]: I0122 14:04:13.974088 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-gv54v" Jan 22 14:04:13 crc kubenswrapper[4743]: I0122 14:04:13.984591 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-nm9l6" Jan 22 14:04:14 crc kubenswrapper[4743]: I0122 14:04:14.063185 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b18fb1ec-4b3a-4369-824c-56d5512b2cb4-combined-ca-bundle\") pod \"b18fb1ec-4b3a-4369-824c-56d5512b2cb4\" (UID: \"b18fb1ec-4b3a-4369-824c-56d5512b2cb4\") " Jan 22 14:04:14 crc kubenswrapper[4743]: I0122 14:04:14.063530 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b18fb1ec-4b3a-4369-824c-56d5512b2cb4-scripts\") pod \"b18fb1ec-4b3a-4369-824c-56d5512b2cb4\" (UID: \"b18fb1ec-4b3a-4369-824c-56d5512b2cb4\") " Jan 22 14:04:14 crc kubenswrapper[4743]: I0122 14:04:14.063561 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b18fb1ec-4b3a-4369-824c-56d5512b2cb4-config-data\") pod \"b18fb1ec-4b3a-4369-824c-56d5512b2cb4\" (UID: \"b18fb1ec-4b3a-4369-824c-56d5512b2cb4\") " Jan 22 14:04:14 crc kubenswrapper[4743]: I0122 14:04:14.063627 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b18fb1ec-4b3a-4369-824c-56d5512b2cb4-credential-keys\") pod \"b18fb1ec-4b3a-4369-824c-56d5512b2cb4\" (UID: \"b18fb1ec-4b3a-4369-824c-56d5512b2cb4\") " Jan 22 14:04:14 crc kubenswrapper[4743]: I0122 14:04:14.063677 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6p472\" (UniqueName: \"kubernetes.io/projected/d9518ef3-f251-4bf9-b45d-0f93876b2e7c-kube-api-access-6p472\") pod \"d9518ef3-f251-4bf9-b45d-0f93876b2e7c\" (UID: \"d9518ef3-f251-4bf9-b45d-0f93876b2e7c\") " Jan 22 14:04:14 crc kubenswrapper[4743]: I0122 14:04:14.063712 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b18fb1ec-4b3a-4369-824c-56d5512b2cb4-fernet-keys\") pod \"b18fb1ec-4b3a-4369-824c-56d5512b2cb4\" (UID: \"b18fb1ec-4b3a-4369-824c-56d5512b2cb4\") " Jan 22 14:04:14 crc kubenswrapper[4743]: I0122 14:04:14.063777 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d9518ef3-f251-4bf9-b45d-0f93876b2e7c-db-sync-config-data\") pod \"d9518ef3-f251-4bf9-b45d-0f93876b2e7c\" (UID: \"d9518ef3-f251-4bf9-b45d-0f93876b2e7c\") " Jan 22 14:04:14 crc kubenswrapper[4743]: I0122 14:04:14.064413 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9518ef3-f251-4bf9-b45d-0f93876b2e7c-combined-ca-bundle\") pod \"d9518ef3-f251-4bf9-b45d-0f93876b2e7c\" (UID: \"d9518ef3-f251-4bf9-b45d-0f93876b2e7c\") " Jan 22 14:04:14 crc kubenswrapper[4743]: I0122 14:04:14.064436 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9518ef3-f251-4bf9-b45d-0f93876b2e7c-config-data\") pod 
\"d9518ef3-f251-4bf9-b45d-0f93876b2e7c\" (UID: \"d9518ef3-f251-4bf9-b45d-0f93876b2e7c\") " Jan 22 14:04:14 crc kubenswrapper[4743]: I0122 14:04:14.064492 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxns4\" (UniqueName: \"kubernetes.io/projected/b18fb1ec-4b3a-4369-824c-56d5512b2cb4-kube-api-access-kxns4\") pod \"b18fb1ec-4b3a-4369-824c-56d5512b2cb4\" (UID: \"b18fb1ec-4b3a-4369-824c-56d5512b2cb4\") " Jan 22 14:04:14 crc kubenswrapper[4743]: I0122 14:04:14.078448 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b18fb1ec-4b3a-4369-824c-56d5512b2cb4-scripts" (OuterVolumeSpecName: "scripts") pod "b18fb1ec-4b3a-4369-824c-56d5512b2cb4" (UID: "b18fb1ec-4b3a-4369-824c-56d5512b2cb4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:14 crc kubenswrapper[4743]: I0122 14:04:14.086022 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b18fb1ec-4b3a-4369-824c-56d5512b2cb4-kube-api-access-kxns4" (OuterVolumeSpecName: "kube-api-access-kxns4") pod "b18fb1ec-4b3a-4369-824c-56d5512b2cb4" (UID: "b18fb1ec-4b3a-4369-824c-56d5512b2cb4"). InnerVolumeSpecName "kube-api-access-kxns4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:04:14 crc kubenswrapper[4743]: I0122 14:04:14.086123 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b18fb1ec-4b3a-4369-824c-56d5512b2cb4-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "b18fb1ec-4b3a-4369-824c-56d5512b2cb4" (UID: "b18fb1ec-4b3a-4369-824c-56d5512b2cb4"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:14 crc kubenswrapper[4743]: I0122 14:04:14.086385 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9518ef3-f251-4bf9-b45d-0f93876b2e7c-kube-api-access-6p472" (OuterVolumeSpecName: "kube-api-access-6p472") pod "d9518ef3-f251-4bf9-b45d-0f93876b2e7c" (UID: "d9518ef3-f251-4bf9-b45d-0f93876b2e7c"). InnerVolumeSpecName "kube-api-access-6p472". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:04:14 crc kubenswrapper[4743]: I0122 14:04:14.088667 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b18fb1ec-4b3a-4369-824c-56d5512b2cb4-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "b18fb1ec-4b3a-4369-824c-56d5512b2cb4" (UID: "b18fb1ec-4b3a-4369-824c-56d5512b2cb4"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:14 crc kubenswrapper[4743]: I0122 14:04:14.089516 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9518ef3-f251-4bf9-b45d-0f93876b2e7c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "d9518ef3-f251-4bf9-b45d-0f93876b2e7c" (UID: "d9518ef3-f251-4bf9-b45d-0f93876b2e7c"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:14 crc kubenswrapper[4743]: I0122 14:04:14.098167 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b18fb1ec-4b3a-4369-824c-56d5512b2cb4-config-data" (OuterVolumeSpecName: "config-data") pod "b18fb1ec-4b3a-4369-824c-56d5512b2cb4" (UID: "b18fb1ec-4b3a-4369-824c-56d5512b2cb4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:14 crc kubenswrapper[4743]: I0122 14:04:14.107294 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b18fb1ec-4b3a-4369-824c-56d5512b2cb4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b18fb1ec-4b3a-4369-824c-56d5512b2cb4" (UID: "b18fb1ec-4b3a-4369-824c-56d5512b2cb4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:14 crc kubenswrapper[4743]: I0122 14:04:14.108332 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9518ef3-f251-4bf9-b45d-0f93876b2e7c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d9518ef3-f251-4bf9-b45d-0f93876b2e7c" (UID: "d9518ef3-f251-4bf9-b45d-0f93876b2e7c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:14 crc kubenswrapper[4743]: I0122 14:04:14.122622 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9518ef3-f251-4bf9-b45d-0f93876b2e7c-config-data" (OuterVolumeSpecName: "config-data") pod "d9518ef3-f251-4bf9-b45d-0f93876b2e7c" (UID: "d9518ef3-f251-4bf9-b45d-0f93876b2e7c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:14 crc kubenswrapper[4743]: I0122 14:04:14.166782 4743 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d9518ef3-f251-4bf9-b45d-0f93876b2e7c-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:14 crc kubenswrapper[4743]: I0122 14:04:14.166826 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9518ef3-f251-4bf9-b45d-0f93876b2e7c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:14 crc kubenswrapper[4743]: I0122 14:04:14.166837 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9518ef3-f251-4bf9-b45d-0f93876b2e7c-config-data\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:14 crc kubenswrapper[4743]: I0122 14:04:14.166847 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxns4\" (UniqueName: \"kubernetes.io/projected/b18fb1ec-4b3a-4369-824c-56d5512b2cb4-kube-api-access-kxns4\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:14 crc kubenswrapper[4743]: I0122 14:04:14.166859 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b18fb1ec-4b3a-4369-824c-56d5512b2cb4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:14 crc kubenswrapper[4743]: I0122 14:04:14.166867 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b18fb1ec-4b3a-4369-824c-56d5512b2cb4-scripts\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:14 crc kubenswrapper[4743]: I0122 14:04:14.166876 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b18fb1ec-4b3a-4369-824c-56d5512b2cb4-config-data\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:14 crc kubenswrapper[4743]: I0122 14:04:14.166884 4743 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b18fb1ec-4b3a-4369-824c-56d5512b2cb4-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:14 crc 
kubenswrapper[4743]: I0122 14:04:14.166892 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6p472\" (UniqueName: \"kubernetes.io/projected/d9518ef3-f251-4bf9-b45d-0f93876b2e7c-kube-api-access-6p472\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:14 crc kubenswrapper[4743]: I0122 14:04:14.166900 4743 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b18fb1ec-4b3a-4369-824c-56d5512b2cb4-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:14 crc kubenswrapper[4743]: I0122 14:04:14.929688 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-nm9l6" Jan 22 14:04:14 crc kubenswrapper[4743]: I0122 14:04:14.929714 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-gv54v" Jan 22 14:04:15 crc kubenswrapper[4743]: I0122 14:04:15.098024 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-gv54v"] Jan 22 14:04:15 crc kubenswrapper[4743]: I0122 14:04:15.104892 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-gv54v"] Jan 22 14:04:15 crc kubenswrapper[4743]: I0122 14:04:15.210906 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-vfzrn"] Jan 22 14:04:15 crc kubenswrapper[4743]: E0122 14:04:15.211299 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b18fb1ec-4b3a-4369-824c-56d5512b2cb4" containerName="keystone-bootstrap" Jan 22 14:04:15 crc kubenswrapper[4743]: I0122 14:04:15.211319 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="b18fb1ec-4b3a-4369-824c-56d5512b2cb4" containerName="keystone-bootstrap" Jan 22 14:04:15 crc kubenswrapper[4743]: E0122 14:04:15.211346 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9518ef3-f251-4bf9-b45d-0f93876b2e7c" containerName="glance-db-sync" Jan 22 14:04:15 crc kubenswrapper[4743]: I0122 14:04:15.211353 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9518ef3-f251-4bf9-b45d-0f93876b2e7c" containerName="glance-db-sync" Jan 22 14:04:15 crc kubenswrapper[4743]: I0122 14:04:15.211518 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9518ef3-f251-4bf9-b45d-0f93876b2e7c" containerName="glance-db-sync" Jan 22 14:04:15 crc kubenswrapper[4743]: I0122 14:04:15.211534 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="b18fb1ec-4b3a-4369-824c-56d5512b2cb4" containerName="keystone-bootstrap" Jan 22 14:04:15 crc kubenswrapper[4743]: I0122 14:04:15.212129 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-vfzrn" Jan 22 14:04:15 crc kubenswrapper[4743]: I0122 14:04:15.215097 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 22 14:04:15 crc kubenswrapper[4743]: I0122 14:04:15.215135 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 22 14:04:15 crc kubenswrapper[4743]: I0122 14:04:15.216407 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 22 14:04:15 crc kubenswrapper[4743]: I0122 14:04:15.216602 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Jan 22 14:04:15 crc kubenswrapper[4743]: I0122 14:04:15.216655 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-48lf7" Jan 22 14:04:15 crc kubenswrapper[4743]: I0122 14:04:15.220312 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-vfzrn"] Jan 22 14:04:15 crc kubenswrapper[4743]: I0122 14:04:15.397856 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbe93f39-887c-4949-9e78-1047998f8aff-scripts\") pod \"keystone-bootstrap-vfzrn\" (UID: \"cbe93f39-887c-4949-9e78-1047998f8aff\") " pod="openstack/keystone-bootstrap-vfzrn" Jan 22 14:04:15 crc kubenswrapper[4743]: I0122 14:04:15.398115 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cbe93f39-887c-4949-9e78-1047998f8aff-fernet-keys\") pod \"keystone-bootstrap-vfzrn\" (UID: \"cbe93f39-887c-4949-9e78-1047998f8aff\") " pod="openstack/keystone-bootstrap-vfzrn" Jan 22 14:04:15 crc kubenswrapper[4743]: I0122 14:04:15.398147 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cbe93f39-887c-4949-9e78-1047998f8aff-credential-keys\") pod \"keystone-bootstrap-vfzrn\" (UID: \"cbe93f39-887c-4949-9e78-1047998f8aff\") " pod="openstack/keystone-bootstrap-vfzrn" Jan 22 14:04:15 crc kubenswrapper[4743]: I0122 14:04:15.398235 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbe93f39-887c-4949-9e78-1047998f8aff-config-data\") pod \"keystone-bootstrap-vfzrn\" (UID: \"cbe93f39-887c-4949-9e78-1047998f8aff\") " pod="openstack/keystone-bootstrap-vfzrn" Jan 22 14:04:15 crc kubenswrapper[4743]: I0122 14:04:15.398475 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsll6\" (UniqueName: \"kubernetes.io/projected/cbe93f39-887c-4949-9e78-1047998f8aff-kube-api-access-xsll6\") pod \"keystone-bootstrap-vfzrn\" (UID: \"cbe93f39-887c-4949-9e78-1047998f8aff\") " pod="openstack/keystone-bootstrap-vfzrn" Jan 22 14:04:15 crc kubenswrapper[4743]: I0122 14:04:15.398654 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbe93f39-887c-4949-9e78-1047998f8aff-combined-ca-bundle\") pod \"keystone-bootstrap-vfzrn\" (UID: \"cbe93f39-887c-4949-9e78-1047998f8aff\") " pod="openstack/keystone-bootstrap-vfzrn" Jan 22 14:04:15 crc kubenswrapper[4743]: I0122 14:04:15.411390 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-58dd9ff6bc-cc882"] Jan 22 14:04:15 crc kubenswrapper[4743]: I0122 14:04:15.466524 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-mdc9s"] Jan 22 14:04:15 crc kubenswrapper[4743]: I0122 14:04:15.468987 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-mdc9s" Jan 22 14:04:15 crc kubenswrapper[4743]: I0122 14:04:15.496082 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-mdc9s"] Jan 22 14:04:15 crc kubenswrapper[4743]: I0122 14:04:15.501321 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbe93f39-887c-4949-9e78-1047998f8aff-combined-ca-bundle\") pod \"keystone-bootstrap-vfzrn\" (UID: \"cbe93f39-887c-4949-9e78-1047998f8aff\") " pod="openstack/keystone-bootstrap-vfzrn" Jan 22 14:04:15 crc kubenswrapper[4743]: I0122 14:04:15.501385 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbe93f39-887c-4949-9e78-1047998f8aff-scripts\") pod \"keystone-bootstrap-vfzrn\" (UID: \"cbe93f39-887c-4949-9e78-1047998f8aff\") " pod="openstack/keystone-bootstrap-vfzrn" Jan 22 14:04:15 crc kubenswrapper[4743]: I0122 14:04:15.501411 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cbe93f39-887c-4949-9e78-1047998f8aff-fernet-keys\") pod \"keystone-bootstrap-vfzrn\" (UID: \"cbe93f39-887c-4949-9e78-1047998f8aff\") " pod="openstack/keystone-bootstrap-vfzrn" Jan 22 14:04:15 crc kubenswrapper[4743]: I0122 14:04:15.501443 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cbe93f39-887c-4949-9e78-1047998f8aff-credential-keys\") pod \"keystone-bootstrap-vfzrn\" (UID: \"cbe93f39-887c-4949-9e78-1047998f8aff\") " pod="openstack/keystone-bootstrap-vfzrn" Jan 22 14:04:15 crc kubenswrapper[4743]: I0122 14:04:15.501528 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbe93f39-887c-4949-9e78-1047998f8aff-config-data\") pod \"keystone-bootstrap-vfzrn\" (UID: \"cbe93f39-887c-4949-9e78-1047998f8aff\") " pod="openstack/keystone-bootstrap-vfzrn" Jan 22 14:04:15 crc kubenswrapper[4743]: I0122 14:04:15.501586 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsll6\" (UniqueName: \"kubernetes.io/projected/cbe93f39-887c-4949-9e78-1047998f8aff-kube-api-access-xsll6\") pod \"keystone-bootstrap-vfzrn\" (UID: \"cbe93f39-887c-4949-9e78-1047998f8aff\") " pod="openstack/keystone-bootstrap-vfzrn" Jan 22 14:04:15 crc kubenswrapper[4743]: I0122 14:04:15.519746 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbe93f39-887c-4949-9e78-1047998f8aff-combined-ca-bundle\") pod \"keystone-bootstrap-vfzrn\" (UID: \"cbe93f39-887c-4949-9e78-1047998f8aff\") " pod="openstack/keystone-bootstrap-vfzrn" Jan 22 14:04:15 crc kubenswrapper[4743]: I0122 14:04:15.523185 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbe93f39-887c-4949-9e78-1047998f8aff-scripts\") pod \"keystone-bootstrap-vfzrn\" (UID: \"cbe93f39-887c-4949-9e78-1047998f8aff\") " 
pod="openstack/keystone-bootstrap-vfzrn" Jan 22 14:04:15 crc kubenswrapper[4743]: I0122 14:04:15.529487 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbe93f39-887c-4949-9e78-1047998f8aff-config-data\") pod \"keystone-bootstrap-vfzrn\" (UID: \"cbe93f39-887c-4949-9e78-1047998f8aff\") " pod="openstack/keystone-bootstrap-vfzrn" Jan 22 14:04:15 crc kubenswrapper[4743]: I0122 14:04:15.531333 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cbe93f39-887c-4949-9e78-1047998f8aff-credential-keys\") pod \"keystone-bootstrap-vfzrn\" (UID: \"cbe93f39-887c-4949-9e78-1047998f8aff\") " pod="openstack/keystone-bootstrap-vfzrn" Jan 22 14:04:15 crc kubenswrapper[4743]: I0122 14:04:15.532383 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cbe93f39-887c-4949-9e78-1047998f8aff-fernet-keys\") pod \"keystone-bootstrap-vfzrn\" (UID: \"cbe93f39-887c-4949-9e78-1047998f8aff\") " pod="openstack/keystone-bootstrap-vfzrn" Jan 22 14:04:15 crc kubenswrapper[4743]: I0122 14:04:15.535363 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsll6\" (UniqueName: \"kubernetes.io/projected/cbe93f39-887c-4949-9e78-1047998f8aff-kube-api-access-xsll6\") pod \"keystone-bootstrap-vfzrn\" (UID: \"cbe93f39-887c-4949-9e78-1047998f8aff\") " pod="openstack/keystone-bootstrap-vfzrn" Jan 22 14:04:15 crc kubenswrapper[4743]: I0122 14:04:15.603576 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f840eb1a-7e3e-4aa4-acea-96cefb593807-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-mdc9s\" (UID: \"f840eb1a-7e3e-4aa4-acea-96cefb593807\") " pod="openstack/dnsmasq-dns-785d8bcb8c-mdc9s" Jan 22 14:04:15 crc kubenswrapper[4743]: I0122 14:04:15.603664 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f840eb1a-7e3e-4aa4-acea-96cefb593807-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-mdc9s\" (UID: \"f840eb1a-7e3e-4aa4-acea-96cefb593807\") " pod="openstack/dnsmasq-dns-785d8bcb8c-mdc9s" Jan 22 14:04:15 crc kubenswrapper[4743]: I0122 14:04:15.603731 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8k4rf\" (UniqueName: \"kubernetes.io/projected/f840eb1a-7e3e-4aa4-acea-96cefb593807-kube-api-access-8k4rf\") pod \"dnsmasq-dns-785d8bcb8c-mdc9s\" (UID: \"f840eb1a-7e3e-4aa4-acea-96cefb593807\") " pod="openstack/dnsmasq-dns-785d8bcb8c-mdc9s" Jan 22 14:04:15 crc kubenswrapper[4743]: I0122 14:04:15.603808 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f840eb1a-7e3e-4aa4-acea-96cefb593807-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-mdc9s\" (UID: \"f840eb1a-7e3e-4aa4-acea-96cefb593807\") " pod="openstack/dnsmasq-dns-785d8bcb8c-mdc9s" Jan 22 14:04:15 crc kubenswrapper[4743]: I0122 14:04:15.603845 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f840eb1a-7e3e-4aa4-acea-96cefb593807-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-mdc9s\" (UID: \"f840eb1a-7e3e-4aa4-acea-96cefb593807\") " 
pod="openstack/dnsmasq-dns-785d8bcb8c-mdc9s" Jan 22 14:04:15 crc kubenswrapper[4743]: I0122 14:04:15.603946 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f840eb1a-7e3e-4aa4-acea-96cefb593807-config\") pod \"dnsmasq-dns-785d8bcb8c-mdc9s\" (UID: \"f840eb1a-7e3e-4aa4-acea-96cefb593807\") " pod="openstack/dnsmasq-dns-785d8bcb8c-mdc9s" Jan 22 14:04:15 crc kubenswrapper[4743]: I0122 14:04:15.704895 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f840eb1a-7e3e-4aa4-acea-96cefb593807-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-mdc9s\" (UID: \"f840eb1a-7e3e-4aa4-acea-96cefb593807\") " pod="openstack/dnsmasq-dns-785d8bcb8c-mdc9s" Jan 22 14:04:15 crc kubenswrapper[4743]: I0122 14:04:15.704955 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f840eb1a-7e3e-4aa4-acea-96cefb593807-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-mdc9s\" (UID: \"f840eb1a-7e3e-4aa4-acea-96cefb593807\") " pod="openstack/dnsmasq-dns-785d8bcb8c-mdc9s" Jan 22 14:04:15 crc kubenswrapper[4743]: I0122 14:04:15.704992 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8k4rf\" (UniqueName: \"kubernetes.io/projected/f840eb1a-7e3e-4aa4-acea-96cefb593807-kube-api-access-8k4rf\") pod \"dnsmasq-dns-785d8bcb8c-mdc9s\" (UID: \"f840eb1a-7e3e-4aa4-acea-96cefb593807\") " pod="openstack/dnsmasq-dns-785d8bcb8c-mdc9s" Jan 22 14:04:15 crc kubenswrapper[4743]: I0122 14:04:15.705026 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f840eb1a-7e3e-4aa4-acea-96cefb593807-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-mdc9s\" (UID: \"f840eb1a-7e3e-4aa4-acea-96cefb593807\") " pod="openstack/dnsmasq-dns-785d8bcb8c-mdc9s" Jan 22 14:04:15 crc kubenswrapper[4743]: I0122 14:04:15.705048 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f840eb1a-7e3e-4aa4-acea-96cefb593807-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-mdc9s\" (UID: \"f840eb1a-7e3e-4aa4-acea-96cefb593807\") " pod="openstack/dnsmasq-dns-785d8bcb8c-mdc9s" Jan 22 14:04:15 crc kubenswrapper[4743]: I0122 14:04:15.705093 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f840eb1a-7e3e-4aa4-acea-96cefb593807-config\") pod \"dnsmasq-dns-785d8bcb8c-mdc9s\" (UID: \"f840eb1a-7e3e-4aa4-acea-96cefb593807\") " pod="openstack/dnsmasq-dns-785d8bcb8c-mdc9s" Jan 22 14:04:15 crc kubenswrapper[4743]: I0122 14:04:15.705964 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f840eb1a-7e3e-4aa4-acea-96cefb593807-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-mdc9s\" (UID: \"f840eb1a-7e3e-4aa4-acea-96cefb593807\") " pod="openstack/dnsmasq-dns-785d8bcb8c-mdc9s" Jan 22 14:04:15 crc kubenswrapper[4743]: I0122 14:04:15.706038 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f840eb1a-7e3e-4aa4-acea-96cefb593807-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-mdc9s\" (UID: \"f840eb1a-7e3e-4aa4-acea-96cefb593807\") " pod="openstack/dnsmasq-dns-785d8bcb8c-mdc9s" Jan 22 14:04:15 
crc kubenswrapper[4743]: I0122 14:04:15.706168 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f840eb1a-7e3e-4aa4-acea-96cefb593807-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-mdc9s\" (UID: \"f840eb1a-7e3e-4aa4-acea-96cefb593807\") " pod="openstack/dnsmasq-dns-785d8bcb8c-mdc9s" Jan 22 14:04:15 crc kubenswrapper[4743]: I0122 14:04:15.706376 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f840eb1a-7e3e-4aa4-acea-96cefb593807-config\") pod \"dnsmasq-dns-785d8bcb8c-mdc9s\" (UID: \"f840eb1a-7e3e-4aa4-acea-96cefb593807\") " pod="openstack/dnsmasq-dns-785d8bcb8c-mdc9s" Jan 22 14:04:15 crc kubenswrapper[4743]: I0122 14:04:15.706479 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f840eb1a-7e3e-4aa4-acea-96cefb593807-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-mdc9s\" (UID: \"f840eb1a-7e3e-4aa4-acea-96cefb593807\") " pod="openstack/dnsmasq-dns-785d8bcb8c-mdc9s" Jan 22 14:04:15 crc kubenswrapper[4743]: I0122 14:04:15.726088 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8k4rf\" (UniqueName: \"kubernetes.io/projected/f840eb1a-7e3e-4aa4-acea-96cefb593807-kube-api-access-8k4rf\") pod \"dnsmasq-dns-785d8bcb8c-mdc9s\" (UID: \"f840eb1a-7e3e-4aa4-acea-96cefb593807\") " pod="openstack/dnsmasq-dns-785d8bcb8c-mdc9s" Jan 22 14:04:15 crc kubenswrapper[4743]: I0122 14:04:15.757439 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b18fb1ec-4b3a-4369-824c-56d5512b2cb4" path="/var/lib/kubelet/pods/b18fb1ec-4b3a-4369-824c-56d5512b2cb4/volumes" Jan 22 14:04:15 crc kubenswrapper[4743]: I0122 14:04:15.786377 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-mdc9s" Jan 22 14:04:15 crc kubenswrapper[4743]: I0122 14:04:15.834279 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-vfzrn" Jan 22 14:04:16 crc kubenswrapper[4743]: I0122 14:04:16.450549 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Jan 22 14:04:16 crc kubenswrapper[4743]: I0122 14:04:16.452955 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 22 14:04:16 crc kubenswrapper[4743]: I0122 14:04:16.455942 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Jan 22 14:04:16 crc kubenswrapper[4743]: I0122 14:04:16.455957 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-7dmrc" Jan 22 14:04:16 crc kubenswrapper[4743]: I0122 14:04:16.456222 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Jan 22 14:04:16 crc kubenswrapper[4743]: I0122 14:04:16.489984 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 22 14:04:16 crc kubenswrapper[4743]: I0122 14:04:16.624664 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1b50d81f-9274-4e49-b27c-0452022c096a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1b50d81f-9274-4e49-b27c-0452022c096a\") " pod="openstack/glance-default-external-api-0" Jan 22 14:04:16 crc kubenswrapper[4743]: I0122 14:04:16.624758 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"1b50d81f-9274-4e49-b27c-0452022c096a\") " pod="openstack/glance-default-external-api-0" Jan 22 14:04:16 crc kubenswrapper[4743]: I0122 14:04:16.624817 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b50d81f-9274-4e49-b27c-0452022c096a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1b50d81f-9274-4e49-b27c-0452022c096a\") " pod="openstack/glance-default-external-api-0" Jan 22 14:04:16 crc kubenswrapper[4743]: I0122 14:04:16.624850 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b50d81f-9274-4e49-b27c-0452022c096a-scripts\") pod \"glance-default-external-api-0\" (UID: \"1b50d81f-9274-4e49-b27c-0452022c096a\") " pod="openstack/glance-default-external-api-0" Jan 22 14:04:16 crc kubenswrapper[4743]: I0122 14:04:16.624870 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b50d81f-9274-4e49-b27c-0452022c096a-logs\") pod \"glance-default-external-api-0\" (UID: \"1b50d81f-9274-4e49-b27c-0452022c096a\") " pod="openstack/glance-default-external-api-0" Jan 22 14:04:16 crc kubenswrapper[4743]: I0122 14:04:16.624909 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfcc5\" (UniqueName: \"kubernetes.io/projected/1b50d81f-9274-4e49-b27c-0452022c096a-kube-api-access-rfcc5\") pod \"glance-default-external-api-0\" (UID: \"1b50d81f-9274-4e49-b27c-0452022c096a\") " pod="openstack/glance-default-external-api-0" Jan 22 14:04:16 crc kubenswrapper[4743]: I0122 14:04:16.624942 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b50d81f-9274-4e49-b27c-0452022c096a-config-data\") pod \"glance-default-external-api-0\" (UID: \"1b50d81f-9274-4e49-b27c-0452022c096a\") " 
pod="openstack/glance-default-external-api-0" Jan 22 14:04:16 crc kubenswrapper[4743]: I0122 14:04:16.650938 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 22 14:04:16 crc kubenswrapper[4743]: I0122 14:04:16.652270 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 22 14:04:16 crc kubenswrapper[4743]: I0122 14:04:16.654431 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Jan 22 14:04:16 crc kubenswrapper[4743]: I0122 14:04:16.668715 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 22 14:04:16 crc kubenswrapper[4743]: I0122 14:04:16.726401 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1b50d81f-9274-4e49-b27c-0452022c096a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1b50d81f-9274-4e49-b27c-0452022c096a\") " pod="openstack/glance-default-external-api-0" Jan 22 14:04:16 crc kubenswrapper[4743]: I0122 14:04:16.726633 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"1b50d81f-9274-4e49-b27c-0452022c096a\") " pod="openstack/glance-default-external-api-0" Jan 22 14:04:16 crc kubenswrapper[4743]: I0122 14:04:16.726716 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b50d81f-9274-4e49-b27c-0452022c096a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1b50d81f-9274-4e49-b27c-0452022c096a\") " pod="openstack/glance-default-external-api-0" Jan 22 14:04:16 crc kubenswrapper[4743]: I0122 14:04:16.726750 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b50d81f-9274-4e49-b27c-0452022c096a-scripts\") pod \"glance-default-external-api-0\" (UID: \"1b50d81f-9274-4e49-b27c-0452022c096a\") " pod="openstack/glance-default-external-api-0" Jan 22 14:04:16 crc kubenswrapper[4743]: I0122 14:04:16.726781 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b50d81f-9274-4e49-b27c-0452022c096a-logs\") pod \"glance-default-external-api-0\" (UID: \"1b50d81f-9274-4e49-b27c-0452022c096a\") " pod="openstack/glance-default-external-api-0" Jan 22 14:04:16 crc kubenswrapper[4743]: I0122 14:04:16.726868 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfcc5\" (UniqueName: \"kubernetes.io/projected/1b50d81f-9274-4e49-b27c-0452022c096a-kube-api-access-rfcc5\") pod \"glance-default-external-api-0\" (UID: \"1b50d81f-9274-4e49-b27c-0452022c096a\") " pod="openstack/glance-default-external-api-0" Jan 22 14:04:16 crc kubenswrapper[4743]: I0122 14:04:16.726938 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b50d81f-9274-4e49-b27c-0452022c096a-config-data\") pod \"glance-default-external-api-0\" (UID: \"1b50d81f-9274-4e49-b27c-0452022c096a\") " pod="openstack/glance-default-external-api-0" Jan 22 14:04:16 crc kubenswrapper[4743]: I0122 14:04:16.726949 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1b50d81f-9274-4e49-b27c-0452022c096a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1b50d81f-9274-4e49-b27c-0452022c096a\") " pod="openstack/glance-default-external-api-0" Jan 22 14:04:16 crc kubenswrapper[4743]: I0122 14:04:16.727256 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b50d81f-9274-4e49-b27c-0452022c096a-logs\") pod \"glance-default-external-api-0\" (UID: \"1b50d81f-9274-4e49-b27c-0452022c096a\") " pod="openstack/glance-default-external-api-0" Jan 22 14:04:16 crc kubenswrapper[4743]: I0122 14:04:16.727616 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"1b50d81f-9274-4e49-b27c-0452022c096a\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-external-api-0" Jan 22 14:04:16 crc kubenswrapper[4743]: I0122 14:04:16.738936 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b50d81f-9274-4e49-b27c-0452022c096a-config-data\") pod \"glance-default-external-api-0\" (UID: \"1b50d81f-9274-4e49-b27c-0452022c096a\") " pod="openstack/glance-default-external-api-0" Jan 22 14:04:16 crc kubenswrapper[4743]: I0122 14:04:16.746112 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b50d81f-9274-4e49-b27c-0452022c096a-scripts\") pod \"glance-default-external-api-0\" (UID: \"1b50d81f-9274-4e49-b27c-0452022c096a\") " pod="openstack/glance-default-external-api-0" Jan 22 14:04:16 crc kubenswrapper[4743]: I0122 14:04:16.759947 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfcc5\" (UniqueName: \"kubernetes.io/projected/1b50d81f-9274-4e49-b27c-0452022c096a-kube-api-access-rfcc5\") pod \"glance-default-external-api-0\" (UID: \"1b50d81f-9274-4e49-b27c-0452022c096a\") " pod="openstack/glance-default-external-api-0" Jan 22 14:04:16 crc kubenswrapper[4743]: I0122 14:04:16.760532 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b50d81f-9274-4e49-b27c-0452022c096a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1b50d81f-9274-4e49-b27c-0452022c096a\") " pod="openstack/glance-default-external-api-0" Jan 22 14:04:16 crc kubenswrapper[4743]: I0122 14:04:16.768136 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"1b50d81f-9274-4e49-b27c-0452022c096a\") " pod="openstack/glance-default-external-api-0" Jan 22 14:04:16 crc kubenswrapper[4743]: I0122 14:04:16.778374 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 22 14:04:16 crc kubenswrapper[4743]: I0122 14:04:16.828604 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfe93c97-1c15-4f6e-a638-a9bee6b88f7b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"dfe93c97-1c15-4f6e-a638-a9bee6b88f7b\") " pod="openstack/glance-default-internal-api-0" Jan 22 14:04:16 crc kubenswrapper[4743]: I0122 14:04:16.828668 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9vmn\" (UniqueName: \"kubernetes.io/projected/dfe93c97-1c15-4f6e-a638-a9bee6b88f7b-kube-api-access-d9vmn\") pod \"glance-default-internal-api-0\" (UID: \"dfe93c97-1c15-4f6e-a638-a9bee6b88f7b\") " pod="openstack/glance-default-internal-api-0" Jan 22 14:04:16 crc kubenswrapper[4743]: I0122 14:04:16.828690 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dfe93c97-1c15-4f6e-a638-a9bee6b88f7b-logs\") pod \"glance-default-internal-api-0\" (UID: \"dfe93c97-1c15-4f6e-a638-a9bee6b88f7b\") " pod="openstack/glance-default-internal-api-0" Jan 22 14:04:16 crc kubenswrapper[4743]: I0122 14:04:16.828905 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dfe93c97-1c15-4f6e-a638-a9bee6b88f7b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"dfe93c97-1c15-4f6e-a638-a9bee6b88f7b\") " pod="openstack/glance-default-internal-api-0" Jan 22 14:04:16 crc kubenswrapper[4743]: I0122 14:04:16.829007 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfe93c97-1c15-4f6e-a638-a9bee6b88f7b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"dfe93c97-1c15-4f6e-a638-a9bee6b88f7b\") " pod="openstack/glance-default-internal-api-0" Jan 22 14:04:16 crc kubenswrapper[4743]: I0122 14:04:16.829144 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"dfe93c97-1c15-4f6e-a638-a9bee6b88f7b\") " pod="openstack/glance-default-internal-api-0" Jan 22 14:04:16 crc kubenswrapper[4743]: I0122 14:04:16.829225 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfe93c97-1c15-4f6e-a638-a9bee6b88f7b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"dfe93c97-1c15-4f6e-a638-a9bee6b88f7b\") " pod="openstack/glance-default-internal-api-0" Jan 22 14:04:16 crc kubenswrapper[4743]: I0122 14:04:16.930896 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9vmn\" (UniqueName: \"kubernetes.io/projected/dfe93c97-1c15-4f6e-a638-a9bee6b88f7b-kube-api-access-d9vmn\") pod \"glance-default-internal-api-0\" (UID: \"dfe93c97-1c15-4f6e-a638-a9bee6b88f7b\") " pod="openstack/glance-default-internal-api-0" Jan 22 14:04:16 crc kubenswrapper[4743]: I0122 14:04:16.930935 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dfe93c97-1c15-4f6e-a638-a9bee6b88f7b-logs\") pod 
\"glance-default-internal-api-0\" (UID: \"dfe93c97-1c15-4f6e-a638-a9bee6b88f7b\") " pod="openstack/glance-default-internal-api-0" Jan 22 14:04:16 crc kubenswrapper[4743]: I0122 14:04:16.930979 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dfe93c97-1c15-4f6e-a638-a9bee6b88f7b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"dfe93c97-1c15-4f6e-a638-a9bee6b88f7b\") " pod="openstack/glance-default-internal-api-0" Jan 22 14:04:16 crc kubenswrapper[4743]: I0122 14:04:16.931012 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfe93c97-1c15-4f6e-a638-a9bee6b88f7b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"dfe93c97-1c15-4f6e-a638-a9bee6b88f7b\") " pod="openstack/glance-default-internal-api-0" Jan 22 14:04:16 crc kubenswrapper[4743]: I0122 14:04:16.931045 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"dfe93c97-1c15-4f6e-a638-a9bee6b88f7b\") " pod="openstack/glance-default-internal-api-0" Jan 22 14:04:16 crc kubenswrapper[4743]: I0122 14:04:16.931072 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfe93c97-1c15-4f6e-a638-a9bee6b88f7b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"dfe93c97-1c15-4f6e-a638-a9bee6b88f7b\") " pod="openstack/glance-default-internal-api-0" Jan 22 14:04:16 crc kubenswrapper[4743]: I0122 14:04:16.931124 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfe93c97-1c15-4f6e-a638-a9bee6b88f7b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"dfe93c97-1c15-4f6e-a638-a9bee6b88f7b\") " pod="openstack/glance-default-internal-api-0" Jan 22 14:04:16 crc kubenswrapper[4743]: I0122 14:04:16.931823 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dfe93c97-1c15-4f6e-a638-a9bee6b88f7b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"dfe93c97-1c15-4f6e-a638-a9bee6b88f7b\") " pod="openstack/glance-default-internal-api-0" Jan 22 14:04:16 crc kubenswrapper[4743]: I0122 14:04:16.931858 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"dfe93c97-1c15-4f6e-a638-a9bee6b88f7b\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0" Jan 22 14:04:16 crc kubenswrapper[4743]: I0122 14:04:16.932112 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dfe93c97-1c15-4f6e-a638-a9bee6b88f7b-logs\") pod \"glance-default-internal-api-0\" (UID: \"dfe93c97-1c15-4f6e-a638-a9bee6b88f7b\") " pod="openstack/glance-default-internal-api-0" Jan 22 14:04:16 crc kubenswrapper[4743]: I0122 14:04:16.935484 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfe93c97-1c15-4f6e-a638-a9bee6b88f7b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"dfe93c97-1c15-4f6e-a638-a9bee6b88f7b\") " pod="openstack/glance-default-internal-api-0" Jan 22 14:04:16 
crc kubenswrapper[4743]: I0122 14:04:16.935734 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfe93c97-1c15-4f6e-a638-a9bee6b88f7b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"dfe93c97-1c15-4f6e-a638-a9bee6b88f7b\") " pod="openstack/glance-default-internal-api-0" Jan 22 14:04:16 crc kubenswrapper[4743]: I0122 14:04:16.941375 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfe93c97-1c15-4f6e-a638-a9bee6b88f7b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"dfe93c97-1c15-4f6e-a638-a9bee6b88f7b\") " pod="openstack/glance-default-internal-api-0" Jan 22 14:04:16 crc kubenswrapper[4743]: I0122 14:04:16.951771 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9vmn\" (UniqueName: \"kubernetes.io/projected/dfe93c97-1c15-4f6e-a638-a9bee6b88f7b-kube-api-access-d9vmn\") pod \"glance-default-internal-api-0\" (UID: \"dfe93c97-1c15-4f6e-a638-a9bee6b88f7b\") " pod="openstack/glance-default-internal-api-0" Jan 22 14:04:16 crc kubenswrapper[4743]: I0122 14:04:16.966147 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"dfe93c97-1c15-4f6e-a638-a9bee6b88f7b\") " pod="openstack/glance-default-internal-api-0" Jan 22 14:04:17 crc kubenswrapper[4743]: I0122 14:04:17.267568 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 22 14:04:17 crc kubenswrapper[4743]: I0122 14:04:17.869351 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 22 14:04:17 crc kubenswrapper[4743]: I0122 14:04:17.941480 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 22 14:04:23 crc kubenswrapper[4743]: I0122 14:04:23.980818 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-fc55bf6d5-bhvcx" Jan 22 14:04:23 crc kubenswrapper[4743]: I0122 14:04:23.990475 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7557f5c46c-q4pfj" Jan 22 14:04:23 crc kubenswrapper[4743]: I0122 14:04:23.995954 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8d75d5cbf-4kt4n" Jan 22 14:04:24 crc kubenswrapper[4743]: I0122 14:04:24.012335 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8d75d5cbf-4kt4n" event={"ID":"c713d1a6-b56c-4179-8052-619946111c93","Type":"ContainerDied","Data":"4f291bcf7d8465275cfcc25b42255fcf7c6422fe80b93eda1a016592df3cebfe"} Jan 22 14:04:24 crc kubenswrapper[4743]: I0122 14:04:24.012349 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8d75d5cbf-4kt4n" Jan 22 14:04:24 crc kubenswrapper[4743]: I0122 14:04:24.016696 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-fc55bf6d5-bhvcx" event={"ID":"9bf9dca7-5e43-4ae7-abf8-36b0392b0700","Type":"ContainerDied","Data":"57161db4b37969fe16692534f505a364e4ed74e3fa9e40d4f3ca5cc5a3083579"} Jan 22 14:04:24 crc kubenswrapper[4743]: I0122 14:04:24.016769 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-fc55bf6d5-bhvcx" Jan 22 14:04:24 crc kubenswrapper[4743]: I0122 14:04:24.020580 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7557f5c46c-q4pfj" Jan 22 14:04:24 crc kubenswrapper[4743]: I0122 14:04:24.020156 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7557f5c46c-q4pfj" event={"ID":"1a50ac3e-2e28-4edf-85a0-0a18c00d6dd5","Type":"ContainerDied","Data":"4147acfcb775588f97c2afe81156e407bd40f4646bbc28df4301e009b0be2b5d"} Jan 22 14:04:24 crc kubenswrapper[4743]: I0122 14:04:24.123699 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c713d1a6-b56c-4179-8052-619946111c93-horizon-secret-key\") pod \"c713d1a6-b56c-4179-8052-619946111c93\" (UID: \"c713d1a6-b56c-4179-8052-619946111c93\") " Jan 22 14:04:24 crc kubenswrapper[4743]: I0122 14:04:24.123839 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lh27f\" (UniqueName: \"kubernetes.io/projected/c713d1a6-b56c-4179-8052-619946111c93-kube-api-access-lh27f\") pod \"c713d1a6-b56c-4179-8052-619946111c93\" (UID: \"c713d1a6-b56c-4179-8052-619946111c93\") " Jan 22 14:04:24 crc kubenswrapper[4743]: I0122 14:04:24.123886 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c713d1a6-b56c-4179-8052-619946111c93-scripts\") pod \"c713d1a6-b56c-4179-8052-619946111c93\" (UID: \"c713d1a6-b56c-4179-8052-619946111c93\") " Jan 22 14:04:24 crc kubenswrapper[4743]: I0122 14:04:24.123919 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1a50ac3e-2e28-4edf-85a0-0a18c00d6dd5-scripts\") pod \"1a50ac3e-2e28-4edf-85a0-0a18c00d6dd5\" (UID: \"1a50ac3e-2e28-4edf-85a0-0a18c00d6dd5\") " Jan 22 14:04:24 crc kubenswrapper[4743]: I0122 14:04:24.123951 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9bf9dca7-5e43-4ae7-abf8-36b0392b0700-config-data\") pod \"9bf9dca7-5e43-4ae7-abf8-36b0392b0700\" (UID: \"9bf9dca7-5e43-4ae7-abf8-36b0392b0700\") " Jan 22 14:04:24 crc kubenswrapper[4743]: I0122 14:04:24.123988 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a50ac3e-2e28-4edf-85a0-0a18c00d6dd5-logs\") pod \"1a50ac3e-2e28-4edf-85a0-0a18c00d6dd5\" (UID: \"1a50ac3e-2e28-4edf-85a0-0a18c00d6dd5\") " Jan 22 14:04:24 crc kubenswrapper[4743]: I0122 14:04:24.124009 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1a50ac3e-2e28-4edf-85a0-0a18c00d6dd5-horizon-secret-key\") pod \"1a50ac3e-2e28-4edf-85a0-0a18c00d6dd5\" (UID: \"1a50ac3e-2e28-4edf-85a0-0a18c00d6dd5\") " Jan 22 14:04:24 crc kubenswrapper[4743]: I0122 14:04:24.124035 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1a50ac3e-2e28-4edf-85a0-0a18c00d6dd5-config-data\") pod \"1a50ac3e-2e28-4edf-85a0-0a18c00d6dd5\" (UID: \"1a50ac3e-2e28-4edf-85a0-0a18c00d6dd5\") " Jan 22 14:04:24 crc kubenswrapper[4743]: I0122 14:04:24.124074 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/9bf9dca7-5e43-4ae7-abf8-36b0392b0700-logs\") pod \"9bf9dca7-5e43-4ae7-abf8-36b0392b0700\" (UID: \"9bf9dca7-5e43-4ae7-abf8-36b0392b0700\") " Jan 22 14:04:24 crc kubenswrapper[4743]: I0122 14:04:24.124123 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bln4\" (UniqueName: \"kubernetes.io/projected/9bf9dca7-5e43-4ae7-abf8-36b0392b0700-kube-api-access-5bln4\") pod \"9bf9dca7-5e43-4ae7-abf8-36b0392b0700\" (UID: \"9bf9dca7-5e43-4ae7-abf8-36b0392b0700\") " Jan 22 14:04:24 crc kubenswrapper[4743]: I0122 14:04:24.124194 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c713d1a6-b56c-4179-8052-619946111c93-config-data\") pod \"c713d1a6-b56c-4179-8052-619946111c93\" (UID: \"c713d1a6-b56c-4179-8052-619946111c93\") " Jan 22 14:04:24 crc kubenswrapper[4743]: I0122 14:04:24.124247 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c713d1a6-b56c-4179-8052-619946111c93-logs\") pod \"c713d1a6-b56c-4179-8052-619946111c93\" (UID: \"c713d1a6-b56c-4179-8052-619946111c93\") " Jan 22 14:04:24 crc kubenswrapper[4743]: I0122 14:04:24.124273 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnjlb\" (UniqueName: \"kubernetes.io/projected/1a50ac3e-2e28-4edf-85a0-0a18c00d6dd5-kube-api-access-jnjlb\") pod \"1a50ac3e-2e28-4edf-85a0-0a18c00d6dd5\" (UID: \"1a50ac3e-2e28-4edf-85a0-0a18c00d6dd5\") " Jan 22 14:04:24 crc kubenswrapper[4743]: I0122 14:04:24.124306 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9bf9dca7-5e43-4ae7-abf8-36b0392b0700-horizon-secret-key\") pod \"9bf9dca7-5e43-4ae7-abf8-36b0392b0700\" (UID: \"9bf9dca7-5e43-4ae7-abf8-36b0392b0700\") " Jan 22 14:04:24 crc kubenswrapper[4743]: I0122 14:04:24.124345 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9bf9dca7-5e43-4ae7-abf8-36b0392b0700-scripts\") pod \"9bf9dca7-5e43-4ae7-abf8-36b0392b0700\" (UID: \"9bf9dca7-5e43-4ae7-abf8-36b0392b0700\") " Jan 22 14:04:24 crc kubenswrapper[4743]: I0122 14:04:24.125030 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a50ac3e-2e28-4edf-85a0-0a18c00d6dd5-logs" (OuterVolumeSpecName: "logs") pod "1a50ac3e-2e28-4edf-85a0-0a18c00d6dd5" (UID: "1a50ac3e-2e28-4edf-85a0-0a18c00d6dd5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:04:24 crc kubenswrapper[4743]: I0122 14:04:24.125092 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a50ac3e-2e28-4edf-85a0-0a18c00d6dd5-scripts" (OuterVolumeSpecName: "scripts") pod "1a50ac3e-2e28-4edf-85a0-0a18c00d6dd5" (UID: "1a50ac3e-2e28-4edf-85a0-0a18c00d6dd5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:24 crc kubenswrapper[4743]: I0122 14:04:24.125093 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c713d1a6-b56c-4179-8052-619946111c93-scripts" (OuterVolumeSpecName: "scripts") pod "c713d1a6-b56c-4179-8052-619946111c93" (UID: "c713d1a6-b56c-4179-8052-619946111c93"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:24 crc kubenswrapper[4743]: I0122 14:04:24.125375 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c713d1a6-b56c-4179-8052-619946111c93-logs" (OuterVolumeSpecName: "logs") pod "c713d1a6-b56c-4179-8052-619946111c93" (UID: "c713d1a6-b56c-4179-8052-619946111c93"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:04:24 crc kubenswrapper[4743]: I0122 14:04:24.125670 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c713d1a6-b56c-4179-8052-619946111c93-logs\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:24 crc kubenswrapper[4743]: I0122 14:04:24.125685 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c713d1a6-b56c-4179-8052-619946111c93-scripts\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:24 crc kubenswrapper[4743]: I0122 14:04:24.125693 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1a50ac3e-2e28-4edf-85a0-0a18c00d6dd5-scripts\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:24 crc kubenswrapper[4743]: I0122 14:04:24.125702 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a50ac3e-2e28-4edf-85a0-0a18c00d6dd5-logs\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:24 crc kubenswrapper[4743]: I0122 14:04:24.125694 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9bf9dca7-5e43-4ae7-abf8-36b0392b0700-logs" (OuterVolumeSpecName: "logs") pod "9bf9dca7-5e43-4ae7-abf8-36b0392b0700" (UID: "9bf9dca7-5e43-4ae7-abf8-36b0392b0700"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:04:24 crc kubenswrapper[4743]: I0122 14:04:24.125912 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9bf9dca7-5e43-4ae7-abf8-36b0392b0700-config-data" (OuterVolumeSpecName: "config-data") pod "9bf9dca7-5e43-4ae7-abf8-36b0392b0700" (UID: "9bf9dca7-5e43-4ae7-abf8-36b0392b0700"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:24 crc kubenswrapper[4743]: I0122 14:04:24.125944 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9bf9dca7-5e43-4ae7-abf8-36b0392b0700-scripts" (OuterVolumeSpecName: "scripts") pod "9bf9dca7-5e43-4ae7-abf8-36b0392b0700" (UID: "9bf9dca7-5e43-4ae7-abf8-36b0392b0700"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:24 crc kubenswrapper[4743]: I0122 14:04:24.126019 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a50ac3e-2e28-4edf-85a0-0a18c00d6dd5-config-data" (OuterVolumeSpecName: "config-data") pod "1a50ac3e-2e28-4edf-85a0-0a18c00d6dd5" (UID: "1a50ac3e-2e28-4edf-85a0-0a18c00d6dd5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:24 crc kubenswrapper[4743]: I0122 14:04:24.126479 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c713d1a6-b56c-4179-8052-619946111c93-config-data" (OuterVolumeSpecName: "config-data") pod "c713d1a6-b56c-4179-8052-619946111c93" (UID: "c713d1a6-b56c-4179-8052-619946111c93"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:24 crc kubenswrapper[4743]: I0122 14:04:24.128437 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c713d1a6-b56c-4179-8052-619946111c93-kube-api-access-lh27f" (OuterVolumeSpecName: "kube-api-access-lh27f") pod "c713d1a6-b56c-4179-8052-619946111c93" (UID: "c713d1a6-b56c-4179-8052-619946111c93"). InnerVolumeSpecName "kube-api-access-lh27f". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:04:24 crc kubenswrapper[4743]: I0122 14:04:24.128526 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c713d1a6-b56c-4179-8052-619946111c93-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "c713d1a6-b56c-4179-8052-619946111c93" (UID: "c713d1a6-b56c-4179-8052-619946111c93"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:24 crc kubenswrapper[4743]: I0122 14:04:24.128575 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a50ac3e-2e28-4edf-85a0-0a18c00d6dd5-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "1a50ac3e-2e28-4edf-85a0-0a18c00d6dd5" (UID: "1a50ac3e-2e28-4edf-85a0-0a18c00d6dd5"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:24 crc kubenswrapper[4743]: I0122 14:04:24.130552 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bf9dca7-5e43-4ae7-abf8-36b0392b0700-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "9bf9dca7-5e43-4ae7-abf8-36b0392b0700" (UID: "9bf9dca7-5e43-4ae7-abf8-36b0392b0700"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:24 crc kubenswrapper[4743]: I0122 14:04:24.131164 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bf9dca7-5e43-4ae7-abf8-36b0392b0700-kube-api-access-5bln4" (OuterVolumeSpecName: "kube-api-access-5bln4") pod "9bf9dca7-5e43-4ae7-abf8-36b0392b0700" (UID: "9bf9dca7-5e43-4ae7-abf8-36b0392b0700"). InnerVolumeSpecName "kube-api-access-5bln4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:04:24 crc kubenswrapper[4743]: I0122 14:04:24.148622 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a50ac3e-2e28-4edf-85a0-0a18c00d6dd5-kube-api-access-jnjlb" (OuterVolumeSpecName: "kube-api-access-jnjlb") pod "1a50ac3e-2e28-4edf-85a0-0a18c00d6dd5" (UID: "1a50ac3e-2e28-4edf-85a0-0a18c00d6dd5"). InnerVolumeSpecName "kube-api-access-jnjlb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:04:24 crc kubenswrapper[4743]: I0122 14:04:24.227250 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lh27f\" (UniqueName: \"kubernetes.io/projected/c713d1a6-b56c-4179-8052-619946111c93-kube-api-access-lh27f\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:24 crc kubenswrapper[4743]: I0122 14:04:24.227298 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9bf9dca7-5e43-4ae7-abf8-36b0392b0700-config-data\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:24 crc kubenswrapper[4743]: I0122 14:04:24.227311 4743 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1a50ac3e-2e28-4edf-85a0-0a18c00d6dd5-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:24 crc kubenswrapper[4743]: I0122 14:04:24.227323 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1a50ac3e-2e28-4edf-85a0-0a18c00d6dd5-config-data\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:24 crc kubenswrapper[4743]: I0122 14:04:24.227334 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9bf9dca7-5e43-4ae7-abf8-36b0392b0700-logs\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:24 crc kubenswrapper[4743]: I0122 14:04:24.227346 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5bln4\" (UniqueName: \"kubernetes.io/projected/9bf9dca7-5e43-4ae7-abf8-36b0392b0700-kube-api-access-5bln4\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:24 crc kubenswrapper[4743]: I0122 14:04:24.227358 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c713d1a6-b56c-4179-8052-619946111c93-config-data\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:24 crc kubenswrapper[4743]: I0122 14:04:24.227368 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnjlb\" (UniqueName: \"kubernetes.io/projected/1a50ac3e-2e28-4edf-85a0-0a18c00d6dd5-kube-api-access-jnjlb\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:24 crc kubenswrapper[4743]: I0122 14:04:24.227379 4743 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9bf9dca7-5e43-4ae7-abf8-36b0392b0700-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:24 crc kubenswrapper[4743]: I0122 14:04:24.227390 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9bf9dca7-5e43-4ae7-abf8-36b0392b0700-scripts\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:24 crc kubenswrapper[4743]: I0122 14:04:24.227400 4743 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c713d1a6-b56c-4179-8052-619946111c93-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:24 crc kubenswrapper[4743]: I0122 14:04:24.376997 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-8d75d5cbf-4kt4n"] Jan 22 14:04:24 crc kubenswrapper[4743]: I0122 14:04:24.385477 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-8d75d5cbf-4kt4n"] Jan 22 14:04:24 crc kubenswrapper[4743]: I0122 14:04:24.421081 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7557f5c46c-q4pfj"] Jan 22 14:04:24 crc kubenswrapper[4743]: I0122 14:04:24.439077 4743 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7557f5c46c-q4pfj"] Jan 22 14:04:24 crc kubenswrapper[4743]: I0122 14:04:24.451602 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-fc55bf6d5-bhvcx"] Jan 22 14:04:24 crc kubenswrapper[4743]: I0122 14:04:24.457248 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-fc55bf6d5-bhvcx"] Jan 22 14:04:25 crc kubenswrapper[4743]: I0122 14:04:25.760764 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a50ac3e-2e28-4edf-85a0-0a18c00d6dd5" path="/var/lib/kubelet/pods/1a50ac3e-2e28-4edf-85a0-0a18c00d6dd5/volumes" Jan 22 14:04:25 crc kubenswrapper[4743]: I0122 14:04:25.761590 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bf9dca7-5e43-4ae7-abf8-36b0392b0700" path="/var/lib/kubelet/pods/9bf9dca7-5e43-4ae7-abf8-36b0392b0700/volumes" Jan 22 14:04:25 crc kubenswrapper[4743]: I0122 14:04:25.762359 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c713d1a6-b56c-4179-8052-619946111c93" path="/var/lib/kubelet/pods/c713d1a6-b56c-4179-8052-619946111c93/volumes" Jan 22 14:04:26 crc kubenswrapper[4743]: E0122 14:04:26.776249 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Jan 22 14:04:26 crc kubenswrapper[4743]: E0122 14:04:26.776649 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kgm5q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,R
unAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-srjxw_openstack(eb22345c-594c-46a3-b362-e34baa8f271c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 22 14:04:26 crc kubenswrapper[4743]: E0122 14:04:26.778960 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-srjxw" podUID="eb22345c-594c-46a3-b362-e34baa8f271c" Jan 22 14:04:26 crc kubenswrapper[4743]: I0122 14:04:26.996970 4743 scope.go:117] "RemoveContainer" containerID="ce3a2fa682940db02f2eb2a4e96f910efeef0578f5a86b47d59b56d54a73f377" Jan 22 14:04:26 crc kubenswrapper[4743]: E0122 14:04:26.997296 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce3a2fa682940db02f2eb2a4e96f910efeef0578f5a86b47d59b56d54a73f377\": container with ID starting with ce3a2fa682940db02f2eb2a4e96f910efeef0578f5a86b47d59b56d54a73f377 not found: ID does not exist" containerID="ce3a2fa682940db02f2eb2a4e96f910efeef0578f5a86b47d59b56d54a73f377" Jan 22 14:04:26 crc kubenswrapper[4743]: I0122 14:04:26.997323 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce3a2fa682940db02f2eb2a4e96f910efeef0578f5a86b47d59b56d54a73f377"} err="failed to get container status \"ce3a2fa682940db02f2eb2a4e96f910efeef0578f5a86b47d59b56d54a73f377\": rpc error: code = NotFound desc = could not find container \"ce3a2fa682940db02f2eb2a4e96f910efeef0578f5a86b47d59b56d54a73f377\": container with ID starting with ce3a2fa682940db02f2eb2a4e96f910efeef0578f5a86b47d59b56d54a73f377 not found: ID does not exist" Jan 22 14:04:26 crc kubenswrapper[4743]: I0122 14:04:26.997342 4743 scope.go:117] "RemoveContainer" containerID="06a9e9f267a1d3d4292137aced31d575f5bef9a8ea3f5f50e9dab98fc51db541" Jan 22 14:04:26 crc kubenswrapper[4743]: E0122 14:04:26.997550 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06a9e9f267a1d3d4292137aced31d575f5bef9a8ea3f5f50e9dab98fc51db541\": container with ID starting with 06a9e9f267a1d3d4292137aced31d575f5bef9a8ea3f5f50e9dab98fc51db541 not found: ID does not exist" containerID="06a9e9f267a1d3d4292137aced31d575f5bef9a8ea3f5f50e9dab98fc51db541" Jan 22 14:04:26 crc kubenswrapper[4743]: I0122 14:04:26.997576 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06a9e9f267a1d3d4292137aced31d575f5bef9a8ea3f5f50e9dab98fc51db541"} err="failed to get container status \"06a9e9f267a1d3d4292137aced31d575f5bef9a8ea3f5f50e9dab98fc51db541\": rpc error: code = NotFound desc = could not find container \"06a9e9f267a1d3d4292137aced31d575f5bef9a8ea3f5f50e9dab98fc51db541\": container with ID starting with 06a9e9f267a1d3d4292137aced31d575f5bef9a8ea3f5f50e9dab98fc51db541 not found: ID does not exist" Jan 22 14:04:27 crc kubenswrapper[4743]: E0122 14:04:27.091686 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-srjxw" podUID="eb22345c-594c-46a3-b362-e34baa8f271c" Jan 22 14:04:27 crc kubenswrapper[4743]: I0122 14:04:27.111900 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-999bfcdc8-ldzdp"] Jan 22 14:04:27 crc kubenswrapper[4743]: W0122 14:04:27.172198 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddff52751_78f1_4c39_aa95_5d74a246151e.slice/crio-269ef51223792898fc237f2544de07afb4f8e70db37f8fd29799954631ede201 WatchSource:0}: Error finding container 269ef51223792898fc237f2544de07afb4f8e70db37f8fd29799954631ede201: Status 404 returned error can't find the container with id 269ef51223792898fc237f2544de07afb4f8e70db37f8fd29799954631ede201 Jan 22 14:04:27 crc kubenswrapper[4743]: I0122 14:04:27.441291 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-b7fb54dc6-5q9jf"] Jan 22 14:04:27 crc kubenswrapper[4743]: I0122 14:04:27.715220 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 22 14:04:27 crc kubenswrapper[4743]: W0122 14:04:27.724320 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b50d81f_9274_4e49_b27c_0452022c096a.slice/crio-c6fc9d1c1e6189fdc33b86dc8d26f5272e9ca5d6bd53153b9e70208b413df9ae WatchSource:0}: Error finding container c6fc9d1c1e6189fdc33b86dc8d26f5272e9ca5d6bd53153b9e70208b413df9ae: Status 404 returned error can't find the container with id c6fc9d1c1e6189fdc33b86dc8d26f5272e9ca5d6bd53153b9e70208b413df9ae Jan 22 14:04:27 crc kubenswrapper[4743]: I0122 14:04:27.770723 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-vfzrn"] Jan 22 14:04:27 crc kubenswrapper[4743]: I0122 14:04:27.784772 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-mdc9s"] Jan 22 14:04:27 crc kubenswrapper[4743]: I0122 14:04:27.926761 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 22 14:04:27 crc kubenswrapper[4743]: W0122 14:04:27.946018 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddfe93c97_1c15_4f6e_a638_a9bee6b88f7b.slice/crio-3a7aa56bf644d039c0bb6658a0b7bd66d1c872f8683c6d8a7082b14166d31ce4 WatchSource:0}: Error finding container 3a7aa56bf644d039c0bb6658a0b7bd66d1c872f8683c6d8a7082b14166d31ce4: Status 404 returned error can't find the container with id 3a7aa56bf644d039c0bb6658a0b7bd66d1c872f8683c6d8a7082b14166d31ce4 Jan 22 14:04:28 crc kubenswrapper[4743]: I0122 14:04:28.077567 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-999bfcdc8-ldzdp" event={"ID":"dff52751-78f1-4c39-aa95-5d74a246151e","Type":"ContainerStarted","Data":"ac18ac033baaefe887d069af8ebe8ff96c5cd0940424da66bfb23d7a56c5a2f7"} Jan 22 14:04:28 crc kubenswrapper[4743]: I0122 14:04:28.078452 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-999bfcdc8-ldzdp" event={"ID":"dff52751-78f1-4c39-aa95-5d74a246151e","Type":"ContainerStarted","Data":"269ef51223792898fc237f2544de07afb4f8e70db37f8fd29799954631ede201"} Jan 22 14:04:28 crc kubenswrapper[4743]: I0122 14:04:28.080088 4743 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/horizon-b7fb54dc6-5q9jf" event={"ID":"e452af10-fc11-4854-bf38-8a90856331d3","Type":"ContainerStarted","Data":"93f2decfa749e21eb2c7313c5553af9bd7f4dc021fa5c18e8743f314409ccea3"} Jan 22 14:04:28 crc kubenswrapper[4743]: I0122 14:04:28.083667 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-9t996" event={"ID":"4ebac6d9-df0f-41fe-bc73-8236847ff237","Type":"ContainerStarted","Data":"c21f41d3b2d4091b7aa24b83cec34f0c483edd38f0a2db502df9a85b60d4aec0"} Jan 22 14:04:28 crc kubenswrapper[4743]: I0122 14:04:28.090703 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-cc882" event={"ID":"6c814013-0bfc-4734-8b4c-bfb1a5b4f54d","Type":"ContainerStarted","Data":"b78ed07439930e7ccaf1ce98cce93c09660ec8cdd91dde9db0c9f00f6d694807"} Jan 22 14:04:28 crc kubenswrapper[4743]: I0122 14:04:28.090758 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-58dd9ff6bc-cc882" Jan 22 14:04:28 crc kubenswrapper[4743]: I0122 14:04:28.090812 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-58dd9ff6bc-cc882" podUID="6c814013-0bfc-4734-8b4c-bfb1a5b4f54d" containerName="dnsmasq-dns" containerID="cri-o://b78ed07439930e7ccaf1ce98cce93c09660ec8cdd91dde9db0c9f00f6d694807" gracePeriod=10 Jan 22 14:04:28 crc kubenswrapper[4743]: I0122 14:04:28.095990 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1b50d81f-9274-4e49-b27c-0452022c096a","Type":"ContainerStarted","Data":"c6fc9d1c1e6189fdc33b86dc8d26f5272e9ca5d6bd53153b9e70208b413df9ae"} Jan 22 14:04:28 crc kubenswrapper[4743]: I0122 14:04:28.100636 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"316dc631-a7ed-49db-9dad-305d246bf91a","Type":"ContainerStarted","Data":"ba8ff1d54e4f3c6866568508cdba261acf4b40803abbc347435a8e242304a667"} Jan 22 14:04:28 crc kubenswrapper[4743]: I0122 14:04:28.107468 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"dfe93c97-1c15-4f6e-a638-a9bee6b88f7b","Type":"ContainerStarted","Data":"3a7aa56bf644d039c0bb6658a0b7bd66d1c872f8683c6d8a7082b14166d31ce4"} Jan 22 14:04:28 crc kubenswrapper[4743]: I0122 14:04:28.112323 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vfzrn" event={"ID":"cbe93f39-887c-4949-9e78-1047998f8aff","Type":"ContainerStarted","Data":"c871dc8973ea2e91fd44264fe34f88f81d8c63e19143a693c0892bad127fbe86"} Jan 22 14:04:28 crc kubenswrapper[4743]: I0122 14:04:28.112380 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vfzrn" event={"ID":"cbe93f39-887c-4949-9e78-1047998f8aff","Type":"ContainerStarted","Data":"6231a912f700a503eebd4cbcd277cca5b48d8685793033255fda9768a694b133"} Jan 22 14:04:28 crc kubenswrapper[4743]: I0122 14:04:28.115286 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-mdc9s" event={"ID":"f840eb1a-7e3e-4aa4-acea-96cefb593807","Type":"ContainerStarted","Data":"93f9542514b86f7e1bd7e9465e864555f60b686cbfa10de04c0a47377b055904"} Jan 22 14:04:28 crc kubenswrapper[4743]: I0122 14:04:28.116626 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-9t996" podStartSLOduration=8.167684057 podStartE2EDuration="37.11660335s" podCreationTimestamp="2026-01-22 14:03:51 +0000 UTC" 
firstStartedPulling="2026-01-22 14:03:54.915193435 +0000 UTC m=+1071.470236598" lastFinishedPulling="2026-01-22 14:04:23.864112728 +0000 UTC m=+1100.419155891" observedRunningTime="2026-01-22 14:04:28.100508725 +0000 UTC m=+1104.655551888" watchObservedRunningTime="2026-01-22 14:04:28.11660335 +0000 UTC m=+1104.671646513" Jan 22 14:04:28 crc kubenswrapper[4743]: I0122 14:04:28.124187 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-58dd9ff6bc-cc882" podStartSLOduration=31.124156354 podStartE2EDuration="31.124156354s" podCreationTimestamp="2026-01-22 14:03:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:04:28.121368612 +0000 UTC m=+1104.676411775" watchObservedRunningTime="2026-01-22 14:04:28.124156354 +0000 UTC m=+1104.679199527" Jan 22 14:04:28 crc kubenswrapper[4743]: I0122 14:04:28.160690 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-vfzrn" podStartSLOduration=13.160674736 podStartE2EDuration="13.160674736s" podCreationTimestamp="2026-01-22 14:04:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:04:28.154296531 +0000 UTC m=+1104.709339694" watchObservedRunningTime="2026-01-22 14:04:28.160674736 +0000 UTC m=+1104.715717899" Jan 22 14:04:28 crc kubenswrapper[4743]: I0122 14:04:28.489870 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-cc882" Jan 22 14:04:28 crc kubenswrapper[4743]: I0122 14:04:28.638570 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6c814013-0bfc-4734-8b4c-bfb1a5b4f54d-ovsdbserver-sb\") pod \"6c814013-0bfc-4734-8b4c-bfb1a5b4f54d\" (UID: \"6c814013-0bfc-4734-8b4c-bfb1a5b4f54d\") " Jan 22 14:04:28 crc kubenswrapper[4743]: I0122 14:04:28.638658 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c814013-0bfc-4734-8b4c-bfb1a5b4f54d-dns-svc\") pod \"6c814013-0bfc-4734-8b4c-bfb1a5b4f54d\" (UID: \"6c814013-0bfc-4734-8b4c-bfb1a5b4f54d\") " Jan 22 14:04:28 crc kubenswrapper[4743]: I0122 14:04:28.638711 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c814013-0bfc-4734-8b4c-bfb1a5b4f54d-config\") pod \"6c814013-0bfc-4734-8b4c-bfb1a5b4f54d\" (UID: \"6c814013-0bfc-4734-8b4c-bfb1a5b4f54d\") " Jan 22 14:04:28 crc kubenswrapper[4743]: I0122 14:04:28.638727 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zsvtr\" (UniqueName: \"kubernetes.io/projected/6c814013-0bfc-4734-8b4c-bfb1a5b4f54d-kube-api-access-zsvtr\") pod \"6c814013-0bfc-4734-8b4c-bfb1a5b4f54d\" (UID: \"6c814013-0bfc-4734-8b4c-bfb1a5b4f54d\") " Jan 22 14:04:28 crc kubenswrapper[4743]: I0122 14:04:28.638829 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c814013-0bfc-4734-8b4c-bfb1a5b4f54d-ovsdbserver-nb\") pod \"6c814013-0bfc-4734-8b4c-bfb1a5b4f54d\" (UID: \"6c814013-0bfc-4734-8b4c-bfb1a5b4f54d\") " Jan 22 14:04:28 crc kubenswrapper[4743]: I0122 14:04:28.642900 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6c814013-0bfc-4734-8b4c-bfb1a5b4f54d-dns-swift-storage-0\") pod \"6c814013-0bfc-4734-8b4c-bfb1a5b4f54d\" (UID: \"6c814013-0bfc-4734-8b4c-bfb1a5b4f54d\") " Jan 22 14:04:28 crc kubenswrapper[4743]: I0122 14:04:28.661715 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c814013-0bfc-4734-8b4c-bfb1a5b4f54d-kube-api-access-zsvtr" (OuterVolumeSpecName: "kube-api-access-zsvtr") pod "6c814013-0bfc-4734-8b4c-bfb1a5b4f54d" (UID: "6c814013-0bfc-4734-8b4c-bfb1a5b4f54d"). InnerVolumeSpecName "kube-api-access-zsvtr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:04:28 crc kubenswrapper[4743]: I0122 14:04:28.707335 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c814013-0bfc-4734-8b4c-bfb1a5b4f54d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6c814013-0bfc-4734-8b4c-bfb1a5b4f54d" (UID: "6c814013-0bfc-4734-8b4c-bfb1a5b4f54d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:28 crc kubenswrapper[4743]: I0122 14:04:28.708987 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c814013-0bfc-4734-8b4c-bfb1a5b4f54d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6c814013-0bfc-4734-8b4c-bfb1a5b4f54d" (UID: "6c814013-0bfc-4734-8b4c-bfb1a5b4f54d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:28 crc kubenswrapper[4743]: I0122 14:04:28.711498 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c814013-0bfc-4734-8b4c-bfb1a5b4f54d-config" (OuterVolumeSpecName: "config") pod "6c814013-0bfc-4734-8b4c-bfb1a5b4f54d" (UID: "6c814013-0bfc-4734-8b4c-bfb1a5b4f54d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:28 crc kubenswrapper[4743]: I0122 14:04:28.728997 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c814013-0bfc-4734-8b4c-bfb1a5b4f54d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6c814013-0bfc-4734-8b4c-bfb1a5b4f54d" (UID: "6c814013-0bfc-4734-8b4c-bfb1a5b4f54d"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:28 crc kubenswrapper[4743]: I0122 14:04:28.744880 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6c814013-0bfc-4734-8b4c-bfb1a5b4f54d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:28 crc kubenswrapper[4743]: I0122 14:04:28.745260 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c814013-0bfc-4734-8b4c-bfb1a5b4f54d-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:28 crc kubenswrapper[4743]: I0122 14:04:28.745368 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c814013-0bfc-4734-8b4c-bfb1a5b4f54d-config\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:28 crc kubenswrapper[4743]: I0122 14:04:28.745444 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zsvtr\" (UniqueName: \"kubernetes.io/projected/6c814013-0bfc-4734-8b4c-bfb1a5b4f54d-kube-api-access-zsvtr\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:28 crc kubenswrapper[4743]: I0122 14:04:28.745509 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c814013-0bfc-4734-8b4c-bfb1a5b4f54d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:28 crc kubenswrapper[4743]: I0122 14:04:28.745224 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c814013-0bfc-4734-8b4c-bfb1a5b4f54d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6c814013-0bfc-4734-8b4c-bfb1a5b4f54d" (UID: "6c814013-0bfc-4734-8b4c-bfb1a5b4f54d"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:28 crc kubenswrapper[4743]: I0122 14:04:28.846940 4743 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6c814013-0bfc-4734-8b4c-bfb1a5b4f54d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:29 crc kubenswrapper[4743]: I0122 14:04:29.133732 4743 generic.go:334] "Generic (PLEG): container finished" podID="b8bd2850-37f2-40c9-aeb5-365158ca9716" containerID="37b8e4844c7a5a524a68d32dc8c01f0e84365b3babaf0e9d46244f75aa6b4152" exitCode=0 Jan 22 14:04:29 crc kubenswrapper[4743]: I0122 14:04:29.133827 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-fqwwj" event={"ID":"b8bd2850-37f2-40c9-aeb5-365158ca9716","Type":"ContainerDied","Data":"37b8e4844c7a5a524a68d32dc8c01f0e84365b3babaf0e9d46244f75aa6b4152"} Jan 22 14:04:29 crc kubenswrapper[4743]: I0122 14:04:29.137778 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1b50d81f-9274-4e49-b27c-0452022c096a","Type":"ContainerStarted","Data":"3ba0cf14795531f6bed1d070f6f5c9124107ccd684d3b7f8219ba23779fff312"} Jan 22 14:04:29 crc kubenswrapper[4743]: I0122 14:04:29.142515 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-999bfcdc8-ldzdp" event={"ID":"dff52751-78f1-4c39-aa95-5d74a246151e","Type":"ContainerStarted","Data":"63e33f82bc66858f7646d8d52106929147dca59a06e0ccf6f7838a6c4813eb9f"} Jan 22 14:04:29 crc kubenswrapper[4743]: I0122 14:04:29.145744 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-tcdjz" event={"ID":"846c118f-23c1-402f-8747-633485e743c9","Type":"ContainerStarted","Data":"cc13199c0a241e6bacabaf8f4c251225db6761a2f3d5aaee1461d5ab8520ddbe"} Jan 22 14:04:29 crc kubenswrapper[4743]: I0122 14:04:29.173040 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b7fb54dc6-5q9jf" event={"ID":"e452af10-fc11-4854-bf38-8a90856331d3","Type":"ContainerStarted","Data":"d145c221f62dc7e1041cb5df47bf13c86db9e2a811d74d93b950cdffc4db66f8"} Jan 22 14:04:29 crc kubenswrapper[4743]: I0122 14:04:29.173099 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b7fb54dc6-5q9jf" event={"ID":"e452af10-fc11-4854-bf38-8a90856331d3","Type":"ContainerStarted","Data":"81404e6f78d319784cc5d33e42cd5a6604d2ca85aa8513b995639ae67e65ad7b"} Jan 22 14:04:29 crc kubenswrapper[4743]: I0122 14:04:29.186266 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"dfe93c97-1c15-4f6e-a638-a9bee6b88f7b","Type":"ContainerStarted","Data":"8755b8131e81b5bde6aaf646dc5c822513567d38ef2e07846da2d1718f1f9a51"} Jan 22 14:04:29 crc kubenswrapper[4743]: I0122 14:04:29.190152 4743 generic.go:334] "Generic (PLEG): container finished" podID="6c814013-0bfc-4734-8b4c-bfb1a5b4f54d" containerID="b78ed07439930e7ccaf1ce98cce93c09660ec8cdd91dde9db0c9f00f6d694807" exitCode=0 Jan 22 14:04:29 crc kubenswrapper[4743]: I0122 14:04:29.190218 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-cc882" event={"ID":"6c814013-0bfc-4734-8b4c-bfb1a5b4f54d","Type":"ContainerDied","Data":"b78ed07439930e7ccaf1ce98cce93c09660ec8cdd91dde9db0c9f00f6d694807"} Jan 22 14:04:29 crc kubenswrapper[4743]: I0122 14:04:29.190246 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-cc882" 
event={"ID":"6c814013-0bfc-4734-8b4c-bfb1a5b4f54d","Type":"ContainerDied","Data":"0803172f026f76f1ca5198833b00bc170e2f7c760cf14659ce21e2edb438f143"} Jan 22 14:04:29 crc kubenswrapper[4743]: I0122 14:04:29.190264 4743 scope.go:117] "RemoveContainer" containerID="b78ed07439930e7ccaf1ce98cce93c09660ec8cdd91dde9db0c9f00f6d694807" Jan 22 14:04:29 crc kubenswrapper[4743]: I0122 14:04:29.190403 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-cc882" Jan 22 14:04:29 crc kubenswrapper[4743]: I0122 14:04:29.196931 4743 generic.go:334] "Generic (PLEG): container finished" podID="f840eb1a-7e3e-4aa4-acea-96cefb593807" containerID="baa47e0a3805a9c63578e2972a3d09cd307563f0f73e9883289ba252adace558" exitCode=0 Jan 22 14:04:29 crc kubenswrapper[4743]: I0122 14:04:29.197724 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-mdc9s" event={"ID":"f840eb1a-7e3e-4aa4-acea-96cefb593807","Type":"ContainerDied","Data":"baa47e0a3805a9c63578e2972a3d09cd307563f0f73e9883289ba252adace558"} Jan 22 14:04:29 crc kubenswrapper[4743]: I0122 14:04:29.250259 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-tcdjz" podStartSLOduration=4.908170138 podStartE2EDuration="38.250240259s" podCreationTimestamp="2026-01-22 14:03:51 +0000 UTC" firstStartedPulling="2026-01-22 14:03:54.923441028 +0000 UTC m=+1071.478484191" lastFinishedPulling="2026-01-22 14:04:28.265511159 +0000 UTC m=+1104.820554312" observedRunningTime="2026-01-22 14:04:29.176637321 +0000 UTC m=+1105.731680494" watchObservedRunningTime="2026-01-22 14:04:29.250240259 +0000 UTC m=+1105.805283422" Jan 22 14:04:29 crc kubenswrapper[4743]: I0122 14:04:29.273092 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-999bfcdc8-ldzdp" podStartSLOduration=29.834871209 podStartE2EDuration="30.273069177s" podCreationTimestamp="2026-01-22 14:03:59 +0000 UTC" firstStartedPulling="2026-01-22 14:04:27.181174812 +0000 UTC m=+1103.736217965" lastFinishedPulling="2026-01-22 14:04:27.61937277 +0000 UTC m=+1104.174415933" observedRunningTime="2026-01-22 14:04:29.20258472 +0000 UTC m=+1105.757627883" watchObservedRunningTime="2026-01-22 14:04:29.273069177 +0000 UTC m=+1105.828112340" Jan 22 14:04:29 crc kubenswrapper[4743]: I0122 14:04:29.303898 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-b7fb54dc6-5q9jf" podStartSLOduration=28.80040517 podStartE2EDuration="29.303881092s" podCreationTimestamp="2026-01-22 14:04:00 +0000 UTC" firstStartedPulling="2026-01-22 14:04:27.448293129 +0000 UTC m=+1104.003336292" lastFinishedPulling="2026-01-22 14:04:27.951769061 +0000 UTC m=+1104.506812214" observedRunningTime="2026-01-22 14:04:29.222843702 +0000 UTC m=+1105.777886875" watchObservedRunningTime="2026-01-22 14:04:29.303881092 +0000 UTC m=+1105.858924255" Jan 22 14:04:29 crc kubenswrapper[4743]: I0122 14:04:29.332088 4743 scope.go:117] "RemoveContainer" containerID="63ddeda3d533b73b10968d70c6d9fa7330075e705981727d886e45592a5a221c" Jan 22 14:04:29 crc kubenswrapper[4743]: I0122 14:04:29.370252 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-cc882"] Jan 22 14:04:29 crc kubenswrapper[4743]: I0122 14:04:29.388365 4743 scope.go:117] "RemoveContainer" containerID="b78ed07439930e7ccaf1ce98cce93c09660ec8cdd91dde9db0c9f00f6d694807" Jan 22 14:04:29 crc kubenswrapper[4743]: I0122 14:04:29.388588 4743 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-cc882"] Jan 22 14:04:29 crc kubenswrapper[4743]: E0122 14:04:29.395455 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b78ed07439930e7ccaf1ce98cce93c09660ec8cdd91dde9db0c9f00f6d694807\": container with ID starting with b78ed07439930e7ccaf1ce98cce93c09660ec8cdd91dde9db0c9f00f6d694807 not found: ID does not exist" containerID="b78ed07439930e7ccaf1ce98cce93c09660ec8cdd91dde9db0c9f00f6d694807" Jan 22 14:04:29 crc kubenswrapper[4743]: I0122 14:04:29.395505 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b78ed07439930e7ccaf1ce98cce93c09660ec8cdd91dde9db0c9f00f6d694807"} err="failed to get container status \"b78ed07439930e7ccaf1ce98cce93c09660ec8cdd91dde9db0c9f00f6d694807\": rpc error: code = NotFound desc = could not find container \"b78ed07439930e7ccaf1ce98cce93c09660ec8cdd91dde9db0c9f00f6d694807\": container with ID starting with b78ed07439930e7ccaf1ce98cce93c09660ec8cdd91dde9db0c9f00f6d694807 not found: ID does not exist" Jan 22 14:04:29 crc kubenswrapper[4743]: I0122 14:04:29.395537 4743 scope.go:117] "RemoveContainer" containerID="63ddeda3d533b73b10968d70c6d9fa7330075e705981727d886e45592a5a221c" Jan 22 14:04:29 crc kubenswrapper[4743]: E0122 14:04:29.397194 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63ddeda3d533b73b10968d70c6d9fa7330075e705981727d886e45592a5a221c\": container with ID starting with 63ddeda3d533b73b10968d70c6d9fa7330075e705981727d886e45592a5a221c not found: ID does not exist" containerID="63ddeda3d533b73b10968d70c6d9fa7330075e705981727d886e45592a5a221c" Jan 22 14:04:29 crc kubenswrapper[4743]: I0122 14:04:29.397227 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63ddeda3d533b73b10968d70c6d9fa7330075e705981727d886e45592a5a221c"} err="failed to get container status \"63ddeda3d533b73b10968d70c6d9fa7330075e705981727d886e45592a5a221c\": rpc error: code = NotFound desc = could not find container \"63ddeda3d533b73b10968d70c6d9fa7330075e705981727d886e45592a5a221c\": container with ID starting with 63ddeda3d533b73b10968d70c6d9fa7330075e705981727d886e45592a5a221c not found: ID does not exist" Jan 22 14:04:29 crc kubenswrapper[4743]: I0122 14:04:29.764901 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c814013-0bfc-4734-8b4c-bfb1a5b4f54d" path="/var/lib/kubelet/pods/6c814013-0bfc-4734-8b4c-bfb1a5b4f54d/volumes" Jan 22 14:04:30 crc kubenswrapper[4743]: I0122 14:04:30.048962 4743 patch_prober.go:28] interesting pod/machine-config-daemon-hqgk7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 14:04:30 crc kubenswrapper[4743]: I0122 14:04:30.049027 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 14:04:30 crc kubenswrapper[4743]: I0122 14:04:30.049129 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" Jan 22 14:04:30 crc kubenswrapper[4743]: I0122 14:04:30.049948 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2da3d4972818f6459ed6dbf589006b8dd9ab9ee647f4c241b04d0ac146476324"} pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 22 14:04:30 crc kubenswrapper[4743]: I0122 14:04:30.050360 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerName="machine-config-daemon" containerID="cri-o://2da3d4972818f6459ed6dbf589006b8dd9ab9ee647f4c241b04d0ac146476324" gracePeriod=600 Jan 22 14:04:30 crc kubenswrapper[4743]: I0122 14:04:30.207150 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"dfe93c97-1c15-4f6e-a638-a9bee6b88f7b","Type":"ContainerStarted","Data":"494fa1da0458f7ff372eff5f4a4fd414086f1b2509a0971be1e4e742477d3b43"} Jan 22 14:04:30 crc kubenswrapper[4743]: I0122 14:04:30.207314 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="dfe93c97-1c15-4f6e-a638-a9bee6b88f7b" containerName="glance-log" containerID="cri-o://8755b8131e81b5bde6aaf646dc5c822513567d38ef2e07846da2d1718f1f9a51" gracePeriod=30 Jan 22 14:04:30 crc kubenswrapper[4743]: I0122 14:04:30.207323 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="dfe93c97-1c15-4f6e-a638-a9bee6b88f7b" containerName="glance-httpd" containerID="cri-o://494fa1da0458f7ff372eff5f4a4fd414086f1b2509a0971be1e4e742477d3b43" gracePeriod=30 Jan 22 14:04:30 crc kubenswrapper[4743]: I0122 14:04:30.210657 4743 generic.go:334] "Generic (PLEG): container finished" podID="4ebac6d9-df0f-41fe-bc73-8236847ff237" containerID="c21f41d3b2d4091b7aa24b83cec34f0c483edd38f0a2db502df9a85b60d4aec0" exitCode=0 Jan 22 14:04:30 crc kubenswrapper[4743]: I0122 14:04:30.210758 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-9t996" event={"ID":"4ebac6d9-df0f-41fe-bc73-8236847ff237","Type":"ContainerDied","Data":"c21f41d3b2d4091b7aa24b83cec34f0c483edd38f0a2db502df9a85b60d4aec0"} Jan 22 14:04:30 crc kubenswrapper[4743]: I0122 14:04:30.233919 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-mdc9s" event={"ID":"f840eb1a-7e3e-4aa4-acea-96cefb593807","Type":"ContainerStarted","Data":"139e1574369316387845d4206537c6c69143f809af2db271d28f8be12d0752f4"} Jan 22 14:04:30 crc kubenswrapper[4743]: I0122 14:04:30.234221 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-785d8bcb8c-mdc9s" Jan 22 14:04:30 crc kubenswrapper[4743]: I0122 14:04:30.240077 4743 generic.go:334] "Generic (PLEG): container finished" podID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerID="2da3d4972818f6459ed6dbf589006b8dd9ab9ee647f4c241b04d0ac146476324" exitCode=0 Jan 22 14:04:30 crc kubenswrapper[4743]: I0122 14:04:30.240136 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" 
event={"ID":"3aeba9ba-3a5a-4885-8540-d295aadb311b","Type":"ContainerDied","Data":"2da3d4972818f6459ed6dbf589006b8dd9ab9ee647f4c241b04d0ac146476324"} Jan 22 14:04:30 crc kubenswrapper[4743]: I0122 14:04:30.240295 4743 scope.go:117] "RemoveContainer" containerID="81047d739858b8f95f8165563bcec3db2c5fc125137b4bc67b44c536e91297dc" Jan 22 14:04:30 crc kubenswrapper[4743]: I0122 14:04:30.244077 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1b50d81f-9274-4e49-b27c-0452022c096a","Type":"ContainerStarted","Data":"c9ed1afa8c422ad0a37aa75497d29434fcced1b78471c012372038e7cdc768f4"} Jan 22 14:04:30 crc kubenswrapper[4743]: I0122 14:04:30.244363 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="1b50d81f-9274-4e49-b27c-0452022c096a" containerName="glance-log" containerID="cri-o://3ba0cf14795531f6bed1d070f6f5c9124107ccd684d3b7f8219ba23779fff312" gracePeriod=30 Jan 22 14:04:30 crc kubenswrapper[4743]: I0122 14:04:30.244460 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="1b50d81f-9274-4e49-b27c-0452022c096a" containerName="glance-httpd" containerID="cri-o://c9ed1afa8c422ad0a37aa75497d29434fcced1b78471c012372038e7cdc768f4" gracePeriod=30 Jan 22 14:04:30 crc kubenswrapper[4743]: I0122 14:04:30.254733 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=15.254704586999999 podStartE2EDuration="15.254704587s" podCreationTimestamp="2026-01-22 14:04:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:04:30.233505851 +0000 UTC m=+1106.788549034" watchObservedRunningTime="2026-01-22 14:04:30.254704587 +0000 UTC m=+1106.809747760" Jan 22 14:04:30 crc kubenswrapper[4743]: I0122 14:04:30.294370 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-785d8bcb8c-mdc9s" podStartSLOduration=15.294349649 podStartE2EDuration="15.294349649s" podCreationTimestamp="2026-01-22 14:04:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:04:30.272878686 +0000 UTC m=+1106.827921859" watchObservedRunningTime="2026-01-22 14:04:30.294349649 +0000 UTC m=+1106.849392812" Jan 22 14:04:30 crc kubenswrapper[4743]: I0122 14:04:30.343767 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-999bfcdc8-ldzdp" Jan 22 14:04:30 crc kubenswrapper[4743]: I0122 14:04:30.343852 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-999bfcdc8-ldzdp" Jan 22 14:04:30 crc kubenswrapper[4743]: I0122 14:04:30.445420 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-b7fb54dc6-5q9jf" Jan 22 14:04:30 crc kubenswrapper[4743]: I0122 14:04:30.445841 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-b7fb54dc6-5q9jf" Jan 22 14:04:30 crc kubenswrapper[4743]: I0122 14:04:30.586426 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-fqwwj" Jan 22 14:04:30 crc kubenswrapper[4743]: I0122 14:04:30.610615 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=15.610599323 podStartE2EDuration="15.610599323s" podCreationTimestamp="2026-01-22 14:04:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:04:30.293651711 +0000 UTC m=+1106.848694894" watchObservedRunningTime="2026-01-22 14:04:30.610599323 +0000 UTC m=+1107.165642486" Jan 22 14:04:30 crc kubenswrapper[4743]: I0122 14:04:30.696261 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8bd2850-37f2-40c9-aeb5-365158ca9716-combined-ca-bundle\") pod \"b8bd2850-37f2-40c9-aeb5-365158ca9716\" (UID: \"b8bd2850-37f2-40c9-aeb5-365158ca9716\") " Jan 22 14:04:30 crc kubenswrapper[4743]: I0122 14:04:30.696370 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5l2p\" (UniqueName: \"kubernetes.io/projected/b8bd2850-37f2-40c9-aeb5-365158ca9716-kube-api-access-s5l2p\") pod \"b8bd2850-37f2-40c9-aeb5-365158ca9716\" (UID: \"b8bd2850-37f2-40c9-aeb5-365158ca9716\") " Jan 22 14:04:30 crc kubenswrapper[4743]: I0122 14:04:30.696664 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b8bd2850-37f2-40c9-aeb5-365158ca9716-config\") pod \"b8bd2850-37f2-40c9-aeb5-365158ca9716\" (UID: \"b8bd2850-37f2-40c9-aeb5-365158ca9716\") " Jan 22 14:04:30 crc kubenswrapper[4743]: I0122 14:04:30.732704 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8bd2850-37f2-40c9-aeb5-365158ca9716-kube-api-access-s5l2p" (OuterVolumeSpecName: "kube-api-access-s5l2p") pod "b8bd2850-37f2-40c9-aeb5-365158ca9716" (UID: "b8bd2850-37f2-40c9-aeb5-365158ca9716"). InnerVolumeSpecName "kube-api-access-s5l2p". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:04:30 crc kubenswrapper[4743]: I0122 14:04:30.749265 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8bd2850-37f2-40c9-aeb5-365158ca9716-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b8bd2850-37f2-40c9-aeb5-365158ca9716" (UID: "b8bd2850-37f2-40c9-aeb5-365158ca9716"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:30 crc kubenswrapper[4743]: I0122 14:04:30.753542 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8bd2850-37f2-40c9-aeb5-365158ca9716-config" (OuterVolumeSpecName: "config") pod "b8bd2850-37f2-40c9-aeb5-365158ca9716" (UID: "b8bd2850-37f2-40c9-aeb5-365158ca9716"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:30 crc kubenswrapper[4743]: I0122 14:04:30.812930 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8bd2850-37f2-40c9-aeb5-365158ca9716-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:30 crc kubenswrapper[4743]: I0122 14:04:30.812956 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5l2p\" (UniqueName: \"kubernetes.io/projected/b8bd2850-37f2-40c9-aeb5-365158ca9716-kube-api-access-s5l2p\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:30 crc kubenswrapper[4743]: I0122 14:04:30.812968 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/b8bd2850-37f2-40c9-aeb5-365158ca9716-config\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:30 crc kubenswrapper[4743]: I0122 14:04:30.935563 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.017669 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9vmn\" (UniqueName: \"kubernetes.io/projected/dfe93c97-1c15-4f6e-a638-a9bee6b88f7b-kube-api-access-d9vmn\") pod \"dfe93c97-1c15-4f6e-a638-a9bee6b88f7b\" (UID: \"dfe93c97-1c15-4f6e-a638-a9bee6b88f7b\") " Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.017741 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfe93c97-1c15-4f6e-a638-a9bee6b88f7b-config-data\") pod \"dfe93c97-1c15-4f6e-a638-a9bee6b88f7b\" (UID: \"dfe93c97-1c15-4f6e-a638-a9bee6b88f7b\") " Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.017836 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dfe93c97-1c15-4f6e-a638-a9bee6b88f7b-httpd-run\") pod \"dfe93c97-1c15-4f6e-a638-a9bee6b88f7b\" (UID: \"dfe93c97-1c15-4f6e-a638-a9bee6b88f7b\") " Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.017864 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfe93c97-1c15-4f6e-a638-a9bee6b88f7b-combined-ca-bundle\") pod \"dfe93c97-1c15-4f6e-a638-a9bee6b88f7b\" (UID: \"dfe93c97-1c15-4f6e-a638-a9bee6b88f7b\") " Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.017917 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"dfe93c97-1c15-4f6e-a638-a9bee6b88f7b\" (UID: \"dfe93c97-1c15-4f6e-a638-a9bee6b88f7b\") " Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.017985 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dfe93c97-1c15-4f6e-a638-a9bee6b88f7b-logs\") pod \"dfe93c97-1c15-4f6e-a638-a9bee6b88f7b\" (UID: \"dfe93c97-1c15-4f6e-a638-a9bee6b88f7b\") " Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.018009 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfe93c97-1c15-4f6e-a638-a9bee6b88f7b-scripts\") pod \"dfe93c97-1c15-4f6e-a638-a9bee6b88f7b\" (UID: \"dfe93c97-1c15-4f6e-a638-a9bee6b88f7b\") " Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.020069 4743 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dfe93c97-1c15-4f6e-a638-a9bee6b88f7b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "dfe93c97-1c15-4f6e-a638-a9bee6b88f7b" (UID: "dfe93c97-1c15-4f6e-a638-a9bee6b88f7b"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.023082 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfe93c97-1c15-4f6e-a638-a9bee6b88f7b-scripts" (OuterVolumeSpecName: "scripts") pod "dfe93c97-1c15-4f6e-a638-a9bee6b88f7b" (UID: "dfe93c97-1c15-4f6e-a638-a9bee6b88f7b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.023596 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfe93c97-1c15-4f6e-a638-a9bee6b88f7b-kube-api-access-d9vmn" (OuterVolumeSpecName: "kube-api-access-d9vmn") pod "dfe93c97-1c15-4f6e-a638-a9bee6b88f7b" (UID: "dfe93c97-1c15-4f6e-a638-a9bee6b88f7b"). InnerVolumeSpecName "kube-api-access-d9vmn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.024546 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dfe93c97-1c15-4f6e-a638-a9bee6b88f7b-logs" (OuterVolumeSpecName: "logs") pod "dfe93c97-1c15-4f6e-a638-a9bee6b88f7b" (UID: "dfe93c97-1c15-4f6e-a638-a9bee6b88f7b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.031704 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "dfe93c97-1c15-4f6e-a638-a9bee6b88f7b" (UID: "dfe93c97-1c15-4f6e-a638-a9bee6b88f7b"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.065473 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfe93c97-1c15-4f6e-a638-a9bee6b88f7b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dfe93c97-1c15-4f6e-a638-a9bee6b88f7b" (UID: "dfe93c97-1c15-4f6e-a638-a9bee6b88f7b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.095242 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfe93c97-1c15-4f6e-a638-a9bee6b88f7b-config-data" (OuterVolumeSpecName: "config-data") pod "dfe93c97-1c15-4f6e-a638-a9bee6b88f7b" (UID: "dfe93c97-1c15-4f6e-a638-a9bee6b88f7b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.120082 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9vmn\" (UniqueName: \"kubernetes.io/projected/dfe93c97-1c15-4f6e-a638-a9bee6b88f7b-kube-api-access-d9vmn\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.120116 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfe93c97-1c15-4f6e-a638-a9bee6b88f7b-config-data\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.120128 4743 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dfe93c97-1c15-4f6e-a638-a9bee6b88f7b-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.120141 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfe93c97-1c15-4f6e-a638-a9bee6b88f7b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.120185 4743 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.120204 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dfe93c97-1c15-4f6e-a638-a9bee6b88f7b-logs\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.120221 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfe93c97-1c15-4f6e-a638-a9bee6b88f7b-scripts\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.167700 4743 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.221624 4743 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.273405 4743 generic.go:334] "Generic (PLEG): container finished" podID="dfe93c97-1c15-4f6e-a638-a9bee6b88f7b" containerID="494fa1da0458f7ff372eff5f4a4fd414086f1b2509a0971be1e4e742477d3b43" exitCode=0 Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.273456 4743 generic.go:334] "Generic (PLEG): container finished" podID="dfe93c97-1c15-4f6e-a638-a9bee6b88f7b" containerID="8755b8131e81b5bde6aaf646dc5c822513567d38ef2e07846da2d1718f1f9a51" exitCode=143 Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.273612 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.275032 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"dfe93c97-1c15-4f6e-a638-a9bee6b88f7b","Type":"ContainerDied","Data":"494fa1da0458f7ff372eff5f4a4fd414086f1b2509a0971be1e4e742477d3b43"} Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.275083 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"dfe93c97-1c15-4f6e-a638-a9bee6b88f7b","Type":"ContainerDied","Data":"8755b8131e81b5bde6aaf646dc5c822513567d38ef2e07846da2d1718f1f9a51"} Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.275099 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"dfe93c97-1c15-4f6e-a638-a9bee6b88f7b","Type":"ContainerDied","Data":"3a7aa56bf644d039c0bb6658a0b7bd66d1c872f8683c6d8a7082b14166d31ce4"} Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.275120 4743 scope.go:117] "RemoveContainer" containerID="494fa1da0458f7ff372eff5f4a4fd414086f1b2509a0971be1e4e742477d3b43" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.286945 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-fqwwj" event={"ID":"b8bd2850-37f2-40c9-aeb5-365158ca9716","Type":"ContainerDied","Data":"7339994d44d5afbc0dc266eeba57adf993f0d231c1885d7855c0fc1bcc19ca37"} Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.286984 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7339994d44d5afbc0dc266eeba57adf993f0d231c1885d7855c0fc1bcc19ca37" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.287121 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-fqwwj" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.292903 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" event={"ID":"3aeba9ba-3a5a-4885-8540-d295aadb311b","Type":"ContainerStarted","Data":"c01bf0abae2b92d5822357a1785b503f9bc33cb24f77d7df7d49f837030ef253"} Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.296094 4743 generic.go:334] "Generic (PLEG): container finished" podID="1b50d81f-9274-4e49-b27c-0452022c096a" containerID="c9ed1afa8c422ad0a37aa75497d29434fcced1b78471c012372038e7cdc768f4" exitCode=0 Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.296121 4743 generic.go:334] "Generic (PLEG): container finished" podID="1b50d81f-9274-4e49-b27c-0452022c096a" containerID="3ba0cf14795531f6bed1d070f6f5c9124107ccd684d3b7f8219ba23779fff312" exitCode=143 Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.296291 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1b50d81f-9274-4e49-b27c-0452022c096a","Type":"ContainerDied","Data":"c9ed1afa8c422ad0a37aa75497d29434fcced1b78471c012372038e7cdc768f4"} Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.296322 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1b50d81f-9274-4e49-b27c-0452022c096a","Type":"ContainerDied","Data":"3ba0cf14795531f6bed1d070f6f5c9124107ccd684d3b7f8219ba23779fff312"} Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.387073 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.392384 4743 scope.go:117] "RemoveContainer" containerID="8755b8131e81b5bde6aaf646dc5c822513567d38ef2e07846da2d1718f1f9a51" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.407426 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.429506 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 22 14:04:31 crc kubenswrapper[4743]: E0122 14:04:31.430282 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfe93c97-1c15-4f6e-a638-a9bee6b88f7b" containerName="glance-log" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.430302 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfe93c97-1c15-4f6e-a638-a9bee6b88f7b" containerName="glance-log" Jan 22 14:04:31 crc kubenswrapper[4743]: E0122 14:04:31.430314 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c814013-0bfc-4734-8b4c-bfb1a5b4f54d" containerName="init" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.430320 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c814013-0bfc-4734-8b4c-bfb1a5b4f54d" containerName="init" Jan 22 14:04:31 crc kubenswrapper[4743]: E0122 14:04:31.430335 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c814013-0bfc-4734-8b4c-bfb1a5b4f54d" containerName="dnsmasq-dns" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.430342 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c814013-0bfc-4734-8b4c-bfb1a5b4f54d" containerName="dnsmasq-dns" Jan 22 14:04:31 crc kubenswrapper[4743]: E0122 14:04:31.430373 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfe93c97-1c15-4f6e-a638-a9bee6b88f7b" 
containerName="glance-httpd" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.430383 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfe93c97-1c15-4f6e-a638-a9bee6b88f7b" containerName="glance-httpd" Jan 22 14:04:31 crc kubenswrapper[4743]: E0122 14:04:31.430395 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8bd2850-37f2-40c9-aeb5-365158ca9716" containerName="neutron-db-sync" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.430404 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8bd2850-37f2-40c9-aeb5-365158ca9716" containerName="neutron-db-sync" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.430601 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c814013-0bfc-4734-8b4c-bfb1a5b4f54d" containerName="dnsmasq-dns" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.430634 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfe93c97-1c15-4f6e-a638-a9bee6b88f7b" containerName="glance-httpd" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.430655 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8bd2850-37f2-40c9-aeb5-365158ca9716" containerName="neutron-db-sync" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.430687 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfe93c97-1c15-4f6e-a638-a9bee6b88f7b" containerName="glance-log" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.431579 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.455198 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-mdc9s"] Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.459982 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.460453 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.494024 4743 scope.go:117] "RemoveContainer" containerID="494fa1da0458f7ff372eff5f4a4fd414086f1b2509a0971be1e4e742477d3b43" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.495121 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 22 14:04:31 crc kubenswrapper[4743]: E0122 14:04:31.498959 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"494fa1da0458f7ff372eff5f4a4fd414086f1b2509a0971be1e4e742477d3b43\": container with ID starting with 494fa1da0458f7ff372eff5f4a4fd414086f1b2509a0971be1e4e742477d3b43 not found: ID does not exist" containerID="494fa1da0458f7ff372eff5f4a4fd414086f1b2509a0971be1e4e742477d3b43" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.499017 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"494fa1da0458f7ff372eff5f4a4fd414086f1b2509a0971be1e4e742477d3b43"} err="failed to get container status \"494fa1da0458f7ff372eff5f4a4fd414086f1b2509a0971be1e4e742477d3b43\": rpc error: code = NotFound desc = could not find container \"494fa1da0458f7ff372eff5f4a4fd414086f1b2509a0971be1e4e742477d3b43\": container with ID starting with 494fa1da0458f7ff372eff5f4a4fd414086f1b2509a0971be1e4e742477d3b43 not found: ID does not exist" Jan 22 14:04:31 crc 
kubenswrapper[4743]: I0122 14:04:31.499051 4743 scope.go:117] "RemoveContainer" containerID="8755b8131e81b5bde6aaf646dc5c822513567d38ef2e07846da2d1718f1f9a51" Jan 22 14:04:31 crc kubenswrapper[4743]: E0122 14:04:31.506681 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8755b8131e81b5bde6aaf646dc5c822513567d38ef2e07846da2d1718f1f9a51\": container with ID starting with 8755b8131e81b5bde6aaf646dc5c822513567d38ef2e07846da2d1718f1f9a51 not found: ID does not exist" containerID="8755b8131e81b5bde6aaf646dc5c822513567d38ef2e07846da2d1718f1f9a51" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.506736 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8755b8131e81b5bde6aaf646dc5c822513567d38ef2e07846da2d1718f1f9a51"} err="failed to get container status \"8755b8131e81b5bde6aaf646dc5c822513567d38ef2e07846da2d1718f1f9a51\": rpc error: code = NotFound desc = could not find container \"8755b8131e81b5bde6aaf646dc5c822513567d38ef2e07846da2d1718f1f9a51\": container with ID starting with 8755b8131e81b5bde6aaf646dc5c822513567d38ef2e07846da2d1718f1f9a51 not found: ID does not exist" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.506766 4743 scope.go:117] "RemoveContainer" containerID="494fa1da0458f7ff372eff5f4a4fd414086f1b2509a0971be1e4e742477d3b43" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.508458 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-rsqh8"] Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.509982 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-rsqh8" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.516887 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"494fa1da0458f7ff372eff5f4a4fd414086f1b2509a0971be1e4e742477d3b43"} err="failed to get container status \"494fa1da0458f7ff372eff5f4a4fd414086f1b2509a0971be1e4e742477d3b43\": rpc error: code = NotFound desc = could not find container \"494fa1da0458f7ff372eff5f4a4fd414086f1b2509a0971be1e4e742477d3b43\": container with ID starting with 494fa1da0458f7ff372eff5f4a4fd414086f1b2509a0971be1e4e742477d3b43 not found: ID does not exist" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.517062 4743 scope.go:117] "RemoveContainer" containerID="8755b8131e81b5bde6aaf646dc5c822513567d38ef2e07846da2d1718f1f9a51" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.519919 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8755b8131e81b5bde6aaf646dc5c822513567d38ef2e07846da2d1718f1f9a51"} err="failed to get container status \"8755b8131e81b5bde6aaf646dc5c822513567d38ef2e07846da2d1718f1f9a51\": rpc error: code = NotFound desc = could not find container \"8755b8131e81b5bde6aaf646dc5c822513567d38ef2e07846da2d1718f1f9a51\": container with ID starting with 8755b8131e81b5bde6aaf646dc5c822513567d38ef2e07846da2d1718f1f9a51 not found: ID does not exist" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.569397 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-rsqh8"] Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.634427 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49g9s\" (UniqueName: 
\"kubernetes.io/projected/eefba4cb-766b-45a4-b832-83c9ef83a30b-kube-api-access-49g9s\") pod \"glance-default-internal-api-0\" (UID: \"eefba4cb-766b-45a4-b832-83c9ef83a30b\") " pod="openstack/glance-default-internal-api-0" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.634472 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eefba4cb-766b-45a4-b832-83c9ef83a30b-logs\") pod \"glance-default-internal-api-0\" (UID: \"eefba4cb-766b-45a4-b832-83c9ef83a30b\") " pod="openstack/glance-default-internal-api-0" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.634509 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eefba4cb-766b-45a4-b832-83c9ef83a30b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"eefba4cb-766b-45a4-b832-83c9ef83a30b\") " pod="openstack/glance-default-internal-api-0" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.634529 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/457d7aae-7205-4d41-bf26-e0defaa8d96c-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-rsqh8\" (UID: \"457d7aae-7205-4d41-bf26-e0defaa8d96c\") " pod="openstack/dnsmasq-dns-55f844cf75-rsqh8" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.634557 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/457d7aae-7205-4d41-bf26-e0defaa8d96c-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-rsqh8\" (UID: \"457d7aae-7205-4d41-bf26-e0defaa8d96c\") " pod="openstack/dnsmasq-dns-55f844cf75-rsqh8" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.634573 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eefba4cb-766b-45a4-b832-83c9ef83a30b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"eefba4cb-766b-45a4-b832-83c9ef83a30b\") " pod="openstack/glance-default-internal-api-0" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.634589 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eefba4cb-766b-45a4-b832-83c9ef83a30b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"eefba4cb-766b-45a4-b832-83c9ef83a30b\") " pod="openstack/glance-default-internal-api-0" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.634605 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/457d7aae-7205-4d41-bf26-e0defaa8d96c-dns-svc\") pod \"dnsmasq-dns-55f844cf75-rsqh8\" (UID: \"457d7aae-7205-4d41-bf26-e0defaa8d96c\") " pod="openstack/dnsmasq-dns-55f844cf75-rsqh8" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.634627 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"eefba4cb-766b-45a4-b832-83c9ef83a30b\") " pod="openstack/glance-default-internal-api-0" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.634654 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eefba4cb-766b-45a4-b832-83c9ef83a30b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"eefba4cb-766b-45a4-b832-83c9ef83a30b\") " pod="openstack/glance-default-internal-api-0" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.634671 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eefba4cb-766b-45a4-b832-83c9ef83a30b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"eefba4cb-766b-45a4-b832-83c9ef83a30b\") " pod="openstack/glance-default-internal-api-0" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.634704 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wcx8\" (UniqueName: \"kubernetes.io/projected/457d7aae-7205-4d41-bf26-e0defaa8d96c-kube-api-access-2wcx8\") pod \"dnsmasq-dns-55f844cf75-rsqh8\" (UID: \"457d7aae-7205-4d41-bf26-e0defaa8d96c\") " pod="openstack/dnsmasq-dns-55f844cf75-rsqh8" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.634733 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/457d7aae-7205-4d41-bf26-e0defaa8d96c-config\") pod \"dnsmasq-dns-55f844cf75-rsqh8\" (UID: \"457d7aae-7205-4d41-bf26-e0defaa8d96c\") " pod="openstack/dnsmasq-dns-55f844cf75-rsqh8" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.634750 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/457d7aae-7205-4d41-bf26-e0defaa8d96c-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-rsqh8\" (UID: \"457d7aae-7205-4d41-bf26-e0defaa8d96c\") " pod="openstack/dnsmasq-dns-55f844cf75-rsqh8" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.659533 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7bd7ccdcfb-sx7wp"] Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.661228 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7bd7ccdcfb-sx7wp" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.668664 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.668699 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-k87rb" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.669443 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.669627 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.693296 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7bd7ccdcfb-sx7wp"] Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.735879 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49g9s\" (UniqueName: \"kubernetes.io/projected/eefba4cb-766b-45a4-b832-83c9ef83a30b-kube-api-access-49g9s\") pod \"glance-default-internal-api-0\" (UID: \"eefba4cb-766b-45a4-b832-83c9ef83a30b\") " pod="openstack/glance-default-internal-api-0" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.735933 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eefba4cb-766b-45a4-b832-83c9ef83a30b-logs\") pod \"glance-default-internal-api-0\" (UID: \"eefba4cb-766b-45a4-b832-83c9ef83a30b\") " pod="openstack/glance-default-internal-api-0" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.735982 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eefba4cb-766b-45a4-b832-83c9ef83a30b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"eefba4cb-766b-45a4-b832-83c9ef83a30b\") " pod="openstack/glance-default-internal-api-0" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.736013 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/457d7aae-7205-4d41-bf26-e0defaa8d96c-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-rsqh8\" (UID: \"457d7aae-7205-4d41-bf26-e0defaa8d96c\") " pod="openstack/dnsmasq-dns-55f844cf75-rsqh8" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.736055 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/457d7aae-7205-4d41-bf26-e0defaa8d96c-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-rsqh8\" (UID: \"457d7aae-7205-4d41-bf26-e0defaa8d96c\") " pod="openstack/dnsmasq-dns-55f844cf75-rsqh8" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.736082 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eefba4cb-766b-45a4-b832-83c9ef83a30b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"eefba4cb-766b-45a4-b832-83c9ef83a30b\") " pod="openstack/glance-default-internal-api-0" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.736103 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eefba4cb-766b-45a4-b832-83c9ef83a30b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: 
\"eefba4cb-766b-45a4-b832-83c9ef83a30b\") " pod="openstack/glance-default-internal-api-0" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.736128 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/457d7aae-7205-4d41-bf26-e0defaa8d96c-dns-svc\") pod \"dnsmasq-dns-55f844cf75-rsqh8\" (UID: \"457d7aae-7205-4d41-bf26-e0defaa8d96c\") " pod="openstack/dnsmasq-dns-55f844cf75-rsqh8" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.737767 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/457d7aae-7205-4d41-bf26-e0defaa8d96c-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-rsqh8\" (UID: \"457d7aae-7205-4d41-bf26-e0defaa8d96c\") " pod="openstack/dnsmasq-dns-55f844cf75-rsqh8" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.737836 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eefba4cb-766b-45a4-b832-83c9ef83a30b-logs\") pod \"glance-default-internal-api-0\" (UID: \"eefba4cb-766b-45a4-b832-83c9ef83a30b\") " pod="openstack/glance-default-internal-api-0" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.738408 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"eefba4cb-766b-45a4-b832-83c9ef83a30b\") " pod="openstack/glance-default-internal-api-0" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.738502 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eefba4cb-766b-45a4-b832-83c9ef83a30b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"eefba4cb-766b-45a4-b832-83c9ef83a30b\") " pod="openstack/glance-default-internal-api-0" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.738540 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eefba4cb-766b-45a4-b832-83c9ef83a30b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"eefba4cb-766b-45a4-b832-83c9ef83a30b\") " pod="openstack/glance-default-internal-api-0" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.738632 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wcx8\" (UniqueName: \"kubernetes.io/projected/457d7aae-7205-4d41-bf26-e0defaa8d96c-kube-api-access-2wcx8\") pod \"dnsmasq-dns-55f844cf75-rsqh8\" (UID: \"457d7aae-7205-4d41-bf26-e0defaa8d96c\") " pod="openstack/dnsmasq-dns-55f844cf75-rsqh8" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.738672 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/457d7aae-7205-4d41-bf26-e0defaa8d96c-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-rsqh8\" (UID: \"457d7aae-7205-4d41-bf26-e0defaa8d96c\") " pod="openstack/dnsmasq-dns-55f844cf75-rsqh8" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.738722 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/457d7aae-7205-4d41-bf26-e0defaa8d96c-config\") pod \"dnsmasq-dns-55f844cf75-rsqh8\" (UID: \"457d7aae-7205-4d41-bf26-e0defaa8d96c\") " pod="openstack/dnsmasq-dns-55f844cf75-rsqh8" Jan 22 14:04:31 crc kubenswrapper[4743]: 
I0122 14:04:31.738758 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/457d7aae-7205-4d41-bf26-e0defaa8d96c-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-rsqh8\" (UID: \"457d7aae-7205-4d41-bf26-e0defaa8d96c\") " pod="openstack/dnsmasq-dns-55f844cf75-rsqh8" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.739056 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"eefba4cb-766b-45a4-b832-83c9ef83a30b\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.740142 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eefba4cb-766b-45a4-b832-83c9ef83a30b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"eefba4cb-766b-45a4-b832-83c9ef83a30b\") " pod="openstack/glance-default-internal-api-0" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.743821 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/457d7aae-7205-4d41-bf26-e0defaa8d96c-dns-svc\") pod \"dnsmasq-dns-55f844cf75-rsqh8\" (UID: \"457d7aae-7205-4d41-bf26-e0defaa8d96c\") " pod="openstack/dnsmasq-dns-55f844cf75-rsqh8" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.746046 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/457d7aae-7205-4d41-bf26-e0defaa8d96c-config\") pod \"dnsmasq-dns-55f844cf75-rsqh8\" (UID: \"457d7aae-7205-4d41-bf26-e0defaa8d96c\") " pod="openstack/dnsmasq-dns-55f844cf75-rsqh8" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.751161 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eefba4cb-766b-45a4-b832-83c9ef83a30b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"eefba4cb-766b-45a4-b832-83c9ef83a30b\") " pod="openstack/glance-default-internal-api-0" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.751740 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/457d7aae-7205-4d41-bf26-e0defaa8d96c-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-rsqh8\" (UID: \"457d7aae-7205-4d41-bf26-e0defaa8d96c\") " pod="openstack/dnsmasq-dns-55f844cf75-rsqh8" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.757549 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eefba4cb-766b-45a4-b832-83c9ef83a30b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"eefba4cb-766b-45a4-b832-83c9ef83a30b\") " pod="openstack/glance-default-internal-api-0" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.758582 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eefba4cb-766b-45a4-b832-83c9ef83a30b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"eefba4cb-766b-45a4-b832-83c9ef83a30b\") " pod="openstack/glance-default-internal-api-0" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.760022 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/eefba4cb-766b-45a4-b832-83c9ef83a30b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"eefba4cb-766b-45a4-b832-83c9ef83a30b\") " pod="openstack/glance-default-internal-api-0" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.769448 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49g9s\" (UniqueName: \"kubernetes.io/projected/eefba4cb-766b-45a4-b832-83c9ef83a30b-kube-api-access-49g9s\") pod \"glance-default-internal-api-0\" (UID: \"eefba4cb-766b-45a4-b832-83c9ef83a30b\") " pod="openstack/glance-default-internal-api-0" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.788563 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"eefba4cb-766b-45a4-b832-83c9ef83a30b\") " pod="openstack/glance-default-internal-api-0" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.811715 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wcx8\" (UniqueName: \"kubernetes.io/projected/457d7aae-7205-4d41-bf26-e0defaa8d96c-kube-api-access-2wcx8\") pod \"dnsmasq-dns-55f844cf75-rsqh8\" (UID: \"457d7aae-7205-4d41-bf26-e0defaa8d96c\") " pod="openstack/dnsmasq-dns-55f844cf75-rsqh8" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.813691 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfe93c97-1c15-4f6e-a638-a9bee6b88f7b" path="/var/lib/kubelet/pods/dfe93c97-1c15-4f6e-a638-a9bee6b88f7b/volumes" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.858174 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-rsqh8" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.858841 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/de2c5f93-9ee3-4723-8123-bd48d5385423-ovndb-tls-certs\") pod \"neutron-7bd7ccdcfb-sx7wp\" (UID: \"de2c5f93-9ee3-4723-8123-bd48d5385423\") " pod="openstack/neutron-7bd7ccdcfb-sx7wp" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.858893 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/de2c5f93-9ee3-4723-8123-bd48d5385423-httpd-config\") pod \"neutron-7bd7ccdcfb-sx7wp\" (UID: \"de2c5f93-9ee3-4723-8123-bd48d5385423\") " pod="openstack/neutron-7bd7ccdcfb-sx7wp" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.858932 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkqgw\" (UniqueName: \"kubernetes.io/projected/de2c5f93-9ee3-4723-8123-bd48d5385423-kube-api-access-rkqgw\") pod \"neutron-7bd7ccdcfb-sx7wp\" (UID: \"de2c5f93-9ee3-4723-8123-bd48d5385423\") " pod="openstack/neutron-7bd7ccdcfb-sx7wp" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.858982 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de2c5f93-9ee3-4723-8123-bd48d5385423-combined-ca-bundle\") pod \"neutron-7bd7ccdcfb-sx7wp\" (UID: \"de2c5f93-9ee3-4723-8123-bd48d5385423\") " pod="openstack/neutron-7bd7ccdcfb-sx7wp" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.859012 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/de2c5f93-9ee3-4723-8123-bd48d5385423-config\") pod \"neutron-7bd7ccdcfb-sx7wp\" (UID: \"de2c5f93-9ee3-4723-8123-bd48d5385423\") " pod="openstack/neutron-7bd7ccdcfb-sx7wp" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.964143 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de2c5f93-9ee3-4723-8123-bd48d5385423-combined-ca-bundle\") pod \"neutron-7bd7ccdcfb-sx7wp\" (UID: \"de2c5f93-9ee3-4723-8123-bd48d5385423\") " pod="openstack/neutron-7bd7ccdcfb-sx7wp" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.964212 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/de2c5f93-9ee3-4723-8123-bd48d5385423-config\") pod \"neutron-7bd7ccdcfb-sx7wp\" (UID: \"de2c5f93-9ee3-4723-8123-bd48d5385423\") " pod="openstack/neutron-7bd7ccdcfb-sx7wp" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.964341 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/de2c5f93-9ee3-4723-8123-bd48d5385423-ovndb-tls-certs\") pod \"neutron-7bd7ccdcfb-sx7wp\" (UID: \"de2c5f93-9ee3-4723-8123-bd48d5385423\") " pod="openstack/neutron-7bd7ccdcfb-sx7wp" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.964385 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/de2c5f93-9ee3-4723-8123-bd48d5385423-httpd-config\") pod \"neutron-7bd7ccdcfb-sx7wp\" (UID: \"de2c5f93-9ee3-4723-8123-bd48d5385423\") " pod="openstack/neutron-7bd7ccdcfb-sx7wp" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.964424 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkqgw\" (UniqueName: \"kubernetes.io/projected/de2c5f93-9ee3-4723-8123-bd48d5385423-kube-api-access-rkqgw\") pod \"neutron-7bd7ccdcfb-sx7wp\" (UID: \"de2c5f93-9ee3-4723-8123-bd48d5385423\") " pod="openstack/neutron-7bd7ccdcfb-sx7wp" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.982624 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/de2c5f93-9ee3-4723-8123-bd48d5385423-httpd-config\") pod \"neutron-7bd7ccdcfb-sx7wp\" (UID: \"de2c5f93-9ee3-4723-8123-bd48d5385423\") " pod="openstack/neutron-7bd7ccdcfb-sx7wp" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.982695 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de2c5f93-9ee3-4723-8123-bd48d5385423-combined-ca-bundle\") pod \"neutron-7bd7ccdcfb-sx7wp\" (UID: \"de2c5f93-9ee3-4723-8123-bd48d5385423\") " pod="openstack/neutron-7bd7ccdcfb-sx7wp" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.996597 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/de2c5f93-9ee3-4723-8123-bd48d5385423-ovndb-tls-certs\") pod \"neutron-7bd7ccdcfb-sx7wp\" (UID: \"de2c5f93-9ee3-4723-8123-bd48d5385423\") " pod="openstack/neutron-7bd7ccdcfb-sx7wp" Jan 22 14:04:31 crc kubenswrapper[4743]: I0122 14:04:31.997497 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/de2c5f93-9ee3-4723-8123-bd48d5385423-config\") pod 
\"neutron-7bd7ccdcfb-sx7wp\" (UID: \"de2c5f93-9ee3-4723-8123-bd48d5385423\") " pod="openstack/neutron-7bd7ccdcfb-sx7wp" Jan 22 14:04:32 crc kubenswrapper[4743]: I0122 14:04:32.035503 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkqgw\" (UniqueName: \"kubernetes.io/projected/de2c5f93-9ee3-4723-8123-bd48d5385423-kube-api-access-rkqgw\") pod \"neutron-7bd7ccdcfb-sx7wp\" (UID: \"de2c5f93-9ee3-4723-8123-bd48d5385423\") " pod="openstack/neutron-7bd7ccdcfb-sx7wp" Jan 22 14:04:32 crc kubenswrapper[4743]: I0122 14:04:32.088238 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 22 14:04:32 crc kubenswrapper[4743]: I0122 14:04:32.286996 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7bd7ccdcfb-sx7wp" Jan 22 14:04:32 crc kubenswrapper[4743]: I0122 14:04:32.315739 4743 generic.go:334] "Generic (PLEG): container finished" podID="cbe93f39-887c-4949-9e78-1047998f8aff" containerID="c871dc8973ea2e91fd44264fe34f88f81d8c63e19143a693c0892bad127fbe86" exitCode=0 Jan 22 14:04:32 crc kubenswrapper[4743]: I0122 14:04:32.316240 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-785d8bcb8c-mdc9s" podUID="f840eb1a-7e3e-4aa4-acea-96cefb593807" containerName="dnsmasq-dns" containerID="cri-o://139e1574369316387845d4206537c6c69143f809af2db271d28f8be12d0752f4" gracePeriod=10 Jan 22 14:04:32 crc kubenswrapper[4743]: I0122 14:04:32.317150 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vfzrn" event={"ID":"cbe93f39-887c-4949-9e78-1047998f8aff","Type":"ContainerDied","Data":"c871dc8973ea2e91fd44264fe34f88f81d8c63e19143a693c0892bad127fbe86"} Jan 22 14:04:33 crc kubenswrapper[4743]: I0122 14:04:33.326173 4743 generic.go:334] "Generic (PLEG): container finished" podID="846c118f-23c1-402f-8747-633485e743c9" containerID="cc13199c0a241e6bacabaf8f4c251225db6761a2f3d5aaee1461d5ab8520ddbe" exitCode=0 Jan 22 14:04:33 crc kubenswrapper[4743]: I0122 14:04:33.326250 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-tcdjz" event={"ID":"846c118f-23c1-402f-8747-633485e743c9","Type":"ContainerDied","Data":"cc13199c0a241e6bacabaf8f4c251225db6761a2f3d5aaee1461d5ab8520ddbe"} Jan 22 14:04:33 crc kubenswrapper[4743]: I0122 14:04:33.330928 4743 generic.go:334] "Generic (PLEG): container finished" podID="f840eb1a-7e3e-4aa4-acea-96cefb593807" containerID="139e1574369316387845d4206537c6c69143f809af2db271d28f8be12d0752f4" exitCode=0 Jan 22 14:04:33 crc kubenswrapper[4743]: I0122 14:04:33.330993 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-mdc9s" event={"ID":"f840eb1a-7e3e-4aa4-acea-96cefb593807","Type":"ContainerDied","Data":"139e1574369316387845d4206537c6c69143f809af2db271d28f8be12d0752f4"} Jan 22 14:04:33 crc kubenswrapper[4743]: I0122 14:04:33.535937 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-56bdc765bf-xnb2x"] Jan 22 14:04:33 crc kubenswrapper[4743]: I0122 14:04:33.537673 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-56bdc765bf-xnb2x" Jan 22 14:04:33 crc kubenswrapper[4743]: I0122 14:04:33.543214 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Jan 22 14:04:33 crc kubenswrapper[4743]: I0122 14:04:33.543832 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Jan 22 14:04:33 crc kubenswrapper[4743]: I0122 14:04:33.555553 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-56bdc765bf-xnb2x"] Jan 22 14:04:33 crc kubenswrapper[4743]: I0122 14:04:33.618383 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8bsg\" (UniqueName: \"kubernetes.io/projected/b8ee30f9-a6ed-4aa2-b834-facef3c284fe-kube-api-access-d8bsg\") pod \"neutron-56bdc765bf-xnb2x\" (UID: \"b8ee30f9-a6ed-4aa2-b834-facef3c284fe\") " pod="openstack/neutron-56bdc765bf-xnb2x" Jan 22 14:04:33 crc kubenswrapper[4743]: I0122 14:04:33.618549 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8ee30f9-a6ed-4aa2-b834-facef3c284fe-internal-tls-certs\") pod \"neutron-56bdc765bf-xnb2x\" (UID: \"b8ee30f9-a6ed-4aa2-b834-facef3c284fe\") " pod="openstack/neutron-56bdc765bf-xnb2x" Jan 22 14:04:33 crc kubenswrapper[4743]: I0122 14:04:33.618611 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8ee30f9-a6ed-4aa2-b834-facef3c284fe-public-tls-certs\") pod \"neutron-56bdc765bf-xnb2x\" (UID: \"b8ee30f9-a6ed-4aa2-b834-facef3c284fe\") " pod="openstack/neutron-56bdc765bf-xnb2x" Jan 22 14:04:33 crc kubenswrapper[4743]: I0122 14:04:33.618644 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8ee30f9-a6ed-4aa2-b834-facef3c284fe-ovndb-tls-certs\") pod \"neutron-56bdc765bf-xnb2x\" (UID: \"b8ee30f9-a6ed-4aa2-b834-facef3c284fe\") " pod="openstack/neutron-56bdc765bf-xnb2x" Jan 22 14:04:33 crc kubenswrapper[4743]: I0122 14:04:33.618684 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b8ee30f9-a6ed-4aa2-b834-facef3c284fe-config\") pod \"neutron-56bdc765bf-xnb2x\" (UID: \"b8ee30f9-a6ed-4aa2-b834-facef3c284fe\") " pod="openstack/neutron-56bdc765bf-xnb2x" Jan 22 14:04:33 crc kubenswrapper[4743]: I0122 14:04:33.618754 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8ee30f9-a6ed-4aa2-b834-facef3c284fe-combined-ca-bundle\") pod \"neutron-56bdc765bf-xnb2x\" (UID: \"b8ee30f9-a6ed-4aa2-b834-facef3c284fe\") " pod="openstack/neutron-56bdc765bf-xnb2x" Jan 22 14:04:33 crc kubenswrapper[4743]: I0122 14:04:33.618938 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b8ee30f9-a6ed-4aa2-b834-facef3c284fe-httpd-config\") pod \"neutron-56bdc765bf-xnb2x\" (UID: \"b8ee30f9-a6ed-4aa2-b834-facef3c284fe\") " pod="openstack/neutron-56bdc765bf-xnb2x" Jan 22 14:04:33 crc kubenswrapper[4743]: I0122 14:04:33.720523 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b8ee30f9-a6ed-4aa2-b834-facef3c284fe-internal-tls-certs\") pod \"neutron-56bdc765bf-xnb2x\" (UID: \"b8ee30f9-a6ed-4aa2-b834-facef3c284fe\") " pod="openstack/neutron-56bdc765bf-xnb2x" Jan 22 14:04:33 crc kubenswrapper[4743]: I0122 14:04:33.720583 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8ee30f9-a6ed-4aa2-b834-facef3c284fe-public-tls-certs\") pod \"neutron-56bdc765bf-xnb2x\" (UID: \"b8ee30f9-a6ed-4aa2-b834-facef3c284fe\") " pod="openstack/neutron-56bdc765bf-xnb2x" Jan 22 14:04:33 crc kubenswrapper[4743]: I0122 14:04:33.720605 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8ee30f9-a6ed-4aa2-b834-facef3c284fe-ovndb-tls-certs\") pod \"neutron-56bdc765bf-xnb2x\" (UID: \"b8ee30f9-a6ed-4aa2-b834-facef3c284fe\") " pod="openstack/neutron-56bdc765bf-xnb2x" Jan 22 14:04:33 crc kubenswrapper[4743]: I0122 14:04:33.720629 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b8ee30f9-a6ed-4aa2-b834-facef3c284fe-config\") pod \"neutron-56bdc765bf-xnb2x\" (UID: \"b8ee30f9-a6ed-4aa2-b834-facef3c284fe\") " pod="openstack/neutron-56bdc765bf-xnb2x" Jan 22 14:04:33 crc kubenswrapper[4743]: I0122 14:04:33.720671 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8ee30f9-a6ed-4aa2-b834-facef3c284fe-combined-ca-bundle\") pod \"neutron-56bdc765bf-xnb2x\" (UID: \"b8ee30f9-a6ed-4aa2-b834-facef3c284fe\") " pod="openstack/neutron-56bdc765bf-xnb2x" Jan 22 14:04:33 crc kubenswrapper[4743]: I0122 14:04:33.720702 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b8ee30f9-a6ed-4aa2-b834-facef3c284fe-httpd-config\") pod \"neutron-56bdc765bf-xnb2x\" (UID: \"b8ee30f9-a6ed-4aa2-b834-facef3c284fe\") " pod="openstack/neutron-56bdc765bf-xnb2x" Jan 22 14:04:33 crc kubenswrapper[4743]: I0122 14:04:33.720745 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8bsg\" (UniqueName: \"kubernetes.io/projected/b8ee30f9-a6ed-4aa2-b834-facef3c284fe-kube-api-access-d8bsg\") pod \"neutron-56bdc765bf-xnb2x\" (UID: \"b8ee30f9-a6ed-4aa2-b834-facef3c284fe\") " pod="openstack/neutron-56bdc765bf-xnb2x" Jan 22 14:04:33 crc kubenswrapper[4743]: I0122 14:04:33.727753 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8ee30f9-a6ed-4aa2-b834-facef3c284fe-combined-ca-bundle\") pod \"neutron-56bdc765bf-xnb2x\" (UID: \"b8ee30f9-a6ed-4aa2-b834-facef3c284fe\") " pod="openstack/neutron-56bdc765bf-xnb2x" Jan 22 14:04:33 crc kubenswrapper[4743]: I0122 14:04:33.728882 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b8ee30f9-a6ed-4aa2-b834-facef3c284fe-httpd-config\") pod \"neutron-56bdc765bf-xnb2x\" (UID: \"b8ee30f9-a6ed-4aa2-b834-facef3c284fe\") " pod="openstack/neutron-56bdc765bf-xnb2x" Jan 22 14:04:33 crc kubenswrapper[4743]: I0122 14:04:33.729902 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8ee30f9-a6ed-4aa2-b834-facef3c284fe-public-tls-certs\") pod \"neutron-56bdc765bf-xnb2x\" (UID: 
\"b8ee30f9-a6ed-4aa2-b834-facef3c284fe\") " pod="openstack/neutron-56bdc765bf-xnb2x" Jan 22 14:04:33 crc kubenswrapper[4743]: I0122 14:04:33.729970 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8ee30f9-a6ed-4aa2-b834-facef3c284fe-internal-tls-certs\") pod \"neutron-56bdc765bf-xnb2x\" (UID: \"b8ee30f9-a6ed-4aa2-b834-facef3c284fe\") " pod="openstack/neutron-56bdc765bf-xnb2x" Jan 22 14:04:33 crc kubenswrapper[4743]: I0122 14:04:33.731822 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b8ee30f9-a6ed-4aa2-b834-facef3c284fe-config\") pod \"neutron-56bdc765bf-xnb2x\" (UID: \"b8ee30f9-a6ed-4aa2-b834-facef3c284fe\") " pod="openstack/neutron-56bdc765bf-xnb2x" Jan 22 14:04:33 crc kubenswrapper[4743]: I0122 14:04:33.732372 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8ee30f9-a6ed-4aa2-b834-facef3c284fe-ovndb-tls-certs\") pod \"neutron-56bdc765bf-xnb2x\" (UID: \"b8ee30f9-a6ed-4aa2-b834-facef3c284fe\") " pod="openstack/neutron-56bdc765bf-xnb2x" Jan 22 14:04:33 crc kubenswrapper[4743]: I0122 14:04:33.742840 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8bsg\" (UniqueName: \"kubernetes.io/projected/b8ee30f9-a6ed-4aa2-b834-facef3c284fe-kube-api-access-d8bsg\") pod \"neutron-56bdc765bf-xnb2x\" (UID: \"b8ee30f9-a6ed-4aa2-b834-facef3c284fe\") " pod="openstack/neutron-56bdc765bf-xnb2x" Jan 22 14:04:33 crc kubenswrapper[4743]: I0122 14:04:33.883686 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-56bdc765bf-xnb2x" Jan 22 14:04:35 crc kubenswrapper[4743]: I0122 14:04:35.790217 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-785d8bcb8c-mdc9s" podUID="f840eb1a-7e3e-4aa4-acea-96cefb593807" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.148:5353: connect: connection refused" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.284236 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-tcdjz" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.304077 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-vfzrn" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.375083 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-9t996" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.462856 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.464112 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-tcdjz" event={"ID":"846c118f-23c1-402f-8747-633485e743c9","Type":"ContainerDied","Data":"3013ff8fd45403382013effbbc5cbdacc91176ab9a4560fce316634a57528b1f"} Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.464153 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3013ff8fd45403382013effbbc5cbdacc91176ab9a4560fce316634a57528b1f" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.464231 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-tcdjz" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.474026 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-9t996" event={"ID":"4ebac6d9-df0f-41fe-bc73-8236847ff237","Type":"ContainerDied","Data":"6f328699c738936f75afcf4eba4fe74fde91cefdb86f53da239444c49f8893a5"} Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.474066 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f328699c738936f75afcf4eba4fe74fde91cefdb86f53da239444c49f8893a5" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.474152 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-9t996" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.479253 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xsll6\" (UniqueName: \"kubernetes.io/projected/cbe93f39-887c-4949-9e78-1047998f8aff-kube-api-access-xsll6\") pod \"cbe93f39-887c-4949-9e78-1047998f8aff\" (UID: \"cbe93f39-887c-4949-9e78-1047998f8aff\") " Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.479331 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/846c118f-23c1-402f-8747-633485e743c9-combined-ca-bundle\") pod \"846c118f-23c1-402f-8747-633485e743c9\" (UID: \"846c118f-23c1-402f-8747-633485e743c9\") " Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.479396 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cbe93f39-887c-4949-9e78-1047998f8aff-credential-keys\") pod \"cbe93f39-887c-4949-9e78-1047998f8aff\" (UID: \"cbe93f39-887c-4949-9e78-1047998f8aff\") " Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.479426 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ebac6d9-df0f-41fe-bc73-8236847ff237-config-data\") pod \"4ebac6d9-df0f-41fe-bc73-8236847ff237\" (UID: \"4ebac6d9-df0f-41fe-bc73-8236847ff237\") " Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.479452 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ssfqb\" (UniqueName: \"kubernetes.io/projected/846c118f-23c1-402f-8747-633485e743c9-kube-api-access-ssfqb\") pod \"846c118f-23c1-402f-8747-633485e743c9\" (UID: \"846c118f-23c1-402f-8747-633485e743c9\") " Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.479553 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbe93f39-887c-4949-9e78-1047998f8aff-combined-ca-bundle\") pod \"cbe93f39-887c-4949-9e78-1047998f8aff\" (UID: \"cbe93f39-887c-4949-9e78-1047998f8aff\") " Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.479576 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ebac6d9-df0f-41fe-bc73-8236847ff237-logs\") pod \"4ebac6d9-df0f-41fe-bc73-8236847ff237\" (UID: \"4ebac6d9-df0f-41fe-bc73-8236847ff237\") " Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.479651 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbe93f39-887c-4949-9e78-1047998f8aff-scripts\") pod \"cbe93f39-887c-4949-9e78-1047998f8aff\" (UID: 
\"cbe93f39-887c-4949-9e78-1047998f8aff\") " Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.479670 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbe93f39-887c-4949-9e78-1047998f8aff-config-data\") pod \"cbe93f39-887c-4949-9e78-1047998f8aff\" (UID: \"cbe93f39-887c-4949-9e78-1047998f8aff\") " Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.479703 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ebac6d9-df0f-41fe-bc73-8236847ff237-scripts\") pod \"4ebac6d9-df0f-41fe-bc73-8236847ff237\" (UID: \"4ebac6d9-df0f-41fe-bc73-8236847ff237\") " Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.479723 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/846c118f-23c1-402f-8747-633485e743c9-db-sync-config-data\") pod \"846c118f-23c1-402f-8747-633485e743c9\" (UID: \"846c118f-23c1-402f-8747-633485e743c9\") " Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.479765 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fw4t2\" (UniqueName: \"kubernetes.io/projected/4ebac6d9-df0f-41fe-bc73-8236847ff237-kube-api-access-fw4t2\") pod \"4ebac6d9-df0f-41fe-bc73-8236847ff237\" (UID: \"4ebac6d9-df0f-41fe-bc73-8236847ff237\") " Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.479784 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cbe93f39-887c-4949-9e78-1047998f8aff-fernet-keys\") pod \"cbe93f39-887c-4949-9e78-1047998f8aff\" (UID: \"cbe93f39-887c-4949-9e78-1047998f8aff\") " Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.479835 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ebac6d9-df0f-41fe-bc73-8236847ff237-combined-ca-bundle\") pod \"4ebac6d9-df0f-41fe-bc73-8236847ff237\" (UID: \"4ebac6d9-df0f-41fe-bc73-8236847ff237\") " Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.481516 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ebac6d9-df0f-41fe-bc73-8236847ff237-logs" (OuterVolumeSpecName: "logs") pod "4ebac6d9-df0f-41fe-bc73-8236847ff237" (UID: "4ebac6d9-df0f-41fe-bc73-8236847ff237"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.491302 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vfzrn" event={"ID":"cbe93f39-887c-4949-9e78-1047998f8aff","Type":"ContainerDied","Data":"6231a912f700a503eebd4cbcd277cca5b48d8685793033255fda9768a694b133"} Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.491921 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6231a912f700a503eebd4cbcd277cca5b48d8685793033255fda9768a694b133" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.492073 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ebac6d9-df0f-41fe-bc73-8236847ff237-scripts" (OuterVolumeSpecName: "scripts") pod "4ebac6d9-df0f-41fe-bc73-8236847ff237" (UID: "4ebac6d9-df0f-41fe-bc73-8236847ff237"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.493325 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-vfzrn" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.497758 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbe93f39-887c-4949-9e78-1047998f8aff-scripts" (OuterVolumeSpecName: "scripts") pod "cbe93f39-887c-4949-9e78-1047998f8aff" (UID: "cbe93f39-887c-4949-9e78-1047998f8aff"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.515166 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/846c118f-23c1-402f-8747-633485e743c9-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "846c118f-23c1-402f-8747-633485e743c9" (UID: "846c118f-23c1-402f-8747-633485e743c9"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.515469 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ebac6d9-df0f-41fe-bc73-8236847ff237-kube-api-access-fw4t2" (OuterVolumeSpecName: "kube-api-access-fw4t2") pod "4ebac6d9-df0f-41fe-bc73-8236847ff237" (UID: "4ebac6d9-df0f-41fe-bc73-8236847ff237"). InnerVolumeSpecName "kube-api-access-fw4t2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.518998 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbe93f39-887c-4949-9e78-1047998f8aff-kube-api-access-xsll6" (OuterVolumeSpecName: "kube-api-access-xsll6") pod "cbe93f39-887c-4949-9e78-1047998f8aff" (UID: "cbe93f39-887c-4949-9e78-1047998f8aff"). InnerVolumeSpecName "kube-api-access-xsll6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.522219 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbe93f39-887c-4949-9e78-1047998f8aff-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "cbe93f39-887c-4949-9e78-1047998f8aff" (UID: "cbe93f39-887c-4949-9e78-1047998f8aff"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.522319 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1b50d81f-9274-4e49-b27c-0452022c096a","Type":"ContainerDied","Data":"c6fc9d1c1e6189fdc33b86dc8d26f5272e9ca5d6bd53153b9e70208b413df9ae"} Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.522371 4743 scope.go:117] "RemoveContainer" containerID="c9ed1afa8c422ad0a37aa75497d29434fcced1b78471c012372038e7cdc768f4" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.522411 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.523033 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbe93f39-887c-4949-9e78-1047998f8aff-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "cbe93f39-887c-4949-9e78-1047998f8aff" (UID: "cbe93f39-887c-4949-9e78-1047998f8aff"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.527114 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/846c118f-23c1-402f-8747-633485e743c9-kube-api-access-ssfqb" (OuterVolumeSpecName: "kube-api-access-ssfqb") pod "846c118f-23c1-402f-8747-633485e743c9" (UID: "846c118f-23c1-402f-8747-633485e743c9"). InnerVolumeSpecName "kube-api-access-ssfqb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.553033 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbe93f39-887c-4949-9e78-1047998f8aff-config-data" (OuterVolumeSpecName: "config-data") pod "cbe93f39-887c-4949-9e78-1047998f8aff" (UID: "cbe93f39-887c-4949-9e78-1047998f8aff"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.557444 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-mdc9s" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.566733 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ebac6d9-df0f-41fe-bc73-8236847ff237-config-data" (OuterVolumeSpecName: "config-data") pod "4ebac6d9-df0f-41fe-bc73-8236847ff237" (UID: "4ebac6d9-df0f-41fe-bc73-8236847ff237"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.584399 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b50d81f-9274-4e49-b27c-0452022c096a-scripts\") pod \"1b50d81f-9274-4e49-b27c-0452022c096a\" (UID: \"1b50d81f-9274-4e49-b27c-0452022c096a\") " Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.584461 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"1b50d81f-9274-4e49-b27c-0452022c096a\" (UID: \"1b50d81f-9274-4e49-b27c-0452022c096a\") " Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.584485 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b50d81f-9274-4e49-b27c-0452022c096a-combined-ca-bundle\") pod \"1b50d81f-9274-4e49-b27c-0452022c096a\" (UID: \"1b50d81f-9274-4e49-b27c-0452022c096a\") " Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.584572 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b50d81f-9274-4e49-b27c-0452022c096a-logs\") pod \"1b50d81f-9274-4e49-b27c-0452022c096a\" (UID: \"1b50d81f-9274-4e49-b27c-0452022c096a\") " Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.584598 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfcc5\" (UniqueName: \"kubernetes.io/projected/1b50d81f-9274-4e49-b27c-0452022c096a-kube-api-access-rfcc5\") pod \"1b50d81f-9274-4e49-b27c-0452022c096a\" (UID: \"1b50d81f-9274-4e49-b27c-0452022c096a\") " Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.584614 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b50d81f-9274-4e49-b27c-0452022c096a-config-data\") pod 
\"1b50d81f-9274-4e49-b27c-0452022c096a\" (UID: \"1b50d81f-9274-4e49-b27c-0452022c096a\") " Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.584701 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1b50d81f-9274-4e49-b27c-0452022c096a-httpd-run\") pod \"1b50d81f-9274-4e49-b27c-0452022c096a\" (UID: \"1b50d81f-9274-4e49-b27c-0452022c096a\") " Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.585053 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fw4t2\" (UniqueName: \"kubernetes.io/projected/4ebac6d9-df0f-41fe-bc73-8236847ff237-kube-api-access-fw4t2\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.585069 4743 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cbe93f39-887c-4949-9e78-1047998f8aff-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.585079 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xsll6\" (UniqueName: \"kubernetes.io/projected/cbe93f39-887c-4949-9e78-1047998f8aff-kube-api-access-xsll6\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.585087 4743 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cbe93f39-887c-4949-9e78-1047998f8aff-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.585095 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ebac6d9-df0f-41fe-bc73-8236847ff237-config-data\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.585104 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ssfqb\" (UniqueName: \"kubernetes.io/projected/846c118f-23c1-402f-8747-633485e743c9-kube-api-access-ssfqb\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.585112 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ebac6d9-df0f-41fe-bc73-8236847ff237-logs\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.585121 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbe93f39-887c-4949-9e78-1047998f8aff-scripts\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.585129 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbe93f39-887c-4949-9e78-1047998f8aff-config-data\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.585136 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ebac6d9-df0f-41fe-bc73-8236847ff237-scripts\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.585144 4743 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/846c118f-23c1-402f-8747-633485e743c9-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.587328 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/1b50d81f-9274-4e49-b27c-0452022c096a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "1b50d81f-9274-4e49-b27c-0452022c096a" (UID: "1b50d81f-9274-4e49-b27c-0452022c096a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.588294 4743 scope.go:117] "RemoveContainer" containerID="3ba0cf14795531f6bed1d070f6f5c9124107ccd684d3b7f8219ba23779fff312" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.590896 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b50d81f-9274-4e49-b27c-0452022c096a-logs" (OuterVolumeSpecName: "logs") pod "1b50d81f-9274-4e49-b27c-0452022c096a" (UID: "1b50d81f-9274-4e49-b27c-0452022c096a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.601431 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/846c118f-23c1-402f-8747-633485e743c9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "846c118f-23c1-402f-8747-633485e743c9" (UID: "846c118f-23c1-402f-8747-633485e743c9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.602966 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b50d81f-9274-4e49-b27c-0452022c096a-kube-api-access-rfcc5" (OuterVolumeSpecName: "kube-api-access-rfcc5") pod "1b50d81f-9274-4e49-b27c-0452022c096a" (UID: "1b50d81f-9274-4e49-b27c-0452022c096a"). InnerVolumeSpecName "kube-api-access-rfcc5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.605001 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ebac6d9-df0f-41fe-bc73-8236847ff237-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4ebac6d9-df0f-41fe-bc73-8236847ff237" (UID: "4ebac6d9-df0f-41fe-bc73-8236847ff237"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.605188 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b50d81f-9274-4e49-b27c-0452022c096a-scripts" (OuterVolumeSpecName: "scripts") pod "1b50d81f-9274-4e49-b27c-0452022c096a" (UID: "1b50d81f-9274-4e49-b27c-0452022c096a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.606828 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "1b50d81f-9274-4e49-b27c-0452022c096a" (UID: "1b50d81f-9274-4e49-b27c-0452022c096a"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.632927 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b50d81f-9274-4e49-b27c-0452022c096a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1b50d81f-9274-4e49-b27c-0452022c096a" (UID: "1b50d81f-9274-4e49-b27c-0452022c096a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.662989 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbe93f39-887c-4949-9e78-1047998f8aff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cbe93f39-887c-4949-9e78-1047998f8aff" (UID: "cbe93f39-887c-4949-9e78-1047998f8aff"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.673939 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b50d81f-9274-4e49-b27c-0452022c096a-config-data" (OuterVolumeSpecName: "config-data") pod "1b50d81f-9274-4e49-b27c-0452022c096a" (UID: "1b50d81f-9274-4e49-b27c-0452022c096a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.686526 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f840eb1a-7e3e-4aa4-acea-96cefb593807-config\") pod \"f840eb1a-7e3e-4aa4-acea-96cefb593807\" (UID: \"f840eb1a-7e3e-4aa4-acea-96cefb593807\") " Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.686647 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f840eb1a-7e3e-4aa4-acea-96cefb593807-dns-swift-storage-0\") pod \"f840eb1a-7e3e-4aa4-acea-96cefb593807\" (UID: \"f840eb1a-7e3e-4aa4-acea-96cefb593807\") " Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.686689 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f840eb1a-7e3e-4aa4-acea-96cefb593807-ovsdbserver-sb\") pod \"f840eb1a-7e3e-4aa4-acea-96cefb593807\" (UID: \"f840eb1a-7e3e-4aa4-acea-96cefb593807\") " Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.686740 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f840eb1a-7e3e-4aa4-acea-96cefb593807-dns-svc\") pod \"f840eb1a-7e3e-4aa4-acea-96cefb593807\" (UID: \"f840eb1a-7e3e-4aa4-acea-96cefb593807\") " Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.686757 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8k4rf\" (UniqueName: \"kubernetes.io/projected/f840eb1a-7e3e-4aa4-acea-96cefb593807-kube-api-access-8k4rf\") pod \"f840eb1a-7e3e-4aa4-acea-96cefb593807\" (UID: \"f840eb1a-7e3e-4aa4-acea-96cefb593807\") " Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.686802 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f840eb1a-7e3e-4aa4-acea-96cefb593807-ovsdbserver-nb\") pod \"f840eb1a-7e3e-4aa4-acea-96cefb593807\" (UID: \"f840eb1a-7e3e-4aa4-acea-96cefb593807\") " Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.687151 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b50d81f-9274-4e49-b27c-0452022c096a-scripts\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.687178 4743 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Jan 
22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.687189 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b50d81f-9274-4e49-b27c-0452022c096a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.687199 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ebac6d9-df0f-41fe-bc73-8236847ff237-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.687208 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b50d81f-9274-4e49-b27c-0452022c096a-logs\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.687218 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfcc5\" (UniqueName: \"kubernetes.io/projected/1b50d81f-9274-4e49-b27c-0452022c096a-kube-api-access-rfcc5\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.687229 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b50d81f-9274-4e49-b27c-0452022c096a-config-data\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.687238 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/846c118f-23c1-402f-8747-633485e743c9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.687247 4743 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1b50d81f-9274-4e49-b27c-0452022c096a-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.687255 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbe93f39-887c-4949-9e78-1047998f8aff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.697519 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f840eb1a-7e3e-4aa4-acea-96cefb593807-kube-api-access-8k4rf" (OuterVolumeSpecName: "kube-api-access-8k4rf") pod "f840eb1a-7e3e-4aa4-acea-96cefb593807" (UID: "f840eb1a-7e3e-4aa4-acea-96cefb593807"). InnerVolumeSpecName "kube-api-access-8k4rf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.707569 4743 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.730580 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f840eb1a-7e3e-4aa4-acea-96cefb593807-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f840eb1a-7e3e-4aa4-acea-96cefb593807" (UID: "f840eb1a-7e3e-4aa4-acea-96cefb593807"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.748649 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f840eb1a-7e3e-4aa4-acea-96cefb593807-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f840eb1a-7e3e-4aa4-acea-96cefb593807" (UID: "f840eb1a-7e3e-4aa4-acea-96cefb593807"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.749194 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f840eb1a-7e3e-4aa4-acea-96cefb593807-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f840eb1a-7e3e-4aa4-acea-96cefb593807" (UID: "f840eb1a-7e3e-4aa4-acea-96cefb593807"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.749250 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f840eb1a-7e3e-4aa4-acea-96cefb593807-config" (OuterVolumeSpecName: "config") pod "f840eb1a-7e3e-4aa4-acea-96cefb593807" (UID: "f840eb1a-7e3e-4aa4-acea-96cefb593807"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.788899 4743 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.788939 4743 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f840eb1a-7e3e-4aa4-acea-96cefb593807-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.788955 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f840eb1a-7e3e-4aa4-acea-96cefb593807-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.788966 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f840eb1a-7e3e-4aa4-acea-96cefb593807-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.789003 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8k4rf\" (UniqueName: \"kubernetes.io/projected/f840eb1a-7e3e-4aa4-acea-96cefb593807-kube-api-access-8k4rf\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.789016 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f840eb1a-7e3e-4aa4-acea-96cefb593807-config\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.801171 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f840eb1a-7e3e-4aa4-acea-96cefb593807-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f840eb1a-7e3e-4aa4-acea-96cefb593807" (UID: "f840eb1a-7e3e-4aa4-acea-96cefb593807"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.806836 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.870427 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.883894 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.890396 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f840eb1a-7e3e-4aa4-acea-96cefb593807-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.893612 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Jan 22 14:04:36 crc kubenswrapper[4743]: E0122 14:04:36.894113 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbe93f39-887c-4949-9e78-1047998f8aff" containerName="keystone-bootstrap" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.894140 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbe93f39-887c-4949-9e78-1047998f8aff" containerName="keystone-bootstrap" Jan 22 14:04:36 crc kubenswrapper[4743]: E0122 14:04:36.894166 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ebac6d9-df0f-41fe-bc73-8236847ff237" containerName="placement-db-sync" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.894175 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ebac6d9-df0f-41fe-bc73-8236847ff237" containerName="placement-db-sync" Jan 22 14:04:36 crc kubenswrapper[4743]: E0122 14:04:36.894188 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="846c118f-23c1-402f-8747-633485e743c9" containerName="barbican-db-sync" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.894196 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="846c118f-23c1-402f-8747-633485e743c9" containerName="barbican-db-sync" Jan 22 14:04:36 crc kubenswrapper[4743]: E0122 14:04:36.894214 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f840eb1a-7e3e-4aa4-acea-96cefb593807" containerName="dnsmasq-dns" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.894222 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f840eb1a-7e3e-4aa4-acea-96cefb593807" containerName="dnsmasq-dns" Jan 22 14:04:36 crc kubenswrapper[4743]: E0122 14:04:36.894251 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f840eb1a-7e3e-4aa4-acea-96cefb593807" containerName="init" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.894259 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f840eb1a-7e3e-4aa4-acea-96cefb593807" containerName="init" Jan 22 14:04:36 crc kubenswrapper[4743]: E0122 14:04:36.894277 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b50d81f-9274-4e49-b27c-0452022c096a" containerName="glance-log" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.894284 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b50d81f-9274-4e49-b27c-0452022c096a" containerName="glance-log" Jan 22 14:04:36 crc kubenswrapper[4743]: E0122 14:04:36.894297 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b50d81f-9274-4e49-b27c-0452022c096a" containerName="glance-httpd" Jan 22 14:04:36 
crc kubenswrapper[4743]: I0122 14:04:36.894305 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b50d81f-9274-4e49-b27c-0452022c096a" containerName="glance-httpd" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.894522 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="846c118f-23c1-402f-8747-633485e743c9" containerName="barbican-db-sync" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.894555 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbe93f39-887c-4949-9e78-1047998f8aff" containerName="keystone-bootstrap" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.894568 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b50d81f-9274-4e49-b27c-0452022c096a" containerName="glance-log" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.894582 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f840eb1a-7e3e-4aa4-acea-96cefb593807" containerName="dnsmasq-dns" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.894603 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b50d81f-9274-4e49-b27c-0452022c096a" containerName="glance-httpd" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.894622 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ebac6d9-df0f-41fe-bc73-8236847ff237" containerName="placement-db-sync" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.897086 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.899138 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.902105 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.929177 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.952321 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-rsqh8"] Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.996444 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/126e7829-d6f7-4443-b4f6-02669ff5fbc7-config-data\") pod \"glance-default-external-api-0\" (UID: \"126e7829-d6f7-4443-b4f6-02669ff5fbc7\") " pod="openstack/glance-default-external-api-0" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.996493 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/126e7829-d6f7-4443-b4f6-02669ff5fbc7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"126e7829-d6f7-4443-b4f6-02669ff5fbc7\") " pod="openstack/glance-default-external-api-0" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.996522 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"126e7829-d6f7-4443-b4f6-02669ff5fbc7\") " pod="openstack/glance-default-external-api-0" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.996571 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9vt6\" (UniqueName: \"kubernetes.io/projected/126e7829-d6f7-4443-b4f6-02669ff5fbc7-kube-api-access-s9vt6\") pod \"glance-default-external-api-0\" (UID: \"126e7829-d6f7-4443-b4f6-02669ff5fbc7\") " pod="openstack/glance-default-external-api-0" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.996600 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/126e7829-d6f7-4443-b4f6-02669ff5fbc7-logs\") pod \"glance-default-external-api-0\" (UID: \"126e7829-d6f7-4443-b4f6-02669ff5fbc7\") " pod="openstack/glance-default-external-api-0" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.997380 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/126e7829-d6f7-4443-b4f6-02669ff5fbc7-scripts\") pod \"glance-default-external-api-0\" (UID: \"126e7829-d6f7-4443-b4f6-02669ff5fbc7\") " pod="openstack/glance-default-external-api-0" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.997410 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/126e7829-d6f7-4443-b4f6-02669ff5fbc7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"126e7829-d6f7-4443-b4f6-02669ff5fbc7\") " pod="openstack/glance-default-external-api-0" Jan 22 14:04:36 crc kubenswrapper[4743]: I0122 14:04:36.997436 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/126e7829-d6f7-4443-b4f6-02669ff5fbc7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"126e7829-d6f7-4443-b4f6-02669ff5fbc7\") " pod="openstack/glance-default-external-api-0" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.002316 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-56bdc765bf-xnb2x"] Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.081451 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7bd7ccdcfb-sx7wp"] Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.101847 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/126e7829-d6f7-4443-b4f6-02669ff5fbc7-config-data\") pod \"glance-default-external-api-0\" (UID: \"126e7829-d6f7-4443-b4f6-02669ff5fbc7\") " pod="openstack/glance-default-external-api-0" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.101924 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/126e7829-d6f7-4443-b4f6-02669ff5fbc7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"126e7829-d6f7-4443-b4f6-02669ff5fbc7\") " pod="openstack/glance-default-external-api-0" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.101954 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"126e7829-d6f7-4443-b4f6-02669ff5fbc7\") " pod="openstack/glance-default-external-api-0" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.102006 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-s9vt6\" (UniqueName: \"kubernetes.io/projected/126e7829-d6f7-4443-b4f6-02669ff5fbc7-kube-api-access-s9vt6\") pod \"glance-default-external-api-0\" (UID: \"126e7829-d6f7-4443-b4f6-02669ff5fbc7\") " pod="openstack/glance-default-external-api-0" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.102038 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/126e7829-d6f7-4443-b4f6-02669ff5fbc7-logs\") pod \"glance-default-external-api-0\" (UID: \"126e7829-d6f7-4443-b4f6-02669ff5fbc7\") " pod="openstack/glance-default-external-api-0" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.102100 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/126e7829-d6f7-4443-b4f6-02669ff5fbc7-scripts\") pod \"glance-default-external-api-0\" (UID: \"126e7829-d6f7-4443-b4f6-02669ff5fbc7\") " pod="openstack/glance-default-external-api-0" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.102283 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"126e7829-d6f7-4443-b4f6-02669ff5fbc7\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-external-api-0" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.102684 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/126e7829-d6f7-4443-b4f6-02669ff5fbc7-logs\") pod \"glance-default-external-api-0\" (UID: \"126e7829-d6f7-4443-b4f6-02669ff5fbc7\") " pod="openstack/glance-default-external-api-0" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.102751 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/126e7829-d6f7-4443-b4f6-02669ff5fbc7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"126e7829-d6f7-4443-b4f6-02669ff5fbc7\") " pod="openstack/glance-default-external-api-0" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.102817 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/126e7829-d6f7-4443-b4f6-02669ff5fbc7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"126e7829-d6f7-4443-b4f6-02669ff5fbc7\") " pod="openstack/glance-default-external-api-0" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.104061 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/126e7829-d6f7-4443-b4f6-02669ff5fbc7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"126e7829-d6f7-4443-b4f6-02669ff5fbc7\") " pod="openstack/glance-default-external-api-0" Jan 22 14:04:37 crc kubenswrapper[4743]: W0122 14:04:37.105403 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde2c5f93_9ee3_4723_8123_bd48d5385423.slice/crio-06ab1b8cdf15dd2c1398403a1308fc09babef387e4655f4d02634938d466194a WatchSource:0}: Error finding container 06ab1b8cdf15dd2c1398403a1308fc09babef387e4655f4d02634938d466194a: Status 404 returned error can't find the container with id 06ab1b8cdf15dd2c1398403a1308fc09babef387e4655f4d02634938d466194a Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.106969 4743 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/126e7829-d6f7-4443-b4f6-02669ff5fbc7-scripts\") pod \"glance-default-external-api-0\" (UID: \"126e7829-d6f7-4443-b4f6-02669ff5fbc7\") " pod="openstack/glance-default-external-api-0" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.107125 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/126e7829-d6f7-4443-b4f6-02669ff5fbc7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"126e7829-d6f7-4443-b4f6-02669ff5fbc7\") " pod="openstack/glance-default-external-api-0" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.120627 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9vt6\" (UniqueName: \"kubernetes.io/projected/126e7829-d6f7-4443-b4f6-02669ff5fbc7-kube-api-access-s9vt6\") pod \"glance-default-external-api-0\" (UID: \"126e7829-d6f7-4443-b4f6-02669ff5fbc7\") " pod="openstack/glance-default-external-api-0" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.127139 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/126e7829-d6f7-4443-b4f6-02669ff5fbc7-config-data\") pod \"glance-default-external-api-0\" (UID: \"126e7829-d6f7-4443-b4f6-02669ff5fbc7\") " pod="openstack/glance-default-external-api-0" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.128313 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/126e7829-d6f7-4443-b4f6-02669ff5fbc7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"126e7829-d6f7-4443-b4f6-02669ff5fbc7\") " pod="openstack/glance-default-external-api-0" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.161412 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"126e7829-d6f7-4443-b4f6-02669ff5fbc7\") " pod="openstack/glance-default-external-api-0" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.227066 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.595499 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56bdc765bf-xnb2x" event={"ID":"b8ee30f9-a6ed-4aa2-b834-facef3c284fe","Type":"ContainerStarted","Data":"afbba5dcab408eeada4700d4829489737116d02c99d0246230ad10478ef11325"} Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.603983 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-665544959-z46xh"] Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.605503 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-665544959-z46xh" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.611822 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-vkgkb" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.612034 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.617286 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.650851 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5f47b7b66b-mfhcg"] Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.652066 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5f47b7b66b-mfhcg" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.660210 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.660497 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.660605 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.660701 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.661706 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-48lf7" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.662022 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.670025 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-rsqh8" event={"ID":"457d7aae-7205-4d41-bf26-e0defaa8d96c","Type":"ContainerStarted","Data":"50eb96cd11787c025e902132666b14a37541199f47655a348bbdfafda4338337"} Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.691371 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-665544959-z46xh"] Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.691879 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7bd7ccdcfb-sx7wp" event={"ID":"de2c5f93-9ee3-4723-8123-bd48d5385423","Type":"ContainerStarted","Data":"06ab1b8cdf15dd2c1398403a1308fc09babef387e4655f4d02634938d466194a"} Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.713644 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"316dc631-a7ed-49db-9dad-305d246bf91a","Type":"ContainerStarted","Data":"47854176fb6a300bbd538181f0a9c90e70c288f01bad265b81c78fa697ed1dfe"} Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.719688 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"eefba4cb-766b-45a4-b832-83c9ef83a30b","Type":"ContainerStarted","Data":"9fe88db94dc3b2ab9c18169846d9b1cbf71d9b4f9bf6fd1cf8ec87a1afbf44c0"} Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.720643 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/d95db954-8e59-44ac-ae17-788b0fbcb177-config-data-custom\") pod \"barbican-worker-665544959-z46xh\" (UID: \"d95db954-8e59-44ac-ae17-788b0fbcb177\") " pod="openstack/barbican-worker-665544959-z46xh" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.720702 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d95db954-8e59-44ac-ae17-788b0fbcb177-combined-ca-bundle\") pod \"barbican-worker-665544959-z46xh\" (UID: \"d95db954-8e59-44ac-ae17-788b0fbcb177\") " pod="openstack/barbican-worker-665544959-z46xh" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.720898 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d95db954-8e59-44ac-ae17-788b0fbcb177-config-data\") pod \"barbican-worker-665544959-z46xh\" (UID: \"d95db954-8e59-44ac-ae17-788b0fbcb177\") " pod="openstack/barbican-worker-665544959-z46xh" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.721056 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lg5lf\" (UniqueName: \"kubernetes.io/projected/d95db954-8e59-44ac-ae17-788b0fbcb177-kube-api-access-lg5lf\") pod \"barbican-worker-665544959-z46xh\" (UID: \"d95db954-8e59-44ac-ae17-788b0fbcb177\") " pod="openstack/barbican-worker-665544959-z46xh" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.721075 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d95db954-8e59-44ac-ae17-788b0fbcb177-logs\") pod \"barbican-worker-665544959-z46xh\" (UID: \"d95db954-8e59-44ac-ae17-788b0fbcb177\") " pod="openstack/barbican-worker-665544959-z46xh" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.728235 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5f47b7b66b-mfhcg"] Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.753406 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-mdc9s" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.787479 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b50d81f-9274-4e49-b27c-0452022c096a" path="/var/lib/kubelet/pods/1b50d81f-9274-4e49-b27c-0452022c096a/volumes" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.788540 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-mdc9s" event={"ID":"f840eb1a-7e3e-4aa4-acea-96cefb593807","Type":"ContainerDied","Data":"93f9542514b86f7e1bd7e9465e864555f60b686cbfa10de04c0a47377b055904"} Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.788576 4743 scope.go:117] "RemoveContainer" containerID="139e1574369316387845d4206537c6c69143f809af2db271d28f8be12d0752f4" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.807073 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-5774494bd8-6dt7x"] Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.809310 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-5774494bd8-6dt7x" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.813468 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.823113 4743 scope.go:117] "RemoveContainer" containerID="baa47e0a3805a9c63578e2972a3d09cd307563f0f73e9883289ba252adace558" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.824524 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d95db954-8e59-44ac-ae17-788b0fbcb177-config-data-custom\") pod \"barbican-worker-665544959-z46xh\" (UID: \"d95db954-8e59-44ac-ae17-788b0fbcb177\") " pod="openstack/barbican-worker-665544959-z46xh" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.824573 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxjhc\" (UniqueName: \"kubernetes.io/projected/cd3df106-ec34-42ad-bf5d-f963b9bb0871-kube-api-access-gxjhc\") pod \"keystone-5f47b7b66b-mfhcg\" (UID: \"cd3df106-ec34-42ad-bf5d-f963b9bb0871\") " pod="openstack/keystone-5f47b7b66b-mfhcg" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.824620 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd3df106-ec34-42ad-bf5d-f963b9bb0871-combined-ca-bundle\") pod \"keystone-5f47b7b66b-mfhcg\" (UID: \"cd3df106-ec34-42ad-bf5d-f963b9bb0871\") " pod="openstack/keystone-5f47b7b66b-mfhcg" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.824651 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d95db954-8e59-44ac-ae17-788b0fbcb177-combined-ca-bundle\") pod \"barbican-worker-665544959-z46xh\" (UID: \"d95db954-8e59-44ac-ae17-788b0fbcb177\") " pod="openstack/barbican-worker-665544959-z46xh" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.824680 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd3df106-ec34-42ad-bf5d-f963b9bb0871-scripts\") pod \"keystone-5f47b7b66b-mfhcg\" (UID: \"cd3df106-ec34-42ad-bf5d-f963b9bb0871\") " pod="openstack/keystone-5f47b7b66b-mfhcg" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.824715 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d95db954-8e59-44ac-ae17-788b0fbcb177-config-data\") pod \"barbican-worker-665544959-z46xh\" (UID: \"d95db954-8e59-44ac-ae17-788b0fbcb177\") " pod="openstack/barbican-worker-665544959-z46xh" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.824749 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd3df106-ec34-42ad-bf5d-f963b9bb0871-config-data\") pod \"keystone-5f47b7b66b-mfhcg\" (UID: \"cd3df106-ec34-42ad-bf5d-f963b9bb0871\") " pod="openstack/keystone-5f47b7b66b-mfhcg" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.824802 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lg5lf\" (UniqueName: \"kubernetes.io/projected/d95db954-8e59-44ac-ae17-788b0fbcb177-kube-api-access-lg5lf\") pod \"barbican-worker-665544959-z46xh\" (UID: 
\"d95db954-8e59-44ac-ae17-788b0fbcb177\") " pod="openstack/barbican-worker-665544959-z46xh" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.824822 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd3df106-ec34-42ad-bf5d-f963b9bb0871-internal-tls-certs\") pod \"keystone-5f47b7b66b-mfhcg\" (UID: \"cd3df106-ec34-42ad-bf5d-f963b9bb0871\") " pod="openstack/keystone-5f47b7b66b-mfhcg" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.824836 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd3df106-ec34-42ad-bf5d-f963b9bb0871-public-tls-certs\") pod \"keystone-5f47b7b66b-mfhcg\" (UID: \"cd3df106-ec34-42ad-bf5d-f963b9bb0871\") " pod="openstack/keystone-5f47b7b66b-mfhcg" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.824853 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d95db954-8e59-44ac-ae17-788b0fbcb177-logs\") pod \"barbican-worker-665544959-z46xh\" (UID: \"d95db954-8e59-44ac-ae17-788b0fbcb177\") " pod="openstack/barbican-worker-665544959-z46xh" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.824889 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cd3df106-ec34-42ad-bf5d-f963b9bb0871-fernet-keys\") pod \"keystone-5f47b7b66b-mfhcg\" (UID: \"cd3df106-ec34-42ad-bf5d-f963b9bb0871\") " pod="openstack/keystone-5f47b7b66b-mfhcg" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.824903 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cd3df106-ec34-42ad-bf5d-f963b9bb0871-credential-keys\") pod \"keystone-5f47b7b66b-mfhcg\" (UID: \"cd3df106-ec34-42ad-bf5d-f963b9bb0871\") " pod="openstack/keystone-5f47b7b66b-mfhcg" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.827301 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d95db954-8e59-44ac-ae17-788b0fbcb177-logs\") pod \"barbican-worker-665544959-z46xh\" (UID: \"d95db954-8e59-44ac-ae17-788b0fbcb177\") " pod="openstack/barbican-worker-665544959-z46xh" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.835640 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d95db954-8e59-44ac-ae17-788b0fbcb177-combined-ca-bundle\") pod \"barbican-worker-665544959-z46xh\" (UID: \"d95db954-8e59-44ac-ae17-788b0fbcb177\") " pod="openstack/barbican-worker-665544959-z46xh" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.841914 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d95db954-8e59-44ac-ae17-788b0fbcb177-config-data\") pod \"barbican-worker-665544959-z46xh\" (UID: \"d95db954-8e59-44ac-ae17-788b0fbcb177\") " pod="openstack/barbican-worker-665544959-z46xh" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.846277 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5774494bd8-6dt7x"] Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.847124 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lg5lf\" 
(UniqueName: \"kubernetes.io/projected/d95db954-8e59-44ac-ae17-788b0fbcb177-kube-api-access-lg5lf\") pod \"barbican-worker-665544959-z46xh\" (UID: \"d95db954-8e59-44ac-ae17-788b0fbcb177\") " pod="openstack/barbican-worker-665544959-z46xh" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.847284 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d95db954-8e59-44ac-ae17-788b0fbcb177-config-data-custom\") pod \"barbican-worker-665544959-z46xh\" (UID: \"d95db954-8e59-44ac-ae17-788b0fbcb177\") " pod="openstack/barbican-worker-665544959-z46xh" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.855873 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-77bd86cd86-kqp9m"] Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.857617 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-77bd86cd86-kqp9m" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.864363 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.864376 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.864449 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-pm9nn" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.864491 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.867905 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-77bd86cd86-kqp9m"] Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.868351 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.918145 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-rsqh8"] Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.930941 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9315e9cf-2a73-482e-810e-8fd19202915f-logs\") pod \"placement-77bd86cd86-kqp9m\" (UID: \"9315e9cf-2a73-482e-810e-8fd19202915f\") " pod="openstack/placement-77bd86cd86-kqp9m" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.930985 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec33de7c-5eab-46d0-a702-af5fbd2ebe50-config-data\") pod \"barbican-keystone-listener-5774494bd8-6dt7x\" (UID: \"ec33de7c-5eab-46d0-a702-af5fbd2ebe50\") " pod="openstack/barbican-keystone-listener-5774494bd8-6dt7x" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.931023 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd3df106-ec34-42ad-bf5d-f963b9bb0871-scripts\") pod \"keystone-5f47b7b66b-mfhcg\" (UID: \"cd3df106-ec34-42ad-bf5d-f963b9bb0871\") " pod="openstack/keystone-5f47b7b66b-mfhcg" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.931045 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7j78z\" (UniqueName: 
\"kubernetes.io/projected/9315e9cf-2a73-482e-810e-8fd19202915f-kube-api-access-7j78z\") pod \"placement-77bd86cd86-kqp9m\" (UID: \"9315e9cf-2a73-482e-810e-8fd19202915f\") " pod="openstack/placement-77bd86cd86-kqp9m" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.931095 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd3df106-ec34-42ad-bf5d-f963b9bb0871-config-data\") pod \"keystone-5f47b7b66b-mfhcg\" (UID: \"cd3df106-ec34-42ad-bf5d-f963b9bb0871\") " pod="openstack/keystone-5f47b7b66b-mfhcg" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.931125 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec33de7c-5eab-46d0-a702-af5fbd2ebe50-logs\") pod \"barbican-keystone-listener-5774494bd8-6dt7x\" (UID: \"ec33de7c-5eab-46d0-a702-af5fbd2ebe50\") " pod="openstack/barbican-keystone-listener-5774494bd8-6dt7x" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.931149 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd3df106-ec34-42ad-bf5d-f963b9bb0871-internal-tls-certs\") pod \"keystone-5f47b7b66b-mfhcg\" (UID: \"cd3df106-ec34-42ad-bf5d-f963b9bb0871\") " pod="openstack/keystone-5f47b7b66b-mfhcg" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.931163 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd3df106-ec34-42ad-bf5d-f963b9bb0871-public-tls-certs\") pod \"keystone-5f47b7b66b-mfhcg\" (UID: \"cd3df106-ec34-42ad-bf5d-f963b9bb0871\") " pod="openstack/keystone-5f47b7b66b-mfhcg" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.931182 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ec33de7c-5eab-46d0-a702-af5fbd2ebe50-config-data-custom\") pod \"barbican-keystone-listener-5774494bd8-6dt7x\" (UID: \"ec33de7c-5eab-46d0-a702-af5fbd2ebe50\") " pod="openstack/barbican-keystone-listener-5774494bd8-6dt7x" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.931197 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9315e9cf-2a73-482e-810e-8fd19202915f-internal-tls-certs\") pod \"placement-77bd86cd86-kqp9m\" (UID: \"9315e9cf-2a73-482e-810e-8fd19202915f\") " pod="openstack/placement-77bd86cd86-kqp9m" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.931216 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9315e9cf-2a73-482e-810e-8fd19202915f-scripts\") pod \"placement-77bd86cd86-kqp9m\" (UID: \"9315e9cf-2a73-482e-810e-8fd19202915f\") " pod="openstack/placement-77bd86cd86-kqp9m" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.931245 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9315e9cf-2a73-482e-810e-8fd19202915f-public-tls-certs\") pod \"placement-77bd86cd86-kqp9m\" (UID: \"9315e9cf-2a73-482e-810e-8fd19202915f\") " pod="openstack/placement-77bd86cd86-kqp9m" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.931264 4743 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cd3df106-ec34-42ad-bf5d-f963b9bb0871-fernet-keys\") pod \"keystone-5f47b7b66b-mfhcg\" (UID: \"cd3df106-ec34-42ad-bf5d-f963b9bb0871\") " pod="openstack/keystone-5f47b7b66b-mfhcg" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.931278 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cd3df106-ec34-42ad-bf5d-f963b9bb0871-credential-keys\") pod \"keystone-5f47b7b66b-mfhcg\" (UID: \"cd3df106-ec34-42ad-bf5d-f963b9bb0871\") " pod="openstack/keystone-5f47b7b66b-mfhcg" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.931303 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9315e9cf-2a73-482e-810e-8fd19202915f-combined-ca-bundle\") pod \"placement-77bd86cd86-kqp9m\" (UID: \"9315e9cf-2a73-482e-810e-8fd19202915f\") " pod="openstack/placement-77bd86cd86-kqp9m" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.931320 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxjhc\" (UniqueName: \"kubernetes.io/projected/cd3df106-ec34-42ad-bf5d-f963b9bb0871-kube-api-access-gxjhc\") pod \"keystone-5f47b7b66b-mfhcg\" (UID: \"cd3df106-ec34-42ad-bf5d-f963b9bb0871\") " pod="openstack/keystone-5f47b7b66b-mfhcg" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.931335 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec33de7c-5eab-46d0-a702-af5fbd2ebe50-combined-ca-bundle\") pod \"barbican-keystone-listener-5774494bd8-6dt7x\" (UID: \"ec33de7c-5eab-46d0-a702-af5fbd2ebe50\") " pod="openstack/barbican-keystone-listener-5774494bd8-6dt7x" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.931357 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9315e9cf-2a73-482e-810e-8fd19202915f-config-data\") pod \"placement-77bd86cd86-kqp9m\" (UID: \"9315e9cf-2a73-482e-810e-8fd19202915f\") " pod="openstack/placement-77bd86cd86-kqp9m" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.931393 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wgrm\" (UniqueName: \"kubernetes.io/projected/ec33de7c-5eab-46d0-a702-af5fbd2ebe50-kube-api-access-9wgrm\") pod \"barbican-keystone-listener-5774494bd8-6dt7x\" (UID: \"ec33de7c-5eab-46d0-a702-af5fbd2ebe50\") " pod="openstack/barbican-keystone-listener-5774494bd8-6dt7x" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.931412 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd3df106-ec34-42ad-bf5d-f963b9bb0871-combined-ca-bundle\") pod \"keystone-5f47b7b66b-mfhcg\" (UID: \"cd3df106-ec34-42ad-bf5d-f963b9bb0871\") " pod="openstack/keystone-5f47b7b66b-mfhcg" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.941202 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cd3df106-ec34-42ad-bf5d-f963b9bb0871-fernet-keys\") pod \"keystone-5f47b7b66b-mfhcg\" (UID: \"cd3df106-ec34-42ad-bf5d-f963b9bb0871\") " pod="openstack/keystone-5f47b7b66b-mfhcg" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.945408 4743 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd3df106-ec34-42ad-bf5d-f963b9bb0871-public-tls-certs\") pod \"keystone-5f47b7b66b-mfhcg\" (UID: \"cd3df106-ec34-42ad-bf5d-f963b9bb0871\") " pod="openstack/keystone-5f47b7b66b-mfhcg" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.951179 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd3df106-ec34-42ad-bf5d-f963b9bb0871-scripts\") pod \"keystone-5f47b7b66b-mfhcg\" (UID: \"cd3df106-ec34-42ad-bf5d-f963b9bb0871\") " pod="openstack/keystone-5f47b7b66b-mfhcg" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.951503 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd3df106-ec34-42ad-bf5d-f963b9bb0871-internal-tls-certs\") pod \"keystone-5f47b7b66b-mfhcg\" (UID: \"cd3df106-ec34-42ad-bf5d-f963b9bb0871\") " pod="openstack/keystone-5f47b7b66b-mfhcg" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.953455 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cd3df106-ec34-42ad-bf5d-f963b9bb0871-credential-keys\") pod \"keystone-5f47b7b66b-mfhcg\" (UID: \"cd3df106-ec34-42ad-bf5d-f963b9bb0871\") " pod="openstack/keystone-5f47b7b66b-mfhcg" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.953536 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd3df106-ec34-42ad-bf5d-f963b9bb0871-config-data\") pod \"keystone-5f47b7b66b-mfhcg\" (UID: \"cd3df106-ec34-42ad-bf5d-f963b9bb0871\") " pod="openstack/keystone-5f47b7b66b-mfhcg" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.953916 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd3df106-ec34-42ad-bf5d-f963b9bb0871-combined-ca-bundle\") pod \"keystone-5f47b7b66b-mfhcg\" (UID: \"cd3df106-ec34-42ad-bf5d-f963b9bb0871\") " pod="openstack/keystone-5f47b7b66b-mfhcg" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.967634 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-8kx6w"] Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.968320 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxjhc\" (UniqueName: \"kubernetes.io/projected/cd3df106-ec34-42ad-bf5d-f963b9bb0871-kube-api-access-gxjhc\") pod \"keystone-5f47b7b66b-mfhcg\" (UID: \"cd3df106-ec34-42ad-bf5d-f963b9bb0871\") " pod="openstack/keystone-5f47b7b66b-mfhcg" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.969189 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-8kx6w" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.974954 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-665544959-z46xh" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.978042 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-d867cf67d-79hj9"] Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.979853 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-d867cf67d-79hj9" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.981433 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Jan 22 14:04:37 crc kubenswrapper[4743]: I0122 14:04:37.986801 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-8kx6w"] Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.019976 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5f47b7b66b-mfhcg" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.022876 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-d867cf67d-79hj9"] Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.033318 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec33de7c-5eab-46d0-a702-af5fbd2ebe50-logs\") pod \"barbican-keystone-listener-5774494bd8-6dt7x\" (UID: \"ec33de7c-5eab-46d0-a702-af5fbd2ebe50\") " pod="openstack/barbican-keystone-listener-5774494bd8-6dt7x" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.033372 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ec33de7c-5eab-46d0-a702-af5fbd2ebe50-config-data-custom\") pod \"barbican-keystone-listener-5774494bd8-6dt7x\" (UID: \"ec33de7c-5eab-46d0-a702-af5fbd2ebe50\") " pod="openstack/barbican-keystone-listener-5774494bd8-6dt7x" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.033392 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9315e9cf-2a73-482e-810e-8fd19202915f-internal-tls-certs\") pod \"placement-77bd86cd86-kqp9m\" (UID: \"9315e9cf-2a73-482e-810e-8fd19202915f\") " pod="openstack/placement-77bd86cd86-kqp9m" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.033412 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9315e9cf-2a73-482e-810e-8fd19202915f-scripts\") pod \"placement-77bd86cd86-kqp9m\" (UID: \"9315e9cf-2a73-482e-810e-8fd19202915f\") " pod="openstack/placement-77bd86cd86-kqp9m" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.033438 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9315e9cf-2a73-482e-810e-8fd19202915f-public-tls-certs\") pod \"placement-77bd86cd86-kqp9m\" (UID: \"9315e9cf-2a73-482e-810e-8fd19202915f\") " pod="openstack/placement-77bd86cd86-kqp9m" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.033465 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9315e9cf-2a73-482e-810e-8fd19202915f-combined-ca-bundle\") pod \"placement-77bd86cd86-kqp9m\" (UID: \"9315e9cf-2a73-482e-810e-8fd19202915f\") " pod="openstack/placement-77bd86cd86-kqp9m" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.033481 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec33de7c-5eab-46d0-a702-af5fbd2ebe50-combined-ca-bundle\") pod \"barbican-keystone-listener-5774494bd8-6dt7x\" (UID: \"ec33de7c-5eab-46d0-a702-af5fbd2ebe50\") " pod="openstack/barbican-keystone-listener-5774494bd8-6dt7x" Jan 22 14:04:38 crc 
kubenswrapper[4743]: I0122 14:04:38.033500 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9315e9cf-2a73-482e-810e-8fd19202915f-config-data\") pod \"placement-77bd86cd86-kqp9m\" (UID: \"9315e9cf-2a73-482e-810e-8fd19202915f\") " pod="openstack/placement-77bd86cd86-kqp9m" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.033526 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wgrm\" (UniqueName: \"kubernetes.io/projected/ec33de7c-5eab-46d0-a702-af5fbd2ebe50-kube-api-access-9wgrm\") pod \"barbican-keystone-listener-5774494bd8-6dt7x\" (UID: \"ec33de7c-5eab-46d0-a702-af5fbd2ebe50\") " pod="openstack/barbican-keystone-listener-5774494bd8-6dt7x" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.033552 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9315e9cf-2a73-482e-810e-8fd19202915f-logs\") pod \"placement-77bd86cd86-kqp9m\" (UID: \"9315e9cf-2a73-482e-810e-8fd19202915f\") " pod="openstack/placement-77bd86cd86-kqp9m" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.033572 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec33de7c-5eab-46d0-a702-af5fbd2ebe50-config-data\") pod \"barbican-keystone-listener-5774494bd8-6dt7x\" (UID: \"ec33de7c-5eab-46d0-a702-af5fbd2ebe50\") " pod="openstack/barbican-keystone-listener-5774494bd8-6dt7x" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.033596 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7j78z\" (UniqueName: \"kubernetes.io/projected/9315e9cf-2a73-482e-810e-8fd19202915f-kube-api-access-7j78z\") pod \"placement-77bd86cd86-kqp9m\" (UID: \"9315e9cf-2a73-482e-810e-8fd19202915f\") " pod="openstack/placement-77bd86cd86-kqp9m" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.033725 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec33de7c-5eab-46d0-a702-af5fbd2ebe50-logs\") pod \"barbican-keystone-listener-5774494bd8-6dt7x\" (UID: \"ec33de7c-5eab-46d0-a702-af5fbd2ebe50\") " pod="openstack/barbican-keystone-listener-5774494bd8-6dt7x" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.035843 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9315e9cf-2a73-482e-810e-8fd19202915f-logs\") pod \"placement-77bd86cd86-kqp9m\" (UID: \"9315e9cf-2a73-482e-810e-8fd19202915f\") " pod="openstack/placement-77bd86cd86-kqp9m" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.035902 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-mdc9s"] Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.038450 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9315e9cf-2a73-482e-810e-8fd19202915f-combined-ca-bundle\") pod \"placement-77bd86cd86-kqp9m\" (UID: \"9315e9cf-2a73-482e-810e-8fd19202915f\") " pod="openstack/placement-77bd86cd86-kqp9m" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.042185 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9315e9cf-2a73-482e-810e-8fd19202915f-config-data\") pod \"placement-77bd86cd86-kqp9m\" (UID: 
\"9315e9cf-2a73-482e-810e-8fd19202915f\") " pod="openstack/placement-77bd86cd86-kqp9m" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.045897 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9315e9cf-2a73-482e-810e-8fd19202915f-scripts\") pod \"placement-77bd86cd86-kqp9m\" (UID: \"9315e9cf-2a73-482e-810e-8fd19202915f\") " pod="openstack/placement-77bd86cd86-kqp9m" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.050587 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9315e9cf-2a73-482e-810e-8fd19202915f-internal-tls-certs\") pod \"placement-77bd86cd86-kqp9m\" (UID: \"9315e9cf-2a73-482e-810e-8fd19202915f\") " pod="openstack/placement-77bd86cd86-kqp9m" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.051432 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec33de7c-5eab-46d0-a702-af5fbd2ebe50-combined-ca-bundle\") pod \"barbican-keystone-listener-5774494bd8-6dt7x\" (UID: \"ec33de7c-5eab-46d0-a702-af5fbd2ebe50\") " pod="openstack/barbican-keystone-listener-5774494bd8-6dt7x" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.056903 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7j78z\" (UniqueName: \"kubernetes.io/projected/9315e9cf-2a73-482e-810e-8fd19202915f-kube-api-access-7j78z\") pod \"placement-77bd86cd86-kqp9m\" (UID: \"9315e9cf-2a73-482e-810e-8fd19202915f\") " pod="openstack/placement-77bd86cd86-kqp9m" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.059392 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ec33de7c-5eab-46d0-a702-af5fbd2ebe50-config-data-custom\") pod \"barbican-keystone-listener-5774494bd8-6dt7x\" (UID: \"ec33de7c-5eab-46d0-a702-af5fbd2ebe50\") " pod="openstack/barbican-keystone-listener-5774494bd8-6dt7x" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.061171 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wgrm\" (UniqueName: \"kubernetes.io/projected/ec33de7c-5eab-46d0-a702-af5fbd2ebe50-kube-api-access-9wgrm\") pod \"barbican-keystone-listener-5774494bd8-6dt7x\" (UID: \"ec33de7c-5eab-46d0-a702-af5fbd2ebe50\") " pod="openstack/barbican-keystone-listener-5774494bd8-6dt7x" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.074029 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec33de7c-5eab-46d0-a702-af5fbd2ebe50-config-data\") pod \"barbican-keystone-listener-5774494bd8-6dt7x\" (UID: \"ec33de7c-5eab-46d0-a702-af5fbd2ebe50\") " pod="openstack/barbican-keystone-listener-5774494bd8-6dt7x" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.080126 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9315e9cf-2a73-482e-810e-8fd19202915f-public-tls-certs\") pod \"placement-77bd86cd86-kqp9m\" (UID: \"9315e9cf-2a73-482e-810e-8fd19202915f\") " pod="openstack/placement-77bd86cd86-kqp9m" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.082709 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-mdc9s"] Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.113428 4743 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/barbican-keystone-listener-6c88b6769d-nzzc6"] Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.115461 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6c88b6769d-nzzc6" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.123381 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-8448f7b79-pndf8"] Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.126246 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-8448f7b79-pndf8" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.137179 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ddd12850-b0cc-4119-9ba4-bf5a893f41a7-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-8kx6w\" (UID: \"ddd12850-b0cc-4119-9ba4-bf5a893f41a7\") " pod="openstack/dnsmasq-dns-85ff748b95-8kx6w" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.137267 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b-config-data-custom\") pod \"barbican-api-d867cf67d-79hj9\" (UID: \"53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b\") " pod="openstack/barbican-api-d867cf67d-79hj9" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.137303 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b-combined-ca-bundle\") pod \"barbican-api-d867cf67d-79hj9\" (UID: \"53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b\") " pod="openstack/barbican-api-d867cf67d-79hj9" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.137326 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b-logs\") pod \"barbican-api-d867cf67d-79hj9\" (UID: \"53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b\") " pod="openstack/barbican-api-d867cf67d-79hj9" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.137368 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ddd12850-b0cc-4119-9ba4-bf5a893f41a7-dns-svc\") pod \"dnsmasq-dns-85ff748b95-8kx6w\" (UID: \"ddd12850-b0cc-4119-9ba4-bf5a893f41a7\") " pod="openstack/dnsmasq-dns-85ff748b95-8kx6w" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.137385 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ddd12850-b0cc-4119-9ba4-bf5a893f41a7-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-8kx6w\" (UID: \"ddd12850-b0cc-4119-9ba4-bf5a893f41a7\") " pod="openstack/dnsmasq-dns-85ff748b95-8kx6w" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.137399 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b-config-data\") pod \"barbican-api-d867cf67d-79hj9\" (UID: \"53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b\") " pod="openstack/barbican-api-d867cf67d-79hj9" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.137424 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpngr\" (UniqueName: \"kubernetes.io/projected/53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b-kube-api-access-xpngr\") pod \"barbican-api-d867cf67d-79hj9\" (UID: \"53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b\") " pod="openstack/barbican-api-d867cf67d-79hj9" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.137450 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkdnx\" (UniqueName: \"kubernetes.io/projected/ddd12850-b0cc-4119-9ba4-bf5a893f41a7-kube-api-access-dkdnx\") pod \"dnsmasq-dns-85ff748b95-8kx6w\" (UID: \"ddd12850-b0cc-4119-9ba4-bf5a893f41a7\") " pod="openstack/dnsmasq-dns-85ff748b95-8kx6w" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.137472 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ddd12850-b0cc-4119-9ba4-bf5a893f41a7-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-8kx6w\" (UID: \"ddd12850-b0cc-4119-9ba4-bf5a893f41a7\") " pod="openstack/dnsmasq-dns-85ff748b95-8kx6w" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.137500 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddd12850-b0cc-4119-9ba4-bf5a893f41a7-config\") pod \"dnsmasq-dns-85ff748b95-8kx6w\" (UID: \"ddd12850-b0cc-4119-9ba4-bf5a893f41a7\") " pod="openstack/dnsmasq-dns-85ff748b95-8kx6w" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.144036 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6c88b6769d-nzzc6"] Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.145149 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5774494bd8-6dt7x" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.156422 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-8448f7b79-pndf8"] Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.190626 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-59d596989d-xmgwp"] Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.192105 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-59d596989d-xmgwp" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.199239 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-59d596989d-xmgwp"] Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.203510 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-77bd86cd86-kqp9m" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.218666 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.243631 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f254cb75-db18-488e-886f-544f0b8a8516-config-data\") pod \"barbican-worker-8448f7b79-pndf8\" (UID: \"f254cb75-db18-488e-886f-544f0b8a8516\") " pod="openstack/barbican-worker-8448f7b79-pndf8" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.243702 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsp5q\" (UniqueName: \"kubernetes.io/projected/f254cb75-db18-488e-886f-544f0b8a8516-kube-api-access-lsp5q\") pod \"barbican-worker-8448f7b79-pndf8\" (UID: \"f254cb75-db18-488e-886f-544f0b8a8516\") " pod="openstack/barbican-worker-8448f7b79-pndf8" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.243727 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwn6h\" (UniqueName: \"kubernetes.io/projected/a84fcd7a-0eac-4d23-832e-e632bd4f971f-kube-api-access-vwn6h\") pod \"barbican-keystone-listener-6c88b6769d-nzzc6\" (UID: \"a84fcd7a-0eac-4d23-832e-e632bd4f971f\") " pod="openstack/barbican-keystone-listener-6c88b6769d-nzzc6" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.244012 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f254cb75-db18-488e-886f-544f0b8a8516-config-data-custom\") pod \"barbican-worker-8448f7b79-pndf8\" (UID: \"f254cb75-db18-488e-886f-544f0b8a8516\") " pod="openstack/barbican-worker-8448f7b79-pndf8" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.244075 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b-config-data-custom\") pod \"barbican-api-d867cf67d-79hj9\" (UID: \"53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b\") " pod="openstack/barbican-api-d867cf67d-79hj9" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.244106 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b-combined-ca-bundle\") pod \"barbican-api-d867cf67d-79hj9\" (UID: \"53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b\") " pod="openstack/barbican-api-d867cf67d-79hj9" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.244125 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b-logs\") pod \"barbican-api-d867cf67d-79hj9\" (UID: \"53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b\") " pod="openstack/barbican-api-d867cf67d-79hj9" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.244178 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ddd12850-b0cc-4119-9ba4-bf5a893f41a7-dns-svc\") pod \"dnsmasq-dns-85ff748b95-8kx6w\" (UID: \"ddd12850-b0cc-4119-9ba4-bf5a893f41a7\") " pod="openstack/dnsmasq-dns-85ff748b95-8kx6w" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.244193 4743 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ddd12850-b0cc-4119-9ba4-bf5a893f41a7-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-8kx6w\" (UID: \"ddd12850-b0cc-4119-9ba4-bf5a893f41a7\") " pod="openstack/dnsmasq-dns-85ff748b95-8kx6w" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.244206 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b-config-data\") pod \"barbican-api-d867cf67d-79hj9\" (UID: \"53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b\") " pod="openstack/barbican-api-d867cf67d-79hj9" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.244247 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpngr\" (UniqueName: \"kubernetes.io/projected/53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b-kube-api-access-xpngr\") pod \"barbican-api-d867cf67d-79hj9\" (UID: \"53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b\") " pod="openstack/barbican-api-d867cf67d-79hj9" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.244262 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a84fcd7a-0eac-4d23-832e-e632bd4f971f-combined-ca-bundle\") pod \"barbican-keystone-listener-6c88b6769d-nzzc6\" (UID: \"a84fcd7a-0eac-4d23-832e-e632bd4f971f\") " pod="openstack/barbican-keystone-listener-6c88b6769d-nzzc6" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.244292 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkdnx\" (UniqueName: \"kubernetes.io/projected/ddd12850-b0cc-4119-9ba4-bf5a893f41a7-kube-api-access-dkdnx\") pod \"dnsmasq-dns-85ff748b95-8kx6w\" (UID: \"ddd12850-b0cc-4119-9ba4-bf5a893f41a7\") " pod="openstack/dnsmasq-dns-85ff748b95-8kx6w" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.244325 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ddd12850-b0cc-4119-9ba4-bf5a893f41a7-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-8kx6w\" (UID: \"ddd12850-b0cc-4119-9ba4-bf5a893f41a7\") " pod="openstack/dnsmasq-dns-85ff748b95-8kx6w" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.245140 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddd12850-b0cc-4119-9ba4-bf5a893f41a7-config\") pod \"dnsmasq-dns-85ff748b95-8kx6w\" (UID: \"ddd12850-b0cc-4119-9ba4-bf5a893f41a7\") " pod="openstack/dnsmasq-dns-85ff748b95-8kx6w" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.245189 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a84fcd7a-0eac-4d23-832e-e632bd4f971f-config-data-custom\") pod \"barbican-keystone-listener-6c88b6769d-nzzc6\" (UID: \"a84fcd7a-0eac-4d23-832e-e632bd4f971f\") " pod="openstack/barbican-keystone-listener-6c88b6769d-nzzc6" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.245239 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a84fcd7a-0eac-4d23-832e-e632bd4f971f-config-data\") pod \"barbican-keystone-listener-6c88b6769d-nzzc6\" (UID: \"a84fcd7a-0eac-4d23-832e-e632bd4f971f\") " 
pod="openstack/barbican-keystone-listener-6c88b6769d-nzzc6" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.245259 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a84fcd7a-0eac-4d23-832e-e632bd4f971f-logs\") pod \"barbican-keystone-listener-6c88b6769d-nzzc6\" (UID: \"a84fcd7a-0eac-4d23-832e-e632bd4f971f\") " pod="openstack/barbican-keystone-listener-6c88b6769d-nzzc6" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.245319 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f254cb75-db18-488e-886f-544f0b8a8516-logs\") pod \"barbican-worker-8448f7b79-pndf8\" (UID: \"f254cb75-db18-488e-886f-544f0b8a8516\") " pod="openstack/barbican-worker-8448f7b79-pndf8" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.245345 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ddd12850-b0cc-4119-9ba4-bf5a893f41a7-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-8kx6w\" (UID: \"ddd12850-b0cc-4119-9ba4-bf5a893f41a7\") " pod="openstack/dnsmasq-dns-85ff748b95-8kx6w" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.245373 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f254cb75-db18-488e-886f-544f0b8a8516-combined-ca-bundle\") pod \"barbican-worker-8448f7b79-pndf8\" (UID: \"f254cb75-db18-488e-886f-544f0b8a8516\") " pod="openstack/barbican-worker-8448f7b79-pndf8" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.251418 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b-config-data-custom\") pod \"barbican-api-d867cf67d-79hj9\" (UID: \"53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b\") " pod="openstack/barbican-api-d867cf67d-79hj9" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.252156 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ddd12850-b0cc-4119-9ba4-bf5a893f41a7-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-8kx6w\" (UID: \"ddd12850-b0cc-4119-9ba4-bf5a893f41a7\") " pod="openstack/dnsmasq-dns-85ff748b95-8kx6w" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.252844 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddd12850-b0cc-4119-9ba4-bf5a893f41a7-config\") pod \"dnsmasq-dns-85ff748b95-8kx6w\" (UID: \"ddd12850-b0cc-4119-9ba4-bf5a893f41a7\") " pod="openstack/dnsmasq-dns-85ff748b95-8kx6w" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.253393 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ddd12850-b0cc-4119-9ba4-bf5a893f41a7-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-8kx6w\" (UID: \"ddd12850-b0cc-4119-9ba4-bf5a893f41a7\") " pod="openstack/dnsmasq-dns-85ff748b95-8kx6w" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.254221 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b-combined-ca-bundle\") pod \"barbican-api-d867cf67d-79hj9\" (UID: \"53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b\") " 
pod="openstack/barbican-api-d867cf67d-79hj9" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.254617 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ddd12850-b0cc-4119-9ba4-bf5a893f41a7-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-8kx6w\" (UID: \"ddd12850-b0cc-4119-9ba4-bf5a893f41a7\") " pod="openstack/dnsmasq-dns-85ff748b95-8kx6w" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.254928 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b-logs\") pod \"barbican-api-d867cf67d-79hj9\" (UID: \"53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b\") " pod="openstack/barbican-api-d867cf67d-79hj9" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.255428 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ddd12850-b0cc-4119-9ba4-bf5a893f41a7-dns-svc\") pod \"dnsmasq-dns-85ff748b95-8kx6w\" (UID: \"ddd12850-b0cc-4119-9ba4-bf5a893f41a7\") " pod="openstack/dnsmasq-dns-85ff748b95-8kx6w" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.270274 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b-config-data\") pod \"barbican-api-d867cf67d-79hj9\" (UID: \"53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b\") " pod="openstack/barbican-api-d867cf67d-79hj9" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.280544 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkdnx\" (UniqueName: \"kubernetes.io/projected/ddd12850-b0cc-4119-9ba4-bf5a893f41a7-kube-api-access-dkdnx\") pod \"dnsmasq-dns-85ff748b95-8kx6w\" (UID: \"ddd12850-b0cc-4119-9ba4-bf5a893f41a7\") " pod="openstack/dnsmasq-dns-85ff748b95-8kx6w" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.283174 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpngr\" (UniqueName: \"kubernetes.io/projected/53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b-kube-api-access-xpngr\") pod \"barbican-api-d867cf67d-79hj9\" (UID: \"53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b\") " pod="openstack/barbican-api-d867cf67d-79hj9" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.323050 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-8kx6w" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.352258 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a84fcd7a-0eac-4d23-832e-e632bd4f971f-config-data\") pod \"barbican-keystone-listener-6c88b6769d-nzzc6\" (UID: \"a84fcd7a-0eac-4d23-832e-e632bd4f971f\") " pod="openstack/barbican-keystone-listener-6c88b6769d-nzzc6" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.352300 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a84fcd7a-0eac-4d23-832e-e632bd4f971f-logs\") pod \"barbican-keystone-listener-6c88b6769d-nzzc6\" (UID: \"a84fcd7a-0eac-4d23-832e-e632bd4f971f\") " pod="openstack/barbican-keystone-listener-6c88b6769d-nzzc6" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.352343 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e9f28d1-fbb1-4c58-9a9b-3439b902505a-combined-ca-bundle\") pod \"barbican-api-59d596989d-xmgwp\" (UID: \"1e9f28d1-fbb1-4c58-9a9b-3439b902505a\") " pod="openstack/barbican-api-59d596989d-xmgwp" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.352363 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f254cb75-db18-488e-886f-544f0b8a8516-logs\") pod \"barbican-worker-8448f7b79-pndf8\" (UID: \"f254cb75-db18-488e-886f-544f0b8a8516\") " pod="openstack/barbican-worker-8448f7b79-pndf8" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.352388 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f254cb75-db18-488e-886f-544f0b8a8516-combined-ca-bundle\") pod \"barbican-worker-8448f7b79-pndf8\" (UID: \"f254cb75-db18-488e-886f-544f0b8a8516\") " pod="openstack/barbican-worker-8448f7b79-pndf8" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.352586 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f254cb75-db18-488e-886f-544f0b8a8516-config-data\") pod \"barbican-worker-8448f7b79-pndf8\" (UID: \"f254cb75-db18-488e-886f-544f0b8a8516\") " pod="openstack/barbican-worker-8448f7b79-pndf8" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.352620 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsp5q\" (UniqueName: \"kubernetes.io/projected/f254cb75-db18-488e-886f-544f0b8a8516-kube-api-access-lsp5q\") pod \"barbican-worker-8448f7b79-pndf8\" (UID: \"f254cb75-db18-488e-886f-544f0b8a8516\") " pod="openstack/barbican-worker-8448f7b79-pndf8" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.352648 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwn6h\" (UniqueName: \"kubernetes.io/projected/a84fcd7a-0eac-4d23-832e-e632bd4f971f-kube-api-access-vwn6h\") pod \"barbican-keystone-listener-6c88b6769d-nzzc6\" (UID: \"a84fcd7a-0eac-4d23-832e-e632bd4f971f\") " pod="openstack/barbican-keystone-listener-6c88b6769d-nzzc6" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.352666 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/f254cb75-db18-488e-886f-544f0b8a8516-config-data-custom\") pod \"barbican-worker-8448f7b79-pndf8\" (UID: \"f254cb75-db18-488e-886f-544f0b8a8516\") " pod="openstack/barbican-worker-8448f7b79-pndf8" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.352751 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e9f28d1-fbb1-4c58-9a9b-3439b902505a-config-data\") pod \"barbican-api-59d596989d-xmgwp\" (UID: \"1e9f28d1-fbb1-4c58-9a9b-3439b902505a\") " pod="openstack/barbican-api-59d596989d-xmgwp" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.352770 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a84fcd7a-0eac-4d23-832e-e632bd4f971f-combined-ca-bundle\") pod \"barbican-keystone-listener-6c88b6769d-nzzc6\" (UID: \"a84fcd7a-0eac-4d23-832e-e632bd4f971f\") " pod="openstack/barbican-keystone-listener-6c88b6769d-nzzc6" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.352828 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1e9f28d1-fbb1-4c58-9a9b-3439b902505a-config-data-custom\") pod \"barbican-api-59d596989d-xmgwp\" (UID: \"1e9f28d1-fbb1-4c58-9a9b-3439b902505a\") " pod="openstack/barbican-api-59d596989d-xmgwp" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.352861 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e9f28d1-fbb1-4c58-9a9b-3439b902505a-logs\") pod \"barbican-api-59d596989d-xmgwp\" (UID: \"1e9f28d1-fbb1-4c58-9a9b-3439b902505a\") " pod="openstack/barbican-api-59d596989d-xmgwp" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.352890 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a84fcd7a-0eac-4d23-832e-e632bd4f971f-config-data-custom\") pod \"barbican-keystone-listener-6c88b6769d-nzzc6\" (UID: \"a84fcd7a-0eac-4d23-832e-e632bd4f971f\") " pod="openstack/barbican-keystone-listener-6c88b6769d-nzzc6" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.352906 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m67tq\" (UniqueName: \"kubernetes.io/projected/1e9f28d1-fbb1-4c58-9a9b-3439b902505a-kube-api-access-m67tq\") pod \"barbican-api-59d596989d-xmgwp\" (UID: \"1e9f28d1-fbb1-4c58-9a9b-3439b902505a\") " pod="openstack/barbican-api-59d596989d-xmgwp" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.354389 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a84fcd7a-0eac-4d23-832e-e632bd4f971f-logs\") pod \"barbican-keystone-listener-6c88b6769d-nzzc6\" (UID: \"a84fcd7a-0eac-4d23-832e-e632bd4f971f\") " pod="openstack/barbican-keystone-listener-6c88b6769d-nzzc6" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.354735 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f254cb75-db18-488e-886f-544f0b8a8516-logs\") pod \"barbican-worker-8448f7b79-pndf8\" (UID: \"f254cb75-db18-488e-886f-544f0b8a8516\") " pod="openstack/barbican-worker-8448f7b79-pndf8" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.358023 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a84fcd7a-0eac-4d23-832e-e632bd4f971f-combined-ca-bundle\") pod \"barbican-keystone-listener-6c88b6769d-nzzc6\" (UID: \"a84fcd7a-0eac-4d23-832e-e632bd4f971f\") " pod="openstack/barbican-keystone-listener-6c88b6769d-nzzc6" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.359846 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a84fcd7a-0eac-4d23-832e-e632bd4f971f-config-data\") pod \"barbican-keystone-listener-6c88b6769d-nzzc6\" (UID: \"a84fcd7a-0eac-4d23-832e-e632bd4f971f\") " pod="openstack/barbican-keystone-listener-6c88b6769d-nzzc6" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.372325 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f254cb75-db18-488e-886f-544f0b8a8516-config-data\") pod \"barbican-worker-8448f7b79-pndf8\" (UID: \"f254cb75-db18-488e-886f-544f0b8a8516\") " pod="openstack/barbican-worker-8448f7b79-pndf8" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.372692 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f254cb75-db18-488e-886f-544f0b8a8516-config-data-custom\") pod \"barbican-worker-8448f7b79-pndf8\" (UID: \"f254cb75-db18-488e-886f-544f0b8a8516\") " pod="openstack/barbican-worker-8448f7b79-pndf8" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.372838 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f254cb75-db18-488e-886f-544f0b8a8516-combined-ca-bundle\") pod \"barbican-worker-8448f7b79-pndf8\" (UID: \"f254cb75-db18-488e-886f-544f0b8a8516\") " pod="openstack/barbican-worker-8448f7b79-pndf8" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.374922 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a84fcd7a-0eac-4d23-832e-e632bd4f971f-config-data-custom\") pod \"barbican-keystone-listener-6c88b6769d-nzzc6\" (UID: \"a84fcd7a-0eac-4d23-832e-e632bd4f971f\") " pod="openstack/barbican-keystone-listener-6c88b6769d-nzzc6" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.376823 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwn6h\" (UniqueName: \"kubernetes.io/projected/a84fcd7a-0eac-4d23-832e-e632bd4f971f-kube-api-access-vwn6h\") pod \"barbican-keystone-listener-6c88b6769d-nzzc6\" (UID: \"a84fcd7a-0eac-4d23-832e-e632bd4f971f\") " pod="openstack/barbican-keystone-listener-6c88b6769d-nzzc6" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.379325 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsp5q\" (UniqueName: \"kubernetes.io/projected/f254cb75-db18-488e-886f-544f0b8a8516-kube-api-access-lsp5q\") pod \"barbican-worker-8448f7b79-pndf8\" (UID: \"f254cb75-db18-488e-886f-544f0b8a8516\") " pod="openstack/barbican-worker-8448f7b79-pndf8" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.389650 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6c88b6769d-nzzc6" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.450417 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-8448f7b79-pndf8" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.455456 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1e9f28d1-fbb1-4c58-9a9b-3439b902505a-config-data-custom\") pod \"barbican-api-59d596989d-xmgwp\" (UID: \"1e9f28d1-fbb1-4c58-9a9b-3439b902505a\") " pod="openstack/barbican-api-59d596989d-xmgwp" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.455564 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e9f28d1-fbb1-4c58-9a9b-3439b902505a-logs\") pod \"barbican-api-59d596989d-xmgwp\" (UID: \"1e9f28d1-fbb1-4c58-9a9b-3439b902505a\") " pod="openstack/barbican-api-59d596989d-xmgwp" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.455609 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m67tq\" (UniqueName: \"kubernetes.io/projected/1e9f28d1-fbb1-4c58-9a9b-3439b902505a-kube-api-access-m67tq\") pod \"barbican-api-59d596989d-xmgwp\" (UID: \"1e9f28d1-fbb1-4c58-9a9b-3439b902505a\") " pod="openstack/barbican-api-59d596989d-xmgwp" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.455688 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e9f28d1-fbb1-4c58-9a9b-3439b902505a-combined-ca-bundle\") pod \"barbican-api-59d596989d-xmgwp\" (UID: \"1e9f28d1-fbb1-4c58-9a9b-3439b902505a\") " pod="openstack/barbican-api-59d596989d-xmgwp" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.456042 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e9f28d1-fbb1-4c58-9a9b-3439b902505a-config-data\") pod \"barbican-api-59d596989d-xmgwp\" (UID: \"1e9f28d1-fbb1-4c58-9a9b-3439b902505a\") " pod="openstack/barbican-api-59d596989d-xmgwp" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.457693 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e9f28d1-fbb1-4c58-9a9b-3439b902505a-logs\") pod \"barbican-api-59d596989d-xmgwp\" (UID: \"1e9f28d1-fbb1-4c58-9a9b-3439b902505a\") " pod="openstack/barbican-api-59d596989d-xmgwp" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.461752 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1e9f28d1-fbb1-4c58-9a9b-3439b902505a-config-data-custom\") pod \"barbican-api-59d596989d-xmgwp\" (UID: \"1e9f28d1-fbb1-4c58-9a9b-3439b902505a\") " pod="openstack/barbican-api-59d596989d-xmgwp" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.471304 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e9f28d1-fbb1-4c58-9a9b-3439b902505a-combined-ca-bundle\") pod \"barbican-api-59d596989d-xmgwp\" (UID: \"1e9f28d1-fbb1-4c58-9a9b-3439b902505a\") " pod="openstack/barbican-api-59d596989d-xmgwp" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.478398 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-d867cf67d-79hj9" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.481336 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e9f28d1-fbb1-4c58-9a9b-3439b902505a-config-data\") pod \"barbican-api-59d596989d-xmgwp\" (UID: \"1e9f28d1-fbb1-4c58-9a9b-3439b902505a\") " pod="openstack/barbican-api-59d596989d-xmgwp" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.491321 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m67tq\" (UniqueName: \"kubernetes.io/projected/1e9f28d1-fbb1-4c58-9a9b-3439b902505a-kube-api-access-m67tq\") pod \"barbican-api-59d596989d-xmgwp\" (UID: \"1e9f28d1-fbb1-4c58-9a9b-3439b902505a\") " pod="openstack/barbican-api-59d596989d-xmgwp" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.513573 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-59d596989d-xmgwp" Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.657110 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5f47b7b66b-mfhcg"] Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.799096 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56bdc765bf-xnb2x" event={"ID":"b8ee30f9-a6ed-4aa2-b834-facef3c284fe","Type":"ContainerStarted","Data":"cbcaa6d5f04c83ed4dd21bb53079e388e02f2bff5fed9a9ac0baa4452e704d7b"} Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.799148 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56bdc765bf-xnb2x" event={"ID":"b8ee30f9-a6ed-4aa2-b834-facef3c284fe","Type":"ContainerStarted","Data":"1ee883debc82e7cb2aaeee5362b21ea7a1cdd7d98662ff17df82b291aa89fe64"} Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.821695 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"126e7829-d6f7-4443-b4f6-02669ff5fbc7","Type":"ContainerStarted","Data":"aa612fc06777abc239bb03ac9abf35e72999a6728284582ce589f9faacf6122a"} Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.854973 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-665544959-z46xh"] Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.880856 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-77bd86cd86-kqp9m"] Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.889170 4743 generic.go:334] "Generic (PLEG): container finished" podID="457d7aae-7205-4d41-bf26-e0defaa8d96c" containerID="fa9c223f5488aee0a6d2afbba3f26b506024e408298d6cffcae140721cb03eea" exitCode=0 Jan 22 14:04:38 crc kubenswrapper[4743]: I0122 14:04:38.889264 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-rsqh8" event={"ID":"457d7aae-7205-4d41-bf26-e0defaa8d96c","Type":"ContainerDied","Data":"fa9c223f5488aee0a6d2afbba3f26b506024e408298d6cffcae140721cb03eea"} Jan 22 14:04:39 crc kubenswrapper[4743]: I0122 14:04:38.997509 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7bd7ccdcfb-sx7wp" event={"ID":"de2c5f93-9ee3-4723-8123-bd48d5385423","Type":"ContainerStarted","Data":"1ae6a9f97c3b7fc665caf057bbe7901477be00bed6bc5e379b53f6b4f6925f62"} Jan 22 14:04:39 crc kubenswrapper[4743]: I0122 14:04:39.060715 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"eefba4cb-766b-45a4-b832-83c9ef83a30b","Type":"ContainerStarted","Data":"03296458679a7c6b8b657b085da182ca7c385ecb2b24f18d062f776e2e9d1913"} Jan 22 14:04:39 crc kubenswrapper[4743]: I0122 14:04:39.084992 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5774494bd8-6dt7x"] Jan 22 14:04:39 crc kubenswrapper[4743]: I0122 14:04:39.333112 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-8kx6w"] Jan 22 14:04:39 crc kubenswrapper[4743]: I0122 14:04:39.417909 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6c88b6769d-nzzc6"] Jan 22 14:04:39 crc kubenswrapper[4743]: I0122 14:04:39.696454 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-d867cf67d-79hj9"] Jan 22 14:04:39 crc kubenswrapper[4743]: I0122 14:04:39.794719 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f840eb1a-7e3e-4aa4-acea-96cefb593807" path="/var/lib/kubelet/pods/f840eb1a-7e3e-4aa4-acea-96cefb593807/volumes" Jan 22 14:04:39 crc kubenswrapper[4743]: I0122 14:04:39.866097 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-59d596989d-xmgwp"] Jan 22 14:04:39 crc kubenswrapper[4743]: I0122 14:04:39.904450 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-8448f7b79-pndf8"] Jan 22 14:04:39 crc kubenswrapper[4743]: I0122 14:04:39.969754 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-rsqh8" Jan 22 14:04:40 crc kubenswrapper[4743]: I0122 14:04:40.111767 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/457d7aae-7205-4d41-bf26-e0defaa8d96c-config\") pod \"457d7aae-7205-4d41-bf26-e0defaa8d96c\" (UID: \"457d7aae-7205-4d41-bf26-e0defaa8d96c\") " Jan 22 14:04:40 crc kubenswrapper[4743]: I0122 14:04:40.111951 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/457d7aae-7205-4d41-bf26-e0defaa8d96c-dns-svc\") pod \"457d7aae-7205-4d41-bf26-e0defaa8d96c\" (UID: \"457d7aae-7205-4d41-bf26-e0defaa8d96c\") " Jan 22 14:04:40 crc kubenswrapper[4743]: I0122 14:04:40.112059 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/457d7aae-7205-4d41-bf26-e0defaa8d96c-ovsdbserver-sb\") pod \"457d7aae-7205-4d41-bf26-e0defaa8d96c\" (UID: \"457d7aae-7205-4d41-bf26-e0defaa8d96c\") " Jan 22 14:04:40 crc kubenswrapper[4743]: I0122 14:04:40.112092 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wcx8\" (UniqueName: \"kubernetes.io/projected/457d7aae-7205-4d41-bf26-e0defaa8d96c-kube-api-access-2wcx8\") pod \"457d7aae-7205-4d41-bf26-e0defaa8d96c\" (UID: \"457d7aae-7205-4d41-bf26-e0defaa8d96c\") " Jan 22 14:04:40 crc kubenswrapper[4743]: I0122 14:04:40.112123 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/457d7aae-7205-4d41-bf26-e0defaa8d96c-dns-swift-storage-0\") pod \"457d7aae-7205-4d41-bf26-e0defaa8d96c\" (UID: \"457d7aae-7205-4d41-bf26-e0defaa8d96c\") " Jan 22 14:04:40 crc kubenswrapper[4743]: I0122 14:04:40.112164 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/457d7aae-7205-4d41-bf26-e0defaa8d96c-ovsdbserver-nb\") pod \"457d7aae-7205-4d41-bf26-e0defaa8d96c\" (UID: \"457d7aae-7205-4d41-bf26-e0defaa8d96c\") " Jan 22 14:04:40 crc kubenswrapper[4743]: I0122 14:04:40.127292 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/457d7aae-7205-4d41-bf26-e0defaa8d96c-kube-api-access-2wcx8" (OuterVolumeSpecName: "kube-api-access-2wcx8") pod "457d7aae-7205-4d41-bf26-e0defaa8d96c" (UID: "457d7aae-7205-4d41-bf26-e0defaa8d96c"). InnerVolumeSpecName "kube-api-access-2wcx8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:04:40 crc kubenswrapper[4743]: I0122 14:04:40.141912 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/457d7aae-7205-4d41-bf26-e0defaa8d96c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "457d7aae-7205-4d41-bf26-e0defaa8d96c" (UID: "457d7aae-7205-4d41-bf26-e0defaa8d96c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:40 crc kubenswrapper[4743]: I0122 14:04:40.157341 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/457d7aae-7205-4d41-bf26-e0defaa8d96c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "457d7aae-7205-4d41-bf26-e0defaa8d96c" (UID: "457d7aae-7205-4d41-bf26-e0defaa8d96c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:40 crc kubenswrapper[4743]: I0122 14:04:40.167573 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-665544959-z46xh" event={"ID":"d95db954-8e59-44ac-ae17-788b0fbcb177","Type":"ContainerStarted","Data":"5f7a66d3543092b4d04f2117d97334090d6c4014db53c439cd7a86a17eab5075"} Jan 22 14:04:40 crc kubenswrapper[4743]: I0122 14:04:40.172245 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/457d7aae-7205-4d41-bf26-e0defaa8d96c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "457d7aae-7205-4d41-bf26-e0defaa8d96c" (UID: "457d7aae-7205-4d41-bf26-e0defaa8d96c"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:40 crc kubenswrapper[4743]: I0122 14:04:40.175581 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5774494bd8-6dt7x" event={"ID":"ec33de7c-5eab-46d0-a702-af5fbd2ebe50","Type":"ContainerStarted","Data":"f3053221a5c4225b8ab8bbcfe202970a3280bafc0b4d9adbc95c42398e2680ad"} Jan 22 14:04:40 crc kubenswrapper[4743]: I0122 14:04:40.180403 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-59d596989d-xmgwp" event={"ID":"1e9f28d1-fbb1-4c58-9a9b-3439b902505a","Type":"ContainerStarted","Data":"72b448bad62e3aebc5e2ef3477478535c7413d29992ed9b309560ebf787502b8"} Jan 22 14:04:40 crc kubenswrapper[4743]: I0122 14:04:40.182046 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-8448f7b79-pndf8" event={"ID":"f254cb75-db18-488e-886f-544f0b8a8516","Type":"ContainerStarted","Data":"07b12fd481e049c6648e86fccac909662c181a11b506bfa82524579fc9b35d2f"} Jan 22 14:04:40 crc kubenswrapper[4743]: I0122 14:04:40.195702 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-rsqh8" event={"ID":"457d7aae-7205-4d41-bf26-e0defaa8d96c","Type":"ContainerDied","Data":"50eb96cd11787c025e902132666b14a37541199f47655a348bbdfafda4338337"} Jan 22 14:04:40 crc kubenswrapper[4743]: I0122 14:04:40.195741 4743 scope.go:117] "RemoveContainer" containerID="fa9c223f5488aee0a6d2afbba3f26b506024e408298d6cffcae140721cb03eea" Jan 22 14:04:40 crc kubenswrapper[4743]: I0122 14:04:40.195864 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-rsqh8" Jan 22 14:04:40 crc kubenswrapper[4743]: I0122 14:04:40.196099 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/457d7aae-7205-4d41-bf26-e0defaa8d96c-config" (OuterVolumeSpecName: "config") pod "457d7aae-7205-4d41-bf26-e0defaa8d96c" (UID: "457d7aae-7205-4d41-bf26-e0defaa8d96c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:40 crc kubenswrapper[4743]: I0122 14:04:40.197376 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d867cf67d-79hj9" event={"ID":"53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b","Type":"ContainerStarted","Data":"9b2bddac3919047ed76502731a88ad89c436782c82e311858542d6aa3ea8c7a9"} Jan 22 14:04:40 crc kubenswrapper[4743]: I0122 14:04:40.199464 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/457d7aae-7205-4d41-bf26-e0defaa8d96c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "457d7aae-7205-4d41-bf26-e0defaa8d96c" (UID: "457d7aae-7205-4d41-bf26-e0defaa8d96c"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:40 crc kubenswrapper[4743]: I0122 14:04:40.204395 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6c88b6769d-nzzc6" event={"ID":"a84fcd7a-0eac-4d23-832e-e632bd4f971f","Type":"ContainerStarted","Data":"8ae84dd566f3a9583899612a25eb14069eff760d84e40850db8a4ec1164483c4"} Jan 22 14:04:40 crc kubenswrapper[4743]: I0122 14:04:40.208919 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5f47b7b66b-mfhcg" event={"ID":"cd3df106-ec34-42ad-bf5d-f963b9bb0871","Type":"ContainerStarted","Data":"bf9daeb3a9b7bc31fd69e20e50b03e9200a61c5e70e5c40b3337b89f22033e4e"} Jan 22 14:04:40 crc kubenswrapper[4743]: I0122 14:04:40.208995 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5f47b7b66b-mfhcg" event={"ID":"cd3df106-ec34-42ad-bf5d-f963b9bb0871","Type":"ContainerStarted","Data":"a1aa0332d7dbd3b78bc7b728846574a0edb37ec155c8de2044d393d3e060a5d5"} Jan 22 14:04:40 crc kubenswrapper[4743]: I0122 14:04:40.210203 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-5f47b7b66b-mfhcg" Jan 22 14:04:40 crc kubenswrapper[4743]: I0122 14:04:40.222818 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wcx8\" (UniqueName: \"kubernetes.io/projected/457d7aae-7205-4d41-bf26-e0defaa8d96c-kube-api-access-2wcx8\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:40 crc kubenswrapper[4743]: I0122 14:04:40.222842 4743 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/457d7aae-7205-4d41-bf26-e0defaa8d96c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:40 crc kubenswrapper[4743]: I0122 14:04:40.222857 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/457d7aae-7205-4d41-bf26-e0defaa8d96c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:40 crc kubenswrapper[4743]: I0122 14:04:40.225514 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/457d7aae-7205-4d41-bf26-e0defaa8d96c-config\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:40 crc kubenswrapper[4743]: I0122 14:04:40.225526 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/457d7aae-7205-4d41-bf26-e0defaa8d96c-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:40 crc kubenswrapper[4743]: I0122 14:04:40.225535 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/457d7aae-7205-4d41-bf26-e0defaa8d96c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:40 crc kubenswrapper[4743]: I0122 14:04:40.224512 4743 generic.go:334] "Generic (PLEG): container finished" podID="ddd12850-b0cc-4119-9ba4-bf5a893f41a7" containerID="d06cdf7eee3fe2827c4473a8c5402be79592c9f379b38a5d187088d4468b107f" exitCode=0 Jan 22 14:04:40 crc kubenswrapper[4743]: I0122 14:04:40.224542 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-8kx6w" event={"ID":"ddd12850-b0cc-4119-9ba4-bf5a893f41a7","Type":"ContainerDied","Data":"d06cdf7eee3fe2827c4473a8c5402be79592c9f379b38a5d187088d4468b107f"} Jan 22 14:04:40 crc kubenswrapper[4743]: I0122 14:04:40.225694 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-8kx6w" 
event={"ID":"ddd12850-b0cc-4119-9ba4-bf5a893f41a7","Type":"ContainerStarted","Data":"7f7cb657e0f6bf06518eaa7a3ca82447446c5bf67ab8fdf1eec55c57045171a1"} Jan 22 14:04:40 crc kubenswrapper[4743]: I0122 14:04:40.228556 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7bd7ccdcfb-sx7wp" event={"ID":"de2c5f93-9ee3-4723-8123-bd48d5385423","Type":"ContainerStarted","Data":"ef9ffc5cb4c24c27363689c60af953e2196889a603ec04ab66b8826de1d12f2b"} Jan 22 14:04:40 crc kubenswrapper[4743]: I0122 14:04:40.229616 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7bd7ccdcfb-sx7wp" Jan 22 14:04:40 crc kubenswrapper[4743]: I0122 14:04:40.239312 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-5f47b7b66b-mfhcg" podStartSLOduration=3.239292763 podStartE2EDuration="3.239292763s" podCreationTimestamp="2026-01-22 14:04:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:04:40.230740363 +0000 UTC m=+1116.785783536" watchObservedRunningTime="2026-01-22 14:04:40.239292763 +0000 UTC m=+1116.794335916" Jan 22 14:04:40 crc kubenswrapper[4743]: I0122 14:04:40.244083 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-77bd86cd86-kqp9m" event={"ID":"9315e9cf-2a73-482e-810e-8fd19202915f","Type":"ContainerStarted","Data":"3a2a26c2ee71a89d697273d54014a4fab981c6b608f596b241c25b505d7697bd"} Jan 22 14:04:40 crc kubenswrapper[4743]: I0122 14:04:40.244125 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-77bd86cd86-kqp9m" event={"ID":"9315e9cf-2a73-482e-810e-8fd19202915f","Type":"ContainerStarted","Data":"61f2c071184245f023bdbc5d9b1023bd8135237f7aed59e7e068f78bb05590a1"} Jan 22 14:04:40 crc kubenswrapper[4743]: I0122 14:04:40.251232 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"126e7829-d6f7-4443-b4f6-02669ff5fbc7","Type":"ContainerStarted","Data":"b21504d9ca462caba9d729f866e841df9037b75bb1f78b2711e8c4e122634d6b"} Jan 22 14:04:40 crc kubenswrapper[4743]: I0122 14:04:40.251267 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-56bdc765bf-xnb2x" Jan 22 14:04:40 crc kubenswrapper[4743]: I0122 14:04:40.267917 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7bd7ccdcfb-sx7wp" podStartSLOduration=9.267898121 podStartE2EDuration="9.267898121s" podCreationTimestamp="2026-01-22 14:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:04:40.256978659 +0000 UTC m=+1116.812021832" watchObservedRunningTime="2026-01-22 14:04:40.267898121 +0000 UTC m=+1116.822941284" Jan 22 14:04:40 crc kubenswrapper[4743]: I0122 14:04:40.352221 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-999bfcdc8-ldzdp" podUID="dff52751-78f1-4c39-aa95-5d74a246151e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.145:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.145:8443: connect: connection refused" Jan 22 14:04:40 crc kubenswrapper[4743]: I0122 14:04:40.356582 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-56bdc765bf-xnb2x" podStartSLOduration=7.356559217 podStartE2EDuration="7.356559217s" podCreationTimestamp="2026-01-22 
14:04:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:04:40.352110362 +0000 UTC m=+1116.907153525" watchObservedRunningTime="2026-01-22 14:04:40.356559217 +0000 UTC m=+1116.911602380" Jan 22 14:04:40 crc kubenswrapper[4743]: I0122 14:04:40.449645 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-b7fb54dc6-5q9jf" podUID="e452af10-fc11-4854-bf38-8a90856331d3" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Jan 22 14:04:40 crc kubenswrapper[4743]: I0122 14:04:40.593693 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-rsqh8"] Jan 22 14:04:40 crc kubenswrapper[4743]: I0122 14:04:40.642517 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-rsqh8"] Jan 22 14:04:41 crc kubenswrapper[4743]: I0122 14:04:41.271775 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d867cf67d-79hj9" event={"ID":"53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b","Type":"ContainerStarted","Data":"8c2c3d5a83b5abe37a3429496fdd0fec5740acca2b0579cb4676e495ab480596"} Jan 22 14:04:41 crc kubenswrapper[4743]: I0122 14:04:41.272123 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d867cf67d-79hj9" event={"ID":"53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b","Type":"ContainerStarted","Data":"d1f4bd6b3bf749bc6483bd500388bb3aae9ca7cd5e3947067e454bed02ca9f94"} Jan 22 14:04:41 crc kubenswrapper[4743]: I0122 14:04:41.273327 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-d867cf67d-79hj9" Jan 22 14:04:41 crc kubenswrapper[4743]: I0122 14:04:41.273371 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-d867cf67d-79hj9" Jan 22 14:04:41 crc kubenswrapper[4743]: I0122 14:04:41.283653 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-77bd86cd86-kqp9m" event={"ID":"9315e9cf-2a73-482e-810e-8fd19202915f","Type":"ContainerStarted","Data":"b538a4afbda0d6b112575a94c1ef0f90657715224710f4597d2ba0abef91dd22"} Jan 22 14:04:41 crc kubenswrapper[4743]: I0122 14:04:41.283822 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-77bd86cd86-kqp9m" Jan 22 14:04:41 crc kubenswrapper[4743]: I0122 14:04:41.283858 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-77bd86cd86-kqp9m" Jan 22 14:04:41 crc kubenswrapper[4743]: I0122 14:04:41.298509 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-59d596989d-xmgwp" event={"ID":"1e9f28d1-fbb1-4c58-9a9b-3439b902505a","Type":"ContainerStarted","Data":"1914d432dbf22013451fd6d692ceaf77a94a8b00223aa1c44f08e68d9446457e"} Jan 22 14:04:41 crc kubenswrapper[4743]: I0122 14:04:41.298905 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-59d596989d-xmgwp" Jan 22 14:04:41 crc kubenswrapper[4743]: I0122 14:04:41.299001 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-59d596989d-xmgwp" Jan 22 14:04:41 crc kubenswrapper[4743]: I0122 14:04:41.308997 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-d867cf67d-79hj9" podStartSLOduration=4.308975984 
podStartE2EDuration="4.308975984s" podCreationTimestamp="2026-01-22 14:04:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:04:41.294904581 +0000 UTC m=+1117.849947744" watchObservedRunningTime="2026-01-22 14:04:41.308975984 +0000 UTC m=+1117.864019147" Jan 22 14:04:41 crc kubenswrapper[4743]: I0122 14:04:41.312189 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-8kx6w" event={"ID":"ddd12850-b0cc-4119-9ba4-bf5a893f41a7","Type":"ContainerStarted","Data":"c02664d16f654a79c5e359a21022d4bccac4eb6257f210bf0f74ef7041f53bef"} Jan 22 14:04:41 crc kubenswrapper[4743]: I0122 14:04:41.312676 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-85ff748b95-8kx6w" Jan 22 14:04:41 crc kubenswrapper[4743]: I0122 14:04:41.352230 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-77bd86cd86-kqp9m" podStartSLOduration=4.352212109 podStartE2EDuration="4.352212109s" podCreationTimestamp="2026-01-22 14:04:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:04:41.320259475 +0000 UTC m=+1117.875302638" watchObservedRunningTime="2026-01-22 14:04:41.352212109 +0000 UTC m=+1117.907255272" Jan 22 14:04:41 crc kubenswrapper[4743]: I0122 14:04:41.355651 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-85ff748b95-8kx6w" podStartSLOduration=4.355637567 podStartE2EDuration="4.355637567s" podCreationTimestamp="2026-01-22 14:04:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:04:41.342145999 +0000 UTC m=+1117.897189192" watchObservedRunningTime="2026-01-22 14:04:41.355637567 +0000 UTC m=+1117.910680740" Jan 22 14:04:41 crc kubenswrapper[4743]: I0122 14:04:41.378226 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-59d596989d-xmgwp" podStartSLOduration=3.378207449 podStartE2EDuration="3.378207449s" podCreationTimestamp="2026-01-22 14:04:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:04:41.368432507 +0000 UTC m=+1117.923475670" watchObservedRunningTime="2026-01-22 14:04:41.378207449 +0000 UTC m=+1117.933250612" Jan 22 14:04:41 crc kubenswrapper[4743]: I0122 14:04:41.386156 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"eefba4cb-766b-45a4-b832-83c9ef83a30b","Type":"ContainerStarted","Data":"13483d1ae113b97ff4fc3135d303101705bda4cbbb52409c6e7360413f249d02"} Jan 22 14:04:41 crc kubenswrapper[4743]: I0122 14:04:41.422971 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=10.422947252 podStartE2EDuration="10.422947252s" podCreationTimestamp="2026-01-22 14:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:04:41.413092998 +0000 UTC m=+1117.968136191" watchObservedRunningTime="2026-01-22 14:04:41.422947252 +0000 UTC m=+1117.977990425" Jan 22 14:04:41 crc kubenswrapper[4743]: I0122 14:04:41.462885 4743 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/barbican-api-d867cf67d-79hj9"] Jan 22 14:04:41 crc kubenswrapper[4743]: I0122 14:04:41.489726 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-64fcd75458-9rzfr"] Jan 22 14:04:41 crc kubenswrapper[4743]: E0122 14:04:41.490227 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="457d7aae-7205-4d41-bf26-e0defaa8d96c" containerName="init" Jan 22 14:04:41 crc kubenswrapper[4743]: I0122 14:04:41.490245 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="457d7aae-7205-4d41-bf26-e0defaa8d96c" containerName="init" Jan 22 14:04:41 crc kubenswrapper[4743]: I0122 14:04:41.490437 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="457d7aae-7205-4d41-bf26-e0defaa8d96c" containerName="init" Jan 22 14:04:41 crc kubenswrapper[4743]: I0122 14:04:41.496283 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-64fcd75458-9rzfr" Jan 22 14:04:41 crc kubenswrapper[4743]: I0122 14:04:41.502369 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Jan 22 14:04:41 crc kubenswrapper[4743]: I0122 14:04:41.502547 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Jan 22 14:04:41 crc kubenswrapper[4743]: I0122 14:04:41.508999 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-64fcd75458-9rzfr"] Jan 22 14:04:41 crc kubenswrapper[4743]: I0122 14:04:41.577742 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c4db7649-d1b0-47c2-b5e4-34a552ccee79-config-data-custom\") pod \"barbican-api-64fcd75458-9rzfr\" (UID: \"c4db7649-d1b0-47c2-b5e4-34a552ccee79\") " pod="openstack/barbican-api-64fcd75458-9rzfr" Jan 22 14:04:41 crc kubenswrapper[4743]: I0122 14:04:41.577930 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndr8x\" (UniqueName: \"kubernetes.io/projected/c4db7649-d1b0-47c2-b5e4-34a552ccee79-kube-api-access-ndr8x\") pod \"barbican-api-64fcd75458-9rzfr\" (UID: \"c4db7649-d1b0-47c2-b5e4-34a552ccee79\") " pod="openstack/barbican-api-64fcd75458-9rzfr" Jan 22 14:04:41 crc kubenswrapper[4743]: I0122 14:04:41.577987 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4db7649-d1b0-47c2-b5e4-34a552ccee79-logs\") pod \"barbican-api-64fcd75458-9rzfr\" (UID: \"c4db7649-d1b0-47c2-b5e4-34a552ccee79\") " pod="openstack/barbican-api-64fcd75458-9rzfr" Jan 22 14:04:41 crc kubenswrapper[4743]: I0122 14:04:41.578015 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4db7649-d1b0-47c2-b5e4-34a552ccee79-internal-tls-certs\") pod \"barbican-api-64fcd75458-9rzfr\" (UID: \"c4db7649-d1b0-47c2-b5e4-34a552ccee79\") " pod="openstack/barbican-api-64fcd75458-9rzfr" Jan 22 14:04:41 crc kubenswrapper[4743]: I0122 14:04:41.578042 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4db7649-d1b0-47c2-b5e4-34a552ccee79-config-data\") pod \"barbican-api-64fcd75458-9rzfr\" (UID: \"c4db7649-d1b0-47c2-b5e4-34a552ccee79\") " pod="openstack/barbican-api-64fcd75458-9rzfr" Jan 22 14:04:41 crc 
kubenswrapper[4743]: I0122 14:04:41.578095 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4db7649-d1b0-47c2-b5e4-34a552ccee79-public-tls-certs\") pod \"barbican-api-64fcd75458-9rzfr\" (UID: \"c4db7649-d1b0-47c2-b5e4-34a552ccee79\") " pod="openstack/barbican-api-64fcd75458-9rzfr" Jan 22 14:04:41 crc kubenswrapper[4743]: I0122 14:04:41.578129 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4db7649-d1b0-47c2-b5e4-34a552ccee79-combined-ca-bundle\") pod \"barbican-api-64fcd75458-9rzfr\" (UID: \"c4db7649-d1b0-47c2-b5e4-34a552ccee79\") " pod="openstack/barbican-api-64fcd75458-9rzfr" Jan 22 14:04:41 crc kubenswrapper[4743]: I0122 14:04:41.679974 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c4db7649-d1b0-47c2-b5e4-34a552ccee79-config-data-custom\") pod \"barbican-api-64fcd75458-9rzfr\" (UID: \"c4db7649-d1b0-47c2-b5e4-34a552ccee79\") " pod="openstack/barbican-api-64fcd75458-9rzfr" Jan 22 14:04:41 crc kubenswrapper[4743]: I0122 14:04:41.680129 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndr8x\" (UniqueName: \"kubernetes.io/projected/c4db7649-d1b0-47c2-b5e4-34a552ccee79-kube-api-access-ndr8x\") pod \"barbican-api-64fcd75458-9rzfr\" (UID: \"c4db7649-d1b0-47c2-b5e4-34a552ccee79\") " pod="openstack/barbican-api-64fcd75458-9rzfr" Jan 22 14:04:41 crc kubenswrapper[4743]: I0122 14:04:41.680158 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4db7649-d1b0-47c2-b5e4-34a552ccee79-logs\") pod \"barbican-api-64fcd75458-9rzfr\" (UID: \"c4db7649-d1b0-47c2-b5e4-34a552ccee79\") " pod="openstack/barbican-api-64fcd75458-9rzfr" Jan 22 14:04:41 crc kubenswrapper[4743]: I0122 14:04:41.680182 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4db7649-d1b0-47c2-b5e4-34a552ccee79-internal-tls-certs\") pod \"barbican-api-64fcd75458-9rzfr\" (UID: \"c4db7649-d1b0-47c2-b5e4-34a552ccee79\") " pod="openstack/barbican-api-64fcd75458-9rzfr" Jan 22 14:04:41 crc kubenswrapper[4743]: I0122 14:04:41.680202 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4db7649-d1b0-47c2-b5e4-34a552ccee79-config-data\") pod \"barbican-api-64fcd75458-9rzfr\" (UID: \"c4db7649-d1b0-47c2-b5e4-34a552ccee79\") " pod="openstack/barbican-api-64fcd75458-9rzfr" Jan 22 14:04:41 crc kubenswrapper[4743]: I0122 14:04:41.680251 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4db7649-d1b0-47c2-b5e4-34a552ccee79-public-tls-certs\") pod \"barbican-api-64fcd75458-9rzfr\" (UID: \"c4db7649-d1b0-47c2-b5e4-34a552ccee79\") " pod="openstack/barbican-api-64fcd75458-9rzfr" Jan 22 14:04:41 crc kubenswrapper[4743]: I0122 14:04:41.680277 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4db7649-d1b0-47c2-b5e4-34a552ccee79-combined-ca-bundle\") pod \"barbican-api-64fcd75458-9rzfr\" (UID: \"c4db7649-d1b0-47c2-b5e4-34a552ccee79\") " pod="openstack/barbican-api-64fcd75458-9rzfr" Jan 22 
14:04:41 crc kubenswrapper[4743]: I0122 14:04:41.681701 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4db7649-d1b0-47c2-b5e4-34a552ccee79-logs\") pod \"barbican-api-64fcd75458-9rzfr\" (UID: \"c4db7649-d1b0-47c2-b5e4-34a552ccee79\") " pod="openstack/barbican-api-64fcd75458-9rzfr" Jan 22 14:04:41 crc kubenswrapper[4743]: I0122 14:04:41.687882 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4db7649-d1b0-47c2-b5e4-34a552ccee79-combined-ca-bundle\") pod \"barbican-api-64fcd75458-9rzfr\" (UID: \"c4db7649-d1b0-47c2-b5e4-34a552ccee79\") " pod="openstack/barbican-api-64fcd75458-9rzfr" Jan 22 14:04:41 crc kubenswrapper[4743]: I0122 14:04:41.687894 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c4db7649-d1b0-47c2-b5e4-34a552ccee79-config-data-custom\") pod \"barbican-api-64fcd75458-9rzfr\" (UID: \"c4db7649-d1b0-47c2-b5e4-34a552ccee79\") " pod="openstack/barbican-api-64fcd75458-9rzfr" Jan 22 14:04:41 crc kubenswrapper[4743]: I0122 14:04:41.688290 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4db7649-d1b0-47c2-b5e4-34a552ccee79-public-tls-certs\") pod \"barbican-api-64fcd75458-9rzfr\" (UID: \"c4db7649-d1b0-47c2-b5e4-34a552ccee79\") " pod="openstack/barbican-api-64fcd75458-9rzfr" Jan 22 14:04:41 crc kubenswrapper[4743]: I0122 14:04:41.688696 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4db7649-d1b0-47c2-b5e4-34a552ccee79-internal-tls-certs\") pod \"barbican-api-64fcd75458-9rzfr\" (UID: \"c4db7649-d1b0-47c2-b5e4-34a552ccee79\") " pod="openstack/barbican-api-64fcd75458-9rzfr" Jan 22 14:04:41 crc kubenswrapper[4743]: I0122 14:04:41.692108 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4db7649-d1b0-47c2-b5e4-34a552ccee79-config-data\") pod \"barbican-api-64fcd75458-9rzfr\" (UID: \"c4db7649-d1b0-47c2-b5e4-34a552ccee79\") " pod="openstack/barbican-api-64fcd75458-9rzfr" Jan 22 14:04:41 crc kubenswrapper[4743]: I0122 14:04:41.720364 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndr8x\" (UniqueName: \"kubernetes.io/projected/c4db7649-d1b0-47c2-b5e4-34a552ccee79-kube-api-access-ndr8x\") pod \"barbican-api-64fcd75458-9rzfr\" (UID: \"c4db7649-d1b0-47c2-b5e4-34a552ccee79\") " pod="openstack/barbican-api-64fcd75458-9rzfr" Jan 22 14:04:41 crc kubenswrapper[4743]: I0122 14:04:41.786073 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="457d7aae-7205-4d41-bf26-e0defaa8d96c" path="/var/lib/kubelet/pods/457d7aae-7205-4d41-bf26-e0defaa8d96c/volumes" Jan 22 14:04:41 crc kubenswrapper[4743]: I0122 14:04:41.852674 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-64fcd75458-9rzfr" Jan 22 14:04:42 crc kubenswrapper[4743]: I0122 14:04:42.089538 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 22 14:04:42 crc kubenswrapper[4743]: I0122 14:04:42.089845 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 22 14:04:42 crc kubenswrapper[4743]: I0122 14:04:42.148563 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 22 14:04:42 crc kubenswrapper[4743]: I0122 14:04:42.221555 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 22 14:04:42 crc kubenswrapper[4743]: I0122 14:04:42.403893 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"126e7829-d6f7-4443-b4f6-02669ff5fbc7","Type":"ContainerStarted","Data":"29c1e5e36d9d852d81dd4508b41dfb0eff77adb4114dae21485004a359a2f0f7"} Jan 22 14:04:42 crc kubenswrapper[4743]: I0122 14:04:42.409641 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-59d596989d-xmgwp" event={"ID":"1e9f28d1-fbb1-4c58-9a9b-3439b902505a","Type":"ContainerStarted","Data":"2306c268bbd8a02a9ba0e62cd3ea6575a384efe9651061abb1bfb85a27b2c6b6"} Jan 22 14:04:42 crc kubenswrapper[4743]: I0122 14:04:42.410971 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 22 14:04:42 crc kubenswrapper[4743]: I0122 14:04:42.412426 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 22 14:04:42 crc kubenswrapper[4743]: I0122 14:04:42.435886 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.435868928 podStartE2EDuration="6.435868928s" podCreationTimestamp="2026-01-22 14:04:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:04:42.430197152 +0000 UTC m=+1118.985240325" watchObservedRunningTime="2026-01-22 14:04:42.435868928 +0000 UTC m=+1118.990912091" Jan 22 14:04:42 crc kubenswrapper[4743]: I0122 14:04:42.491011 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-64fcd75458-9rzfr"] Jan 22 14:04:43 crc kubenswrapper[4743]: W0122 14:04:43.066235 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc4db7649_d1b0_47c2_b5e4_34a552ccee79.slice/crio-ee039fb0076a6ecfd3630a4d4ca377370e6850469eb96389dbd27c28c1a55944 WatchSource:0}: Error finding container ee039fb0076a6ecfd3630a4d4ca377370e6850469eb96389dbd27c28c1a55944: Status 404 returned error can't find the container with id ee039fb0076a6ecfd3630a4d4ca377370e6850469eb96389dbd27c28c1a55944 Jan 22 14:04:43 crc kubenswrapper[4743]: I0122 14:04:43.419420 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-srjxw" event={"ID":"eb22345c-594c-46a3-b362-e34baa8f271c","Type":"ContainerStarted","Data":"795289613c94db2b8fe8ac8e26657900a12882529e4bcc81dddcdcda9648ea1d"} Jan 22 14:04:43 crc kubenswrapper[4743]: I0122 14:04:43.423351 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-64fcd75458-9rzfr" 
event={"ID":"c4db7649-d1b0-47c2-b5e4-34a552ccee79","Type":"ContainerStarted","Data":"ee039fb0076a6ecfd3630a4d4ca377370e6850469eb96389dbd27c28c1a55944"} Jan 22 14:04:43 crc kubenswrapper[4743]: I0122 14:04:43.423627 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-d867cf67d-79hj9" podUID="53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b" containerName="barbican-api-log" containerID="cri-o://d1f4bd6b3bf749bc6483bd500388bb3aae9ca7cd5e3947067e454bed02ca9f94" gracePeriod=30 Jan 22 14:04:43 crc kubenswrapper[4743]: I0122 14:04:43.423731 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-d867cf67d-79hj9" podUID="53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b" containerName="barbican-api" containerID="cri-o://8c2c3d5a83b5abe37a3429496fdd0fec5740acca2b0579cb4676e495ab480596" gracePeriod=30 Jan 22 14:04:43 crc kubenswrapper[4743]: I0122 14:04:43.449147 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-srjxw" podStartSLOduration=5.661394744 podStartE2EDuration="52.449125804s" podCreationTimestamp="2026-01-22 14:03:51 +0000 UTC" firstStartedPulling="2026-01-22 14:03:54.911619069 +0000 UTC m=+1071.466662232" lastFinishedPulling="2026-01-22 14:04:41.699350129 +0000 UTC m=+1118.254393292" observedRunningTime="2026-01-22 14:04:43.439337871 +0000 UTC m=+1119.994381044" watchObservedRunningTime="2026-01-22 14:04:43.449125804 +0000 UTC m=+1120.004168967" Jan 22 14:04:44 crc kubenswrapper[4743]: I0122 14:04:44.438870 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-8448f7b79-pndf8" event={"ID":"f254cb75-db18-488e-886f-544f0b8a8516","Type":"ContainerStarted","Data":"88a312454bb3986e1c41ea82885c1b987e00259aafa29f359a3f459ecbe88396"} Jan 22 14:04:44 crc kubenswrapper[4743]: I0122 14:04:44.442027 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-8448f7b79-pndf8" event={"ID":"f254cb75-db18-488e-886f-544f0b8a8516","Type":"ContainerStarted","Data":"ce3ba08b7d1d286fdbed9aa12d8ab5f3910e7b3f4d467d4b4ccddbb48bf27703"} Jan 22 14:04:44 crc kubenswrapper[4743]: I0122 14:04:44.445396 4743 generic.go:334] "Generic (PLEG): container finished" podID="53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b" containerID="8c2c3d5a83b5abe37a3429496fdd0fec5740acca2b0579cb4676e495ab480596" exitCode=0 Jan 22 14:04:44 crc kubenswrapper[4743]: I0122 14:04:44.445423 4743 generic.go:334] "Generic (PLEG): container finished" podID="53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b" containerID="d1f4bd6b3bf749bc6483bd500388bb3aae9ca7cd5e3947067e454bed02ca9f94" exitCode=143 Jan 22 14:04:44 crc kubenswrapper[4743]: I0122 14:04:44.445460 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d867cf67d-79hj9" event={"ID":"53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b","Type":"ContainerDied","Data":"8c2c3d5a83b5abe37a3429496fdd0fec5740acca2b0579cb4676e495ab480596"} Jan 22 14:04:44 crc kubenswrapper[4743]: I0122 14:04:44.445481 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d867cf67d-79hj9" event={"ID":"53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b","Type":"ContainerDied","Data":"d1f4bd6b3bf749bc6483bd500388bb3aae9ca7cd5e3947067e454bed02ca9f94"} Jan 22 14:04:44 crc kubenswrapper[4743]: I0122 14:04:44.449123 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6c88b6769d-nzzc6" 
event={"ID":"a84fcd7a-0eac-4d23-832e-e632bd4f971f","Type":"ContainerStarted","Data":"c51e77b48285f2911199917e4d9d37d6995118c4a5d57d919a3d7c1a468ab583"} Jan 22 14:04:44 crc kubenswrapper[4743]: I0122 14:04:44.449175 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6c88b6769d-nzzc6" event={"ID":"a84fcd7a-0eac-4d23-832e-e632bd4f971f","Type":"ContainerStarted","Data":"572622d75e7b240372089b8093e42dc681426e51483791c4820f621ff72d4b7e"} Jan 22 14:04:44 crc kubenswrapper[4743]: I0122 14:04:44.459396 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-8448f7b79-pndf8" podStartSLOduration=4.226533908 podStartE2EDuration="7.459379522s" podCreationTimestamp="2026-01-22 14:04:37 +0000 UTC" firstStartedPulling="2026-01-22 14:04:39.928986522 +0000 UTC m=+1116.484029685" lastFinishedPulling="2026-01-22 14:04:43.161832136 +0000 UTC m=+1119.716875299" observedRunningTime="2026-01-22 14:04:44.458070378 +0000 UTC m=+1121.013113541" watchObservedRunningTime="2026-01-22 14:04:44.459379522 +0000 UTC m=+1121.014422685" Jan 22 14:04:44 crc kubenswrapper[4743]: I0122 14:04:44.462905 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-665544959-z46xh" event={"ID":"d95db954-8e59-44ac-ae17-788b0fbcb177","Type":"ContainerStarted","Data":"0a29bf0c08cff52754f4d7db81bd5cd48b9e8bc58a0e2f597b17c43d9fac0d1a"} Jan 22 14:04:44 crc kubenswrapper[4743]: I0122 14:04:44.462940 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-665544959-z46xh" event={"ID":"d95db954-8e59-44ac-ae17-788b0fbcb177","Type":"ContainerStarted","Data":"a7ce03f23852e506dc7e45e8492edf2520c808025e61af270d4d1ef7ca5dd529"} Jan 22 14:04:44 crc kubenswrapper[4743]: I0122 14:04:44.471706 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-64fcd75458-9rzfr" event={"ID":"c4db7649-d1b0-47c2-b5e4-34a552ccee79","Type":"ContainerStarted","Data":"653fd5a9f249336c8024944f108c47a8f4cc3aa55f73b4a5d4b4b70883c0dc74"} Jan 22 14:04:44 crc kubenswrapper[4743]: I0122 14:04:44.471768 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-64fcd75458-9rzfr" event={"ID":"c4db7649-d1b0-47c2-b5e4-34a552ccee79","Type":"ContainerStarted","Data":"538edf970a86301ca993b4125ba22c0403849dadd51dc3efe3fa2c508fc56f22"} Jan 22 14:04:44 crc kubenswrapper[4743]: I0122 14:04:44.472637 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-64fcd75458-9rzfr" Jan 22 14:04:44 crc kubenswrapper[4743]: I0122 14:04:44.472664 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-64fcd75458-9rzfr" Jan 22 14:04:44 crc kubenswrapper[4743]: I0122 14:04:44.490291 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5774494bd8-6dt7x" event={"ID":"ec33de7c-5eab-46d0-a702-af5fbd2ebe50","Type":"ContainerStarted","Data":"e2bf69741d6eba79e58f8d45e7263a4a86646748b9354326504711e241a2a596"} Jan 22 14:04:44 crc kubenswrapper[4743]: I0122 14:04:44.490655 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5774494bd8-6dt7x" event={"ID":"ec33de7c-5eab-46d0-a702-af5fbd2ebe50","Type":"ContainerStarted","Data":"40a1b5bf70330c8a1ed129bafb0e99705e6da790c0a3711230914b5e5b2cad28"} Jan 22 14:04:44 crc kubenswrapper[4743]: I0122 14:04:44.495963 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/barbican-keystone-listener-6c88b6769d-nzzc6" podStartSLOduration=3.840876794 podStartE2EDuration="7.495943115s" podCreationTimestamp="2026-01-22 14:04:37 +0000 UTC" firstStartedPulling="2026-01-22 14:04:39.501024337 +0000 UTC m=+1116.056067500" lastFinishedPulling="2026-01-22 14:04:43.156090658 +0000 UTC m=+1119.711133821" observedRunningTime="2026-01-22 14:04:44.480731712 +0000 UTC m=+1121.035774875" watchObservedRunningTime="2026-01-22 14:04:44.495943115 +0000 UTC m=+1121.050986278" Jan 22 14:04:44 crc kubenswrapper[4743]: I0122 14:04:44.539169 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-665544959-z46xh"] Jan 22 14:04:44 crc kubenswrapper[4743]: I0122 14:04:44.562718 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-665544959-z46xh" podStartSLOduration=3.469313803 podStartE2EDuration="7.562704646s" podCreationTimestamp="2026-01-22 14:04:37 +0000 UTC" firstStartedPulling="2026-01-22 14:04:39.060837097 +0000 UTC m=+1115.615880250" lastFinishedPulling="2026-01-22 14:04:43.15422793 +0000 UTC m=+1119.709271093" observedRunningTime="2026-01-22 14:04:44.51284715 +0000 UTC m=+1121.067890313" watchObservedRunningTime="2026-01-22 14:04:44.562704646 +0000 UTC m=+1121.117747809" Jan 22 14:04:44 crc kubenswrapper[4743]: I0122 14:04:44.563059 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-5774494bd8-6dt7x"] Jan 22 14:04:44 crc kubenswrapper[4743]: I0122 14:04:44.595586 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-64fcd75458-9rzfr" podStartSLOduration=3.595568643 podStartE2EDuration="3.595568643s" podCreationTimestamp="2026-01-22 14:04:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:04:44.54539413 +0000 UTC m=+1121.100437293" watchObservedRunningTime="2026-01-22 14:04:44.595568643 +0000 UTC m=+1121.150611806" Jan 22 14:04:44 crc kubenswrapper[4743]: I0122 14:04:44.639379 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-5774494bd8-6dt7x" podStartSLOduration=3.647113897 podStartE2EDuration="7.639360712s" podCreationTimestamp="2026-01-22 14:04:37 +0000 UTC" firstStartedPulling="2026-01-22 14:04:39.159806229 +0000 UTC m=+1115.714849402" lastFinishedPulling="2026-01-22 14:04:43.152053054 +0000 UTC m=+1119.707096217" observedRunningTime="2026-01-22 14:04:44.563739523 +0000 UTC m=+1121.118782696" watchObservedRunningTime="2026-01-22 14:04:44.639360712 +0000 UTC m=+1121.194403875" Jan 22 14:04:45 crc kubenswrapper[4743]: I0122 14:04:45.924591 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 22 14:04:46 crc kubenswrapper[4743]: I0122 14:04:46.509143 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-665544959-z46xh" podUID="d95db954-8e59-44ac-ae17-788b0fbcb177" containerName="barbican-worker-log" containerID="cri-o://a7ce03f23852e506dc7e45e8492edf2520c808025e61af270d4d1ef7ca5dd529" gracePeriod=30 Jan 22 14:04:46 crc kubenswrapper[4743]: I0122 14:04:46.509229 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-665544959-z46xh" podUID="d95db954-8e59-44ac-ae17-788b0fbcb177" containerName="barbican-worker" 
containerID="cri-o://0a29bf0c08cff52754f4d7db81bd5cd48b9e8bc58a0e2f597b17c43d9fac0d1a" gracePeriod=30 Jan 22 14:04:46 crc kubenswrapper[4743]: I0122 14:04:46.509647 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-5774494bd8-6dt7x" podUID="ec33de7c-5eab-46d0-a702-af5fbd2ebe50" containerName="barbican-keystone-listener-log" containerID="cri-o://40a1b5bf70330c8a1ed129bafb0e99705e6da790c0a3711230914b5e5b2cad28" gracePeriod=30 Jan 22 14:04:46 crc kubenswrapper[4743]: I0122 14:04:46.509735 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-5774494bd8-6dt7x" podUID="ec33de7c-5eab-46d0-a702-af5fbd2ebe50" containerName="barbican-keystone-listener" containerID="cri-o://e2bf69741d6eba79e58f8d45e7263a4a86646748b9354326504711e241a2a596" gracePeriod=30 Jan 22 14:04:47 crc kubenswrapper[4743]: I0122 14:04:47.228372 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 22 14:04:47 crc kubenswrapper[4743]: I0122 14:04:47.228425 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 22 14:04:47 crc kubenswrapper[4743]: I0122 14:04:47.256436 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 22 14:04:47 crc kubenswrapper[4743]: I0122 14:04:47.283758 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 22 14:04:47 crc kubenswrapper[4743]: I0122 14:04:47.522993 4743 generic.go:334] "Generic (PLEG): container finished" podID="d95db954-8e59-44ac-ae17-788b0fbcb177" containerID="a7ce03f23852e506dc7e45e8492edf2520c808025e61af270d4d1ef7ca5dd529" exitCode=143 Jan 22 14:04:47 crc kubenswrapper[4743]: I0122 14:04:47.523511 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-665544959-z46xh" event={"ID":"d95db954-8e59-44ac-ae17-788b0fbcb177","Type":"ContainerDied","Data":"a7ce03f23852e506dc7e45e8492edf2520c808025e61af270d4d1ef7ca5dd529"} Jan 22 14:04:47 crc kubenswrapper[4743]: I0122 14:04:47.530888 4743 generic.go:334] "Generic (PLEG): container finished" podID="ec33de7c-5eab-46d0-a702-af5fbd2ebe50" containerID="e2bf69741d6eba79e58f8d45e7263a4a86646748b9354326504711e241a2a596" exitCode=0 Jan 22 14:04:47 crc kubenswrapper[4743]: I0122 14:04:47.530931 4743 generic.go:334] "Generic (PLEG): container finished" podID="ec33de7c-5eab-46d0-a702-af5fbd2ebe50" containerID="40a1b5bf70330c8a1ed129bafb0e99705e6da790c0a3711230914b5e5b2cad28" exitCode=143 Jan 22 14:04:47 crc kubenswrapper[4743]: I0122 14:04:47.530985 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5774494bd8-6dt7x" event={"ID":"ec33de7c-5eab-46d0-a702-af5fbd2ebe50","Type":"ContainerDied","Data":"e2bf69741d6eba79e58f8d45e7263a4a86646748b9354326504711e241a2a596"} Jan 22 14:04:47 crc kubenswrapper[4743]: I0122 14:04:47.531595 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5774494bd8-6dt7x" event={"ID":"ec33de7c-5eab-46d0-a702-af5fbd2ebe50","Type":"ContainerDied","Data":"40a1b5bf70330c8a1ed129bafb0e99705e6da790c0a3711230914b5e5b2cad28"} Jan 22 14:04:47 crc kubenswrapper[4743]: I0122 14:04:47.531621 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 22 14:04:47 
crc kubenswrapper[4743]: I0122 14:04:47.531634 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 22 14:04:48 crc kubenswrapper[4743]: I0122 14:04:48.054517 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 22 14:04:48 crc kubenswrapper[4743]: I0122 14:04:48.325003 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-85ff748b95-8kx6w" Jan 22 14:04:48 crc kubenswrapper[4743]: I0122 14:04:48.398554 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-kvm68"] Jan 22 14:04:48 crc kubenswrapper[4743]: I0122 14:04:48.399022 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-kvm68" podUID="79b9f413-5078-4cb7-9515-a4d1b2d4bcbe" containerName="dnsmasq-dns" containerID="cri-o://75992c52a7511e1d2053dba8c589a83081fd8a269b9b752a3790b2fda37df1b0" gracePeriod=10 Jan 22 14:04:48 crc kubenswrapper[4743]: I0122 14:04:48.578673 4743 generic.go:334] "Generic (PLEG): container finished" podID="d95db954-8e59-44ac-ae17-788b0fbcb177" containerID="0a29bf0c08cff52754f4d7db81bd5cd48b9e8bc58a0e2f597b17c43d9fac0d1a" exitCode=0 Jan 22 14:04:48 crc kubenswrapper[4743]: I0122 14:04:48.579206 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-665544959-z46xh" event={"ID":"d95db954-8e59-44ac-ae17-788b0fbcb177","Type":"ContainerDied","Data":"0a29bf0c08cff52754f4d7db81bd5cd48b9e8bc58a0e2f597b17c43d9fac0d1a"} Jan 22 14:04:48 crc kubenswrapper[4743]: I0122 14:04:48.598317 4743 generic.go:334] "Generic (PLEG): container finished" podID="79b9f413-5078-4cb7-9515-a4d1b2d4bcbe" containerID="75992c52a7511e1d2053dba8c589a83081fd8a269b9b752a3790b2fda37df1b0" exitCode=0 Jan 22 14:04:48 crc kubenswrapper[4743]: I0122 14:04:48.598385 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-kvm68" event={"ID":"79b9f413-5078-4cb7-9515-a4d1b2d4bcbe","Type":"ContainerDied","Data":"75992c52a7511e1d2053dba8c589a83081fd8a269b9b752a3790b2fda37df1b0"} Jan 22 14:04:49 crc kubenswrapper[4743]: I0122 14:04:49.561114 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-kvm68" podUID="79b9f413-5078-4cb7-9515-a4d1b2d4bcbe" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.111:5353: connect: connection refused" Jan 22 14:04:50 crc kubenswrapper[4743]: I0122 14:04:50.333227 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 22 14:04:50 crc kubenswrapper[4743]: I0122 14:04:50.333619 4743 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 22 14:04:50 crc kubenswrapper[4743]: I0122 14:04:50.346234 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-999bfcdc8-ldzdp" podUID="dff52751-78f1-4c39-aa95-5d74a246151e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.145:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.145:8443: connect: connection refused" Jan 22 14:04:50 crc kubenswrapper[4743]: I0122 14:04:50.352727 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 22 14:04:50 crc kubenswrapper[4743]: I0122 14:04:50.446589 4743 prober.go:107] "Probe failed" probeType="Startup" 
pod="openstack/horizon-b7fb54dc6-5q9jf" podUID="e452af10-fc11-4854-bf38-8a90856331d3" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Jan 22 14:04:51 crc kubenswrapper[4743]: I0122 14:04:51.044113 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-59d596989d-xmgwp" Jan 22 14:04:51 crc kubenswrapper[4743]: I0122 14:04:51.115377 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-59d596989d-xmgwp" Jan 22 14:04:52 crc kubenswrapper[4743]: I0122 14:04:52.952375 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-d867cf67d-79hj9" Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.099565 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b-config-data-custom\") pod \"53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b\" (UID: \"53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b\") " Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.099630 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b-config-data\") pod \"53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b\" (UID: \"53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b\") " Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.099673 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b-combined-ca-bundle\") pod \"53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b\" (UID: \"53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b\") " Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.099756 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpngr\" (UniqueName: \"kubernetes.io/projected/53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b-kube-api-access-xpngr\") pod \"53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b\" (UID: \"53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b\") " Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.099818 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b-logs\") pod \"53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b\" (UID: \"53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b\") " Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.100609 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b-logs" (OuterVolumeSpecName: "logs") pod "53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b" (UID: "53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.106407 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b" (UID: "53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.109630 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b-kube-api-access-xpngr" (OuterVolumeSpecName: "kube-api-access-xpngr") pod "53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b" (UID: "53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b"). InnerVolumeSpecName "kube-api-access-xpngr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.155464 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b" (UID: "53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.169917 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b-config-data" (OuterVolumeSpecName: "config-data") pod "53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b" (UID: "53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:53 crc kubenswrapper[4743]: E0122 14:04:53.191237 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="316dc631-a7ed-49db-9dad-305d246bf91a" Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.206928 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.206960 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xpngr\" (UniqueName: \"kubernetes.io/projected/53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b-kube-api-access-xpngr\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.206973 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b-logs\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.206984 4743 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.206994 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b-config-data\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.339867 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-5774494bd8-6dt7x" Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.411168 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec33de7c-5eab-46d0-a702-af5fbd2ebe50-logs\") pod \"ec33de7c-5eab-46d0-a702-af5fbd2ebe50\" (UID: \"ec33de7c-5eab-46d0-a702-af5fbd2ebe50\") " Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.411219 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ec33de7c-5eab-46d0-a702-af5fbd2ebe50-config-data-custom\") pod \"ec33de7c-5eab-46d0-a702-af5fbd2ebe50\" (UID: \"ec33de7c-5eab-46d0-a702-af5fbd2ebe50\") " Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.411243 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec33de7c-5eab-46d0-a702-af5fbd2ebe50-config-data\") pod \"ec33de7c-5eab-46d0-a702-af5fbd2ebe50\" (UID: \"ec33de7c-5eab-46d0-a702-af5fbd2ebe50\") " Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.411294 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9wgrm\" (UniqueName: \"kubernetes.io/projected/ec33de7c-5eab-46d0-a702-af5fbd2ebe50-kube-api-access-9wgrm\") pod \"ec33de7c-5eab-46d0-a702-af5fbd2ebe50\" (UID: \"ec33de7c-5eab-46d0-a702-af5fbd2ebe50\") " Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.411314 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec33de7c-5eab-46d0-a702-af5fbd2ebe50-combined-ca-bundle\") pod \"ec33de7c-5eab-46d0-a702-af5fbd2ebe50\" (UID: \"ec33de7c-5eab-46d0-a702-af5fbd2ebe50\") " Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.412722 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec33de7c-5eab-46d0-a702-af5fbd2ebe50-logs" (OuterVolumeSpecName: "logs") pod "ec33de7c-5eab-46d0-a702-af5fbd2ebe50" (UID: "ec33de7c-5eab-46d0-a702-af5fbd2ebe50"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.416901 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec33de7c-5eab-46d0-a702-af5fbd2ebe50-kube-api-access-9wgrm" (OuterVolumeSpecName: "kube-api-access-9wgrm") pod "ec33de7c-5eab-46d0-a702-af5fbd2ebe50" (UID: "ec33de7c-5eab-46d0-a702-af5fbd2ebe50"). InnerVolumeSpecName "kube-api-access-9wgrm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.416974 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec33de7c-5eab-46d0-a702-af5fbd2ebe50-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ec33de7c-5eab-46d0-a702-af5fbd2ebe50" (UID: "ec33de7c-5eab-46d0-a702-af5fbd2ebe50"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.479256 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec33de7c-5eab-46d0-a702-af5fbd2ebe50-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ec33de7c-5eab-46d0-a702-af5fbd2ebe50" (UID: "ec33de7c-5eab-46d0-a702-af5fbd2ebe50"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.485206 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-d867cf67d-79hj9" podUID="53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.161:9311/healthcheck\": dial tcp 10.217.0.161:9311: i/o timeout (Client.Timeout exceeded while awaiting headers)" Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.485232 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-d867cf67d-79hj9" podUID="53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.161:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.509156 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec33de7c-5eab-46d0-a702-af5fbd2ebe50-config-data" (OuterVolumeSpecName: "config-data") pod "ec33de7c-5eab-46d0-a702-af5fbd2ebe50" (UID: "ec33de7c-5eab-46d0-a702-af5fbd2ebe50"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.513526 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9wgrm\" (UniqueName: \"kubernetes.io/projected/ec33de7c-5eab-46d0-a702-af5fbd2ebe50-kube-api-access-9wgrm\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.513562 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec33de7c-5eab-46d0-a702-af5fbd2ebe50-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.513577 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec33de7c-5eab-46d0-a702-af5fbd2ebe50-logs\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.513587 4743 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ec33de7c-5eab-46d0-a702-af5fbd2ebe50-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.513597 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec33de7c-5eab-46d0-a702-af5fbd2ebe50-config-data\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.539693 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-665544959-z46xh" Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.602737 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-kvm68" Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.615697 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/79b9f413-5078-4cb7-9515-a4d1b2d4bcbe-ovsdbserver-sb\") pod \"79b9f413-5078-4cb7-9515-a4d1b2d4bcbe\" (UID: \"79b9f413-5078-4cb7-9515-a4d1b2d4bcbe\") " Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.615758 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79b9f413-5078-4cb7-9515-a4d1b2d4bcbe-config\") pod \"79b9f413-5078-4cb7-9515-a4d1b2d4bcbe\" (UID: \"79b9f413-5078-4cb7-9515-a4d1b2d4bcbe\") " Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.615813 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lg5lf\" (UniqueName: \"kubernetes.io/projected/d95db954-8e59-44ac-ae17-788b0fbcb177-kube-api-access-lg5lf\") pod \"d95db954-8e59-44ac-ae17-788b0fbcb177\" (UID: \"d95db954-8e59-44ac-ae17-788b0fbcb177\") " Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.615852 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/79b9f413-5078-4cb7-9515-a4d1b2d4bcbe-ovsdbserver-nb\") pod \"79b9f413-5078-4cb7-9515-a4d1b2d4bcbe\" (UID: \"79b9f413-5078-4cb7-9515-a4d1b2d4bcbe\") " Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.615932 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmzsm\" (UniqueName: \"kubernetes.io/projected/79b9f413-5078-4cb7-9515-a4d1b2d4bcbe-kube-api-access-zmzsm\") pod \"79b9f413-5078-4cb7-9515-a4d1b2d4bcbe\" (UID: \"79b9f413-5078-4cb7-9515-a4d1b2d4bcbe\") " Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.615986 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d95db954-8e59-44ac-ae17-788b0fbcb177-logs\") pod \"d95db954-8e59-44ac-ae17-788b0fbcb177\" (UID: \"d95db954-8e59-44ac-ae17-788b0fbcb177\") " Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.616022 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d95db954-8e59-44ac-ae17-788b0fbcb177-config-data-custom\") pod \"d95db954-8e59-44ac-ae17-788b0fbcb177\" (UID: \"d95db954-8e59-44ac-ae17-788b0fbcb177\") " Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.616059 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d95db954-8e59-44ac-ae17-788b0fbcb177-combined-ca-bundle\") pod \"d95db954-8e59-44ac-ae17-788b0fbcb177\" (UID: \"d95db954-8e59-44ac-ae17-788b0fbcb177\") " Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.616086 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d95db954-8e59-44ac-ae17-788b0fbcb177-config-data\") pod \"d95db954-8e59-44ac-ae17-788b0fbcb177\" (UID: \"d95db954-8e59-44ac-ae17-788b0fbcb177\") " Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.616109 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/79b9f413-5078-4cb7-9515-a4d1b2d4bcbe-dns-svc\") pod \"79b9f413-5078-4cb7-9515-a4d1b2d4bcbe\" (UID: 
\"79b9f413-5078-4cb7-9515-a4d1b2d4bcbe\") " Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.618654 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d95db954-8e59-44ac-ae17-788b0fbcb177-logs" (OuterVolumeSpecName: "logs") pod "d95db954-8e59-44ac-ae17-788b0fbcb177" (UID: "d95db954-8e59-44ac-ae17-788b0fbcb177"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.624420 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79b9f413-5078-4cb7-9515-a4d1b2d4bcbe-kube-api-access-zmzsm" (OuterVolumeSpecName: "kube-api-access-zmzsm") pod "79b9f413-5078-4cb7-9515-a4d1b2d4bcbe" (UID: "79b9f413-5078-4cb7-9515-a4d1b2d4bcbe"). InnerVolumeSpecName "kube-api-access-zmzsm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.633664 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d95db954-8e59-44ac-ae17-788b0fbcb177-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d95db954-8e59-44ac-ae17-788b0fbcb177" (UID: "d95db954-8e59-44ac-ae17-788b0fbcb177"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.638258 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d95db954-8e59-44ac-ae17-788b0fbcb177-kube-api-access-lg5lf" (OuterVolumeSpecName: "kube-api-access-lg5lf") pod "d95db954-8e59-44ac-ae17-788b0fbcb177" (UID: "d95db954-8e59-44ac-ae17-788b0fbcb177"). InnerVolumeSpecName "kube-api-access-lg5lf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.657849 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-srjxw" event={"ID":"eb22345c-594c-46a3-b362-e34baa8f271c","Type":"ContainerDied","Data":"795289613c94db2b8fe8ac8e26657900a12882529e4bcc81dddcdcda9648ea1d"} Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.658156 4743 generic.go:334] "Generic (PLEG): container finished" podID="eb22345c-594c-46a3-b362-e34baa8f271c" containerID="795289613c94db2b8fe8ac8e26657900a12882529e4bcc81dddcdcda9648ea1d" exitCode=0 Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.666018 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d867cf67d-79hj9" event={"ID":"53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b","Type":"ContainerDied","Data":"9b2bddac3919047ed76502731a88ad89c436782c82e311858542d6aa3ea8c7a9"} Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.666067 4743 scope.go:117] "RemoveContainer" containerID="8c2c3d5a83b5abe37a3429496fdd0fec5740acca2b0579cb4676e495ab480596" Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.666119 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-d867cf67d-79hj9" Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.697828 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-665544959-z46xh" Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.697897 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-665544959-z46xh" event={"ID":"d95db954-8e59-44ac-ae17-788b0fbcb177","Type":"ContainerDied","Data":"5f7a66d3543092b4d04f2117d97334090d6c4014db53c439cd7a86a17eab5075"} Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.711043 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d95db954-8e59-44ac-ae17-788b0fbcb177-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d95db954-8e59-44ac-ae17-788b0fbcb177" (UID: "d95db954-8e59-44ac-ae17-788b0fbcb177"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.715926 4743 scope.go:117] "RemoveContainer" containerID="d1f4bd6b3bf749bc6483bd500388bb3aae9ca7cd5e3947067e454bed02ca9f94" Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.718527 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmzsm\" (UniqueName: \"kubernetes.io/projected/79b9f413-5078-4cb7-9515-a4d1b2d4bcbe-kube-api-access-zmzsm\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.718557 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d95db954-8e59-44ac-ae17-788b0fbcb177-logs\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.718568 4743 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d95db954-8e59-44ac-ae17-788b0fbcb177-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.718578 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d95db954-8e59-44ac-ae17-788b0fbcb177-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.718586 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lg5lf\" (UniqueName: \"kubernetes.io/projected/d95db954-8e59-44ac-ae17-788b0fbcb177-kube-api-access-lg5lf\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.740210 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79b9f413-5078-4cb7-9515-a4d1b2d4bcbe-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "79b9f413-5078-4cb7-9515-a4d1b2d4bcbe" (UID: "79b9f413-5078-4cb7-9515-a4d1b2d4bcbe"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.741985 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-5774494bd8-6dt7x" Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.742930 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5774494bd8-6dt7x" event={"ID":"ec33de7c-5eab-46d0-a702-af5fbd2ebe50","Type":"ContainerDied","Data":"f3053221a5c4225b8ab8bbcfe202970a3280bafc0b4d9adbc95c42398e2680ad"} Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.756744 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79b9f413-5078-4cb7-9515-a4d1b2d4bcbe-config" (OuterVolumeSpecName: "config") pod "79b9f413-5078-4cb7-9515-a4d1b2d4bcbe" (UID: "79b9f413-5078-4cb7-9515-a4d1b2d4bcbe"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.761035 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-kvm68" Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.771249 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79b9f413-5078-4cb7-9515-a4d1b2d4bcbe-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "79b9f413-5078-4cb7-9515-a4d1b2d4bcbe" (UID: "79b9f413-5078-4cb7-9515-a4d1b2d4bcbe"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.773054 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d95db954-8e59-44ac-ae17-788b0fbcb177-config-data" (OuterVolumeSpecName: "config-data") pod "d95db954-8e59-44ac-ae17-788b0fbcb177" (UID: "d95db954-8e59-44ac-ae17-788b0fbcb177"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.777806 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="316dc631-a7ed-49db-9dad-305d246bf91a" containerName="ceilometer-notification-agent" containerID="cri-o://ba8ff1d54e4f3c6866568508cdba261acf4b40803abbc347435a8e242304a667" gracePeriod=30 Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.778033 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="316dc631-a7ed-49db-9dad-305d246bf91a" containerName="sg-core" containerID="cri-o://47854176fb6a300bbd538181f0a9c90e70c288f01bad265b81c78fa697ed1dfe" gracePeriod=30 Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.778133 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="316dc631-a7ed-49db-9dad-305d246bf91a" containerName="proxy-httpd" containerID="cri-o://49c9d9f85656145073dafeb1dc763ae12471a846d64af47c9481ead92ab0eabc" gracePeriod=30 Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.790483 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79b9f413-5078-4cb7-9515-a4d1b2d4bcbe-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "79b9f413-5078-4cb7-9515-a4d1b2d4bcbe" (UID: "79b9f413-5078-4cb7-9515-a4d1b2d4bcbe"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.819520 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d95db954-8e59-44ac-ae17-788b0fbcb177-config-data\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.819552 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/79b9f413-5078-4cb7-9515-a4d1b2d4bcbe-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.819562 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/79b9f413-5078-4cb7-9515-a4d1b2d4bcbe-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.819572 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79b9f413-5078-4cb7-9515-a4d1b2d4bcbe-config\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.819581 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/79b9f413-5078-4cb7-9515-a4d1b2d4bcbe-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.951843 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-d867cf67d-79hj9"] Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.951883 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-kvm68" event={"ID":"79b9f413-5078-4cb7-9515-a4d1b2d4bcbe","Type":"ContainerDied","Data":"0a104f73fd0489f44f500be7d4981a8eb5712541760c1262764ed6d49887d0ca"} Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.951948 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-64fcd75458-9rzfr" Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.951961 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"316dc631-a7ed-49db-9dad-305d246bf91a","Type":"ContainerStarted","Data":"49c9d9f85656145073dafeb1dc763ae12471a846d64af47c9481ead92ab0eabc"} Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.951974 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-d867cf67d-79hj9"] Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.967244 4743 scope.go:117] "RemoveContainer" containerID="0a29bf0c08cff52754f4d7db81bd5cd48b9e8bc58a0e2f597b17c43d9fac0d1a" Jan 22 14:04:53 crc kubenswrapper[4743]: I0122 14:04:53.991206 4743 scope.go:117] "RemoveContainer" containerID="a7ce03f23852e506dc7e45e8492edf2520c808025e61af270d4d1ef7ca5dd529" Jan 22 14:04:54 crc kubenswrapper[4743]: I0122 14:04:54.007318 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-64fcd75458-9rzfr" Jan 22 14:04:54 crc kubenswrapper[4743]: I0122 14:04:54.081906 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-665544959-z46xh"] Jan 22 14:04:54 crc kubenswrapper[4743]: I0122 14:04:54.089258 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-665544959-z46xh"] Jan 22 14:04:54 crc kubenswrapper[4743]: I0122 14:04:54.093133 4743 scope.go:117] "RemoveContainer" containerID="e2bf69741d6eba79e58f8d45e7263a4a86646748b9354326504711e241a2a596" Jan 22 14:04:54 crc 
kubenswrapper[4743]: I0122 14:04:54.098399 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-59d596989d-xmgwp"] Jan 22 14:04:54 crc kubenswrapper[4743]: I0122 14:04:54.098663 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-59d596989d-xmgwp" podUID="1e9f28d1-fbb1-4c58-9a9b-3439b902505a" containerName="barbican-api-log" containerID="cri-o://1914d432dbf22013451fd6d692ceaf77a94a8b00223aa1c44f08e68d9446457e" gracePeriod=30 Jan 22 14:04:54 crc kubenswrapper[4743]: I0122 14:04:54.099222 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-59d596989d-xmgwp" podUID="1e9f28d1-fbb1-4c58-9a9b-3439b902505a" containerName="barbican-api" containerID="cri-o://2306c268bbd8a02a9ba0e62cd3ea6575a384efe9651061abb1bfb85a27b2c6b6" gracePeriod=30 Jan 22 14:04:54 crc kubenswrapper[4743]: I0122 14:04:54.115224 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-59d596989d-xmgwp" podUID="1e9f28d1-fbb1-4c58-9a9b-3439b902505a" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.164:9311/healthcheck\": EOF" Jan 22 14:04:54 crc kubenswrapper[4743]: I0122 14:04:54.119228 4743 scope.go:117] "RemoveContainer" containerID="40a1b5bf70330c8a1ed129bafb0e99705e6da790c0a3711230914b5e5b2cad28" Jan 22 14:04:54 crc kubenswrapper[4743]: I0122 14:04:54.133022 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-kvm68"] Jan 22 14:04:54 crc kubenswrapper[4743]: I0122 14:04:54.140883 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-kvm68"] Jan 22 14:04:54 crc kubenswrapper[4743]: I0122 14:04:54.166441 4743 scope.go:117] "RemoveContainer" containerID="75992c52a7511e1d2053dba8c589a83081fd8a269b9b752a3790b2fda37df1b0" Jan 22 14:04:54 crc kubenswrapper[4743]: I0122 14:04:54.202944 4743 scope.go:117] "RemoveContainer" containerID="f1deba13d96d7c69a7d9bc6b56ac1f77743f928878db2848b1fabff6044bb634" Jan 22 14:04:54 crc kubenswrapper[4743]: I0122 14:04:54.779614 4743 generic.go:334] "Generic (PLEG): container finished" podID="1e9f28d1-fbb1-4c58-9a9b-3439b902505a" containerID="1914d432dbf22013451fd6d692ceaf77a94a8b00223aa1c44f08e68d9446457e" exitCode=143 Jan 22 14:04:54 crc kubenswrapper[4743]: I0122 14:04:54.779723 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-59d596989d-xmgwp" event={"ID":"1e9f28d1-fbb1-4c58-9a9b-3439b902505a","Type":"ContainerDied","Data":"1914d432dbf22013451fd6d692ceaf77a94a8b00223aa1c44f08e68d9446457e"} Jan 22 14:04:54 crc kubenswrapper[4743]: I0122 14:04:54.783717 4743 generic.go:334] "Generic (PLEG): container finished" podID="316dc631-a7ed-49db-9dad-305d246bf91a" containerID="49c9d9f85656145073dafeb1dc763ae12471a846d64af47c9481ead92ab0eabc" exitCode=0 Jan 22 14:04:54 crc kubenswrapper[4743]: I0122 14:04:54.783759 4743 generic.go:334] "Generic (PLEG): container finished" podID="316dc631-a7ed-49db-9dad-305d246bf91a" containerID="47854176fb6a300bbd538181f0a9c90e70c288f01bad265b81c78fa697ed1dfe" exitCode=2 Jan 22 14:04:54 crc kubenswrapper[4743]: I0122 14:04:54.783797 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"316dc631-a7ed-49db-9dad-305d246bf91a","Type":"ContainerDied","Data":"49c9d9f85656145073dafeb1dc763ae12471a846d64af47c9481ead92ab0eabc"} Jan 22 14:04:54 crc kubenswrapper[4743]: I0122 14:04:54.783830 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"316dc631-a7ed-49db-9dad-305d246bf91a","Type":"ContainerDied","Data":"47854176fb6a300bbd538181f0a9c90e70c288f01bad265b81c78fa697ed1dfe"} Jan 22 14:04:55 crc kubenswrapper[4743]: I0122 14:04:55.161829 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-srjxw" Jan 22 14:04:55 crc kubenswrapper[4743]: I0122 14:04:55.252102 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb22345c-594c-46a3-b362-e34baa8f271c-scripts\") pod \"eb22345c-594c-46a3-b362-e34baa8f271c\" (UID: \"eb22345c-594c-46a3-b362-e34baa8f271c\") " Jan 22 14:04:55 crc kubenswrapper[4743]: I0122 14:04:55.253368 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb22345c-594c-46a3-b362-e34baa8f271c-config-data\") pod \"eb22345c-594c-46a3-b362-e34baa8f271c\" (UID: \"eb22345c-594c-46a3-b362-e34baa8f271c\") " Jan 22 14:04:55 crc kubenswrapper[4743]: I0122 14:04:55.253470 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgm5q\" (UniqueName: \"kubernetes.io/projected/eb22345c-594c-46a3-b362-e34baa8f271c-kube-api-access-kgm5q\") pod \"eb22345c-594c-46a3-b362-e34baa8f271c\" (UID: \"eb22345c-594c-46a3-b362-e34baa8f271c\") " Jan 22 14:04:55 crc kubenswrapper[4743]: I0122 14:04:55.253533 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/eb22345c-594c-46a3-b362-e34baa8f271c-etc-machine-id\") pod \"eb22345c-594c-46a3-b362-e34baa8f271c\" (UID: \"eb22345c-594c-46a3-b362-e34baa8f271c\") " Jan 22 14:04:55 crc kubenswrapper[4743]: I0122 14:04:55.253637 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/eb22345c-594c-46a3-b362-e34baa8f271c-db-sync-config-data\") pod \"eb22345c-594c-46a3-b362-e34baa8f271c\" (UID: \"eb22345c-594c-46a3-b362-e34baa8f271c\") " Jan 22 14:04:55 crc kubenswrapper[4743]: I0122 14:04:55.253678 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb22345c-594c-46a3-b362-e34baa8f271c-combined-ca-bundle\") pod \"eb22345c-594c-46a3-b362-e34baa8f271c\" (UID: \"eb22345c-594c-46a3-b362-e34baa8f271c\") " Jan 22 14:04:55 crc kubenswrapper[4743]: I0122 14:04:55.253691 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb22345c-594c-46a3-b362-e34baa8f271c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "eb22345c-594c-46a3-b362-e34baa8f271c" (UID: "eb22345c-594c-46a3-b362-e34baa8f271c"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 14:04:55 crc kubenswrapper[4743]: I0122 14:04:55.254343 4743 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/eb22345c-594c-46a3-b362-e34baa8f271c-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:55 crc kubenswrapper[4743]: I0122 14:04:55.258210 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb22345c-594c-46a3-b362-e34baa8f271c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "eb22345c-594c-46a3-b362-e34baa8f271c" (UID: "eb22345c-594c-46a3-b362-e34baa8f271c"). 
InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:55 crc kubenswrapper[4743]: I0122 14:04:55.258235 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb22345c-594c-46a3-b362-e34baa8f271c-kube-api-access-kgm5q" (OuterVolumeSpecName: "kube-api-access-kgm5q") pod "eb22345c-594c-46a3-b362-e34baa8f271c" (UID: "eb22345c-594c-46a3-b362-e34baa8f271c"). InnerVolumeSpecName "kube-api-access-kgm5q". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:04:55 crc kubenswrapper[4743]: I0122 14:04:55.258835 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb22345c-594c-46a3-b362-e34baa8f271c-scripts" (OuterVolumeSpecName: "scripts") pod "eb22345c-594c-46a3-b362-e34baa8f271c" (UID: "eb22345c-594c-46a3-b362-e34baa8f271c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:55 crc kubenswrapper[4743]: I0122 14:04:55.300205 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb22345c-594c-46a3-b362-e34baa8f271c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eb22345c-594c-46a3-b362-e34baa8f271c" (UID: "eb22345c-594c-46a3-b362-e34baa8f271c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:55 crc kubenswrapper[4743]: I0122 14:04:55.307339 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb22345c-594c-46a3-b362-e34baa8f271c-config-data" (OuterVolumeSpecName: "config-data") pod "eb22345c-594c-46a3-b362-e34baa8f271c" (UID: "eb22345c-594c-46a3-b362-e34baa8f271c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:55 crc kubenswrapper[4743]: I0122 14:04:55.356471 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb22345c-594c-46a3-b362-e34baa8f271c-scripts\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:55 crc kubenswrapper[4743]: I0122 14:04:55.356525 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb22345c-594c-46a3-b362-e34baa8f271c-config-data\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:55 crc kubenswrapper[4743]: I0122 14:04:55.356540 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kgm5q\" (UniqueName: \"kubernetes.io/projected/eb22345c-594c-46a3-b362-e34baa8f271c-kube-api-access-kgm5q\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:55 crc kubenswrapper[4743]: I0122 14:04:55.356555 4743 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/eb22345c-594c-46a3-b362-e34baa8f271c-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:55 crc kubenswrapper[4743]: I0122 14:04:55.356567 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb22345c-594c-46a3-b362-e34baa8f271c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:55 crc kubenswrapper[4743]: I0122 14:04:55.759415 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b" path="/var/lib/kubelet/pods/53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b/volumes" Jan 22 14:04:55 crc kubenswrapper[4743]: I0122 14:04:55.760193 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79b9f413-5078-4cb7-9515-a4d1b2d4bcbe" path="/var/lib/kubelet/pods/79b9f413-5078-4cb7-9515-a4d1b2d4bcbe/volumes" Jan 22 14:04:55 crc kubenswrapper[4743]: I0122 14:04:55.760803 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d95db954-8e59-44ac-ae17-788b0fbcb177" path="/var/lib/kubelet/pods/d95db954-8e59-44ac-ae17-788b0fbcb177/volumes" Jan 22 14:04:55 crc kubenswrapper[4743]: I0122 14:04:55.796083 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-srjxw" event={"ID":"eb22345c-594c-46a3-b362-e34baa8f271c","Type":"ContainerDied","Data":"9b88b9a261257482cf463a09386888007c96580a91b2e76a7288c0afe171ff28"} Jan 22 14:04:55 crc kubenswrapper[4743]: I0122 14:04:55.796119 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b88b9a261257482cf463a09386888007c96580a91b2e76a7288c0afe171ff28" Jan 22 14:04:55 crc kubenswrapper[4743]: I0122 14:04:55.796152 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-srjxw" Jan 22 14:04:55 crc kubenswrapper[4743]: I0122 14:04:55.959111 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Jan 22 14:04:55 crc kubenswrapper[4743]: E0122 14:04:55.959672 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b" containerName="barbican-api" Jan 22 14:04:55 crc kubenswrapper[4743]: I0122 14:04:55.959683 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b" containerName="barbican-api" Jan 22 14:04:55 crc kubenswrapper[4743]: E0122 14:04:55.959704 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79b9f413-5078-4cb7-9515-a4d1b2d4bcbe" containerName="init" Jan 22 14:04:55 crc kubenswrapper[4743]: I0122 14:04:55.959709 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="79b9f413-5078-4cb7-9515-a4d1b2d4bcbe" containerName="init" Jan 22 14:04:55 crc kubenswrapper[4743]: E0122 14:04:55.959730 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec33de7c-5eab-46d0-a702-af5fbd2ebe50" containerName="barbican-keystone-listener-log" Jan 22 14:04:55 crc kubenswrapper[4743]: I0122 14:04:55.959736 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec33de7c-5eab-46d0-a702-af5fbd2ebe50" containerName="barbican-keystone-listener-log" Jan 22 14:04:55 crc kubenswrapper[4743]: E0122 14:04:55.959750 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb22345c-594c-46a3-b362-e34baa8f271c" containerName="cinder-db-sync" Jan 22 14:04:55 crc kubenswrapper[4743]: I0122 14:04:55.959755 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb22345c-594c-46a3-b362-e34baa8f271c" containerName="cinder-db-sync" Jan 22 14:04:55 crc kubenswrapper[4743]: E0122 14:04:55.959768 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79b9f413-5078-4cb7-9515-a4d1b2d4bcbe" containerName="dnsmasq-dns" Jan 22 14:04:55 crc kubenswrapper[4743]: I0122 14:04:55.959774 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="79b9f413-5078-4cb7-9515-a4d1b2d4bcbe" containerName="dnsmasq-dns" Jan 22 14:04:55 crc kubenswrapper[4743]: E0122 14:04:55.959800 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d95db954-8e59-44ac-ae17-788b0fbcb177" containerName="barbican-worker-log" Jan 22 14:04:55 crc kubenswrapper[4743]: I0122 14:04:55.959806 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d95db954-8e59-44ac-ae17-788b0fbcb177" containerName="barbican-worker-log" Jan 22 14:04:55 crc kubenswrapper[4743]: E0122 14:04:55.959817 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d95db954-8e59-44ac-ae17-788b0fbcb177" containerName="barbican-worker" Jan 22 14:04:55 crc kubenswrapper[4743]: I0122 14:04:55.959823 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d95db954-8e59-44ac-ae17-788b0fbcb177" containerName="barbican-worker" Jan 22 14:04:55 crc kubenswrapper[4743]: E0122 14:04:55.959833 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec33de7c-5eab-46d0-a702-af5fbd2ebe50" containerName="barbican-keystone-listener" Jan 22 14:04:55 crc kubenswrapper[4743]: I0122 14:04:55.959838 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec33de7c-5eab-46d0-a702-af5fbd2ebe50" containerName="barbican-keystone-listener" Jan 22 14:04:55 crc kubenswrapper[4743]: E0122 14:04:55.959847 4743 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b" containerName="barbican-api-log" Jan 22 14:04:55 crc kubenswrapper[4743]: I0122 14:04:55.959853 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b" containerName="barbican-api-log" Jan 22 14:04:55 crc kubenswrapper[4743]: I0122 14:04:55.960014 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb22345c-594c-46a3-b362-e34baa8f271c" containerName="cinder-db-sync" Jan 22 14:04:55 crc kubenswrapper[4743]: I0122 14:04:55.960027 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b" containerName="barbican-api-log" Jan 22 14:04:55 crc kubenswrapper[4743]: I0122 14:04:55.960040 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec33de7c-5eab-46d0-a702-af5fbd2ebe50" containerName="barbican-keystone-listener-log" Jan 22 14:04:55 crc kubenswrapper[4743]: I0122 14:04:55.960052 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec33de7c-5eab-46d0-a702-af5fbd2ebe50" containerName="barbican-keystone-listener" Jan 22 14:04:55 crc kubenswrapper[4743]: I0122 14:04:55.960063 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d95db954-8e59-44ac-ae17-788b0fbcb177" containerName="barbican-worker-log" Jan 22 14:04:55 crc kubenswrapper[4743]: I0122 14:04:55.960071 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="79b9f413-5078-4cb7-9515-a4d1b2d4bcbe" containerName="dnsmasq-dns" Jan 22 14:04:55 crc kubenswrapper[4743]: I0122 14:04:55.960078 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="53a0fe55-3f9c-4ccf-9a15-c1fd564cbb7b" containerName="barbican-api" Jan 22 14:04:55 crc kubenswrapper[4743]: I0122 14:04:55.960084 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d95db954-8e59-44ac-ae17-788b0fbcb177" containerName="barbican-worker" Jan 22 14:04:55 crc kubenswrapper[4743]: I0122 14:04:55.960942 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 22 14:04:55 crc kubenswrapper[4743]: I0122 14:04:55.966392 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-wzr2p" Jan 22 14:04:55 crc kubenswrapper[4743]: I0122 14:04:55.966584 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Jan 22 14:04:55 crc kubenswrapper[4743]: I0122 14:04:55.967839 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Jan 22 14:04:55 crc kubenswrapper[4743]: I0122 14:04:55.968199 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Jan 22 14:04:55 crc kubenswrapper[4743]: I0122 14:04:55.975647 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 22 14:04:56 crc kubenswrapper[4743]: I0122 14:04:56.024158 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-wr2mc"] Jan 22 14:04:56 crc kubenswrapper[4743]: I0122 14:04:56.025545 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-wr2mc" Jan 22 14:04:56 crc kubenswrapper[4743]: I0122 14:04:56.041572 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-wr2mc"] Jan 22 14:04:56 crc kubenswrapper[4743]: I0122 14:04:56.070252 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/081eea2f-bf2e-435b-bdfe-61b2311d7e10-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-wr2mc\" (UID: \"081eea2f-bf2e-435b-bdfe-61b2311d7e10\") " pod="openstack/dnsmasq-dns-5c9776ccc5-wr2mc" Jan 22 14:04:56 crc kubenswrapper[4743]: I0122 14:04:56.070317 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60ee22a5-d036-40fb-8bed-36f10db21bb5-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"60ee22a5-d036-40fb-8bed-36f10db21bb5\") " pod="openstack/cinder-scheduler-0" Jan 22 14:04:56 crc kubenswrapper[4743]: I0122 14:04:56.070405 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/081eea2f-bf2e-435b-bdfe-61b2311d7e10-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-wr2mc\" (UID: \"081eea2f-bf2e-435b-bdfe-61b2311d7e10\") " pod="openstack/dnsmasq-dns-5c9776ccc5-wr2mc" Jan 22 14:04:56 crc kubenswrapper[4743]: I0122 14:04:56.070450 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vm2zg\" (UniqueName: \"kubernetes.io/projected/60ee22a5-d036-40fb-8bed-36f10db21bb5-kube-api-access-vm2zg\") pod \"cinder-scheduler-0\" (UID: \"60ee22a5-d036-40fb-8bed-36f10db21bb5\") " pod="openstack/cinder-scheduler-0" Jan 22 14:04:56 crc kubenswrapper[4743]: I0122 14:04:56.070472 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/081eea2f-bf2e-435b-bdfe-61b2311d7e10-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-wr2mc\" (UID: \"081eea2f-bf2e-435b-bdfe-61b2311d7e10\") " pod="openstack/dnsmasq-dns-5c9776ccc5-wr2mc" Jan 22 14:04:56 crc kubenswrapper[4743]: I0122 14:04:56.070497 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfhq9\" (UniqueName: \"kubernetes.io/projected/081eea2f-bf2e-435b-bdfe-61b2311d7e10-kube-api-access-bfhq9\") pod \"dnsmasq-dns-5c9776ccc5-wr2mc\" (UID: \"081eea2f-bf2e-435b-bdfe-61b2311d7e10\") " pod="openstack/dnsmasq-dns-5c9776ccc5-wr2mc" Jan 22 14:04:56 crc kubenswrapper[4743]: I0122 14:04:56.070537 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/081eea2f-bf2e-435b-bdfe-61b2311d7e10-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-wr2mc\" (UID: \"081eea2f-bf2e-435b-bdfe-61b2311d7e10\") " pod="openstack/dnsmasq-dns-5c9776ccc5-wr2mc" Jan 22 14:04:56 crc kubenswrapper[4743]: I0122 14:04:56.070575 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60ee22a5-d036-40fb-8bed-36f10db21bb5-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"60ee22a5-d036-40fb-8bed-36f10db21bb5\") " pod="openstack/cinder-scheduler-0" Jan 22 14:04:56 crc kubenswrapper[4743]: I0122 14:04:56.070598 4743 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60ee22a5-d036-40fb-8bed-36f10db21bb5-scripts\") pod \"cinder-scheduler-0\" (UID: \"60ee22a5-d036-40fb-8bed-36f10db21bb5\") " pod="openstack/cinder-scheduler-0" Jan 22 14:04:56 crc kubenswrapper[4743]: I0122 14:04:56.070672 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60ee22a5-d036-40fb-8bed-36f10db21bb5-config-data\") pod \"cinder-scheduler-0\" (UID: \"60ee22a5-d036-40fb-8bed-36f10db21bb5\") " pod="openstack/cinder-scheduler-0" Jan 22 14:04:56 crc kubenswrapper[4743]: I0122 14:04:56.070689 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/60ee22a5-d036-40fb-8bed-36f10db21bb5-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"60ee22a5-d036-40fb-8bed-36f10db21bb5\") " pod="openstack/cinder-scheduler-0" Jan 22 14:04:56 crc kubenswrapper[4743]: I0122 14:04:56.070722 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/081eea2f-bf2e-435b-bdfe-61b2311d7e10-config\") pod \"dnsmasq-dns-5c9776ccc5-wr2mc\" (UID: \"081eea2f-bf2e-435b-bdfe-61b2311d7e10\") " pod="openstack/dnsmasq-dns-5c9776ccc5-wr2mc" Jan 22 14:04:56 crc kubenswrapper[4743]: I0122 14:04:56.174477 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/081eea2f-bf2e-435b-bdfe-61b2311d7e10-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-wr2mc\" (UID: \"081eea2f-bf2e-435b-bdfe-61b2311d7e10\") " pod="openstack/dnsmasq-dns-5c9776ccc5-wr2mc" Jan 22 14:04:56 crc kubenswrapper[4743]: I0122 14:04:56.174522 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60ee22a5-d036-40fb-8bed-36f10db21bb5-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"60ee22a5-d036-40fb-8bed-36f10db21bb5\") " pod="openstack/cinder-scheduler-0" Jan 22 14:04:56 crc kubenswrapper[4743]: I0122 14:04:56.174582 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/081eea2f-bf2e-435b-bdfe-61b2311d7e10-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-wr2mc\" (UID: \"081eea2f-bf2e-435b-bdfe-61b2311d7e10\") " pod="openstack/dnsmasq-dns-5c9776ccc5-wr2mc" Jan 22 14:04:56 crc kubenswrapper[4743]: I0122 14:04:56.174615 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vm2zg\" (UniqueName: \"kubernetes.io/projected/60ee22a5-d036-40fb-8bed-36f10db21bb5-kube-api-access-vm2zg\") pod \"cinder-scheduler-0\" (UID: \"60ee22a5-d036-40fb-8bed-36f10db21bb5\") " pod="openstack/cinder-scheduler-0" Jan 22 14:04:56 crc kubenswrapper[4743]: I0122 14:04:56.174634 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/081eea2f-bf2e-435b-bdfe-61b2311d7e10-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-wr2mc\" (UID: \"081eea2f-bf2e-435b-bdfe-61b2311d7e10\") " pod="openstack/dnsmasq-dns-5c9776ccc5-wr2mc" Jan 22 14:04:56 crc kubenswrapper[4743]: I0122 14:04:56.174649 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-bfhq9\" (UniqueName: \"kubernetes.io/projected/081eea2f-bf2e-435b-bdfe-61b2311d7e10-kube-api-access-bfhq9\") pod \"dnsmasq-dns-5c9776ccc5-wr2mc\" (UID: \"081eea2f-bf2e-435b-bdfe-61b2311d7e10\") " pod="openstack/dnsmasq-dns-5c9776ccc5-wr2mc" Jan 22 14:04:56 crc kubenswrapper[4743]: I0122 14:04:56.174676 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/081eea2f-bf2e-435b-bdfe-61b2311d7e10-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-wr2mc\" (UID: \"081eea2f-bf2e-435b-bdfe-61b2311d7e10\") " pod="openstack/dnsmasq-dns-5c9776ccc5-wr2mc" Jan 22 14:04:56 crc kubenswrapper[4743]: I0122 14:04:56.174706 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60ee22a5-d036-40fb-8bed-36f10db21bb5-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"60ee22a5-d036-40fb-8bed-36f10db21bb5\") " pod="openstack/cinder-scheduler-0" Jan 22 14:04:56 crc kubenswrapper[4743]: I0122 14:04:56.174724 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60ee22a5-d036-40fb-8bed-36f10db21bb5-scripts\") pod \"cinder-scheduler-0\" (UID: \"60ee22a5-d036-40fb-8bed-36f10db21bb5\") " pod="openstack/cinder-scheduler-0" Jan 22 14:04:56 crc kubenswrapper[4743]: I0122 14:04:56.174777 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/60ee22a5-d036-40fb-8bed-36f10db21bb5-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"60ee22a5-d036-40fb-8bed-36f10db21bb5\") " pod="openstack/cinder-scheduler-0" Jan 22 14:04:56 crc kubenswrapper[4743]: I0122 14:04:56.174813 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60ee22a5-d036-40fb-8bed-36f10db21bb5-config-data\") pod \"cinder-scheduler-0\" (UID: \"60ee22a5-d036-40fb-8bed-36f10db21bb5\") " pod="openstack/cinder-scheduler-0" Jan 22 14:04:56 crc kubenswrapper[4743]: I0122 14:04:56.174831 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/081eea2f-bf2e-435b-bdfe-61b2311d7e10-config\") pod \"dnsmasq-dns-5c9776ccc5-wr2mc\" (UID: \"081eea2f-bf2e-435b-bdfe-61b2311d7e10\") " pod="openstack/dnsmasq-dns-5c9776ccc5-wr2mc" Jan 22 14:04:56 crc kubenswrapper[4743]: I0122 14:04:56.175011 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/60ee22a5-d036-40fb-8bed-36f10db21bb5-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"60ee22a5-d036-40fb-8bed-36f10db21bb5\") " pod="openstack/cinder-scheduler-0" Jan 22 14:04:56 crc kubenswrapper[4743]: I0122 14:04:56.175645 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/081eea2f-bf2e-435b-bdfe-61b2311d7e10-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-wr2mc\" (UID: \"081eea2f-bf2e-435b-bdfe-61b2311d7e10\") " pod="openstack/dnsmasq-dns-5c9776ccc5-wr2mc" Jan 22 14:04:56 crc kubenswrapper[4743]: I0122 14:04:56.175710 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/081eea2f-bf2e-435b-bdfe-61b2311d7e10-config\") pod \"dnsmasq-dns-5c9776ccc5-wr2mc\" (UID: \"081eea2f-bf2e-435b-bdfe-61b2311d7e10\") " 
pod="openstack/dnsmasq-dns-5c9776ccc5-wr2mc" Jan 22 14:04:56 crc kubenswrapper[4743]: I0122 14:04:56.176252 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/081eea2f-bf2e-435b-bdfe-61b2311d7e10-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-wr2mc\" (UID: \"081eea2f-bf2e-435b-bdfe-61b2311d7e10\") " pod="openstack/dnsmasq-dns-5c9776ccc5-wr2mc" Jan 22 14:04:56 crc kubenswrapper[4743]: I0122 14:04:56.176534 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/081eea2f-bf2e-435b-bdfe-61b2311d7e10-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-wr2mc\" (UID: \"081eea2f-bf2e-435b-bdfe-61b2311d7e10\") " pod="openstack/dnsmasq-dns-5c9776ccc5-wr2mc" Jan 22 14:04:56 crc kubenswrapper[4743]: I0122 14:04:56.178407 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/081eea2f-bf2e-435b-bdfe-61b2311d7e10-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-wr2mc\" (UID: \"081eea2f-bf2e-435b-bdfe-61b2311d7e10\") " pod="openstack/dnsmasq-dns-5c9776ccc5-wr2mc" Jan 22 14:04:56 crc kubenswrapper[4743]: I0122 14:04:56.182616 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60ee22a5-d036-40fb-8bed-36f10db21bb5-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"60ee22a5-d036-40fb-8bed-36f10db21bb5\") " pod="openstack/cinder-scheduler-0" Jan 22 14:04:56 crc kubenswrapper[4743]: I0122 14:04:56.182617 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60ee22a5-d036-40fb-8bed-36f10db21bb5-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"60ee22a5-d036-40fb-8bed-36f10db21bb5\") " pod="openstack/cinder-scheduler-0" Jan 22 14:04:56 crc kubenswrapper[4743]: I0122 14:04:56.197499 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60ee22a5-d036-40fb-8bed-36f10db21bb5-scripts\") pod \"cinder-scheduler-0\" (UID: \"60ee22a5-d036-40fb-8bed-36f10db21bb5\") " pod="openstack/cinder-scheduler-0" Jan 22 14:04:56 crc kubenswrapper[4743]: I0122 14:04:56.197750 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60ee22a5-d036-40fb-8bed-36f10db21bb5-config-data\") pod \"cinder-scheduler-0\" (UID: \"60ee22a5-d036-40fb-8bed-36f10db21bb5\") " pod="openstack/cinder-scheduler-0" Jan 22 14:04:56 crc kubenswrapper[4743]: I0122 14:04:56.201146 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vm2zg\" (UniqueName: \"kubernetes.io/projected/60ee22a5-d036-40fb-8bed-36f10db21bb5-kube-api-access-vm2zg\") pod \"cinder-scheduler-0\" (UID: \"60ee22a5-d036-40fb-8bed-36f10db21bb5\") " pod="openstack/cinder-scheduler-0" Jan 22 14:04:56 crc kubenswrapper[4743]: I0122 14:04:56.218834 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfhq9\" (UniqueName: \"kubernetes.io/projected/081eea2f-bf2e-435b-bdfe-61b2311d7e10-kube-api-access-bfhq9\") pod \"dnsmasq-dns-5c9776ccc5-wr2mc\" (UID: \"081eea2f-bf2e-435b-bdfe-61b2311d7e10\") " pod="openstack/dnsmasq-dns-5c9776ccc5-wr2mc" Jan 22 14:04:56 crc kubenswrapper[4743]: I0122 14:04:56.230657 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Jan 22 14:04:56 crc 
kubenswrapper[4743]: I0122 14:04:56.233306 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 22 14:04:56 crc kubenswrapper[4743]: I0122 14:04:56.235209 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Jan 22 14:04:56 crc kubenswrapper[4743]: I0122 14:04:56.253993 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 22 14:04:56 crc kubenswrapper[4743]: I0122 14:04:56.276955 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1e36bbf-2ba3-4a7f-8020-06772e388f95-config-data\") pod \"cinder-api-0\" (UID: \"b1e36bbf-2ba3-4a7f-8020-06772e388f95\") " pod="openstack/cinder-api-0" Jan 22 14:04:56 crc kubenswrapper[4743]: I0122 14:04:56.277257 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1e36bbf-2ba3-4a7f-8020-06772e388f95-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b1e36bbf-2ba3-4a7f-8020-06772e388f95\") " pod="openstack/cinder-api-0" Jan 22 14:04:56 crc kubenswrapper[4743]: I0122 14:04:56.277356 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1e36bbf-2ba3-4a7f-8020-06772e388f95-logs\") pod \"cinder-api-0\" (UID: \"b1e36bbf-2ba3-4a7f-8020-06772e388f95\") " pod="openstack/cinder-api-0" Jan 22 14:04:56 crc kubenswrapper[4743]: I0122 14:04:56.277429 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b1e36bbf-2ba3-4a7f-8020-06772e388f95-config-data-custom\") pod \"cinder-api-0\" (UID: \"b1e36bbf-2ba3-4a7f-8020-06772e388f95\") " pod="openstack/cinder-api-0" Jan 22 14:04:56 crc kubenswrapper[4743]: I0122 14:04:56.277515 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1e36bbf-2ba3-4a7f-8020-06772e388f95-scripts\") pod \"cinder-api-0\" (UID: \"b1e36bbf-2ba3-4a7f-8020-06772e388f95\") " pod="openstack/cinder-api-0" Jan 22 14:04:56 crc kubenswrapper[4743]: I0122 14:04:56.277601 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvd67\" (UniqueName: \"kubernetes.io/projected/b1e36bbf-2ba3-4a7f-8020-06772e388f95-kube-api-access-xvd67\") pod \"cinder-api-0\" (UID: \"b1e36bbf-2ba3-4a7f-8020-06772e388f95\") " pod="openstack/cinder-api-0" Jan 22 14:04:56 crc kubenswrapper[4743]: I0122 14:04:56.277772 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b1e36bbf-2ba3-4a7f-8020-06772e388f95-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b1e36bbf-2ba3-4a7f-8020-06772e388f95\") " pod="openstack/cinder-api-0" Jan 22 14:04:56 crc kubenswrapper[4743]: I0122 14:04:56.287395 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 22 14:04:56 crc kubenswrapper[4743]: I0122 14:04:56.343178 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-wr2mc" Jan 22 14:04:56 crc kubenswrapper[4743]: I0122 14:04:56.379598 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1e36bbf-2ba3-4a7f-8020-06772e388f95-config-data\") pod \"cinder-api-0\" (UID: \"b1e36bbf-2ba3-4a7f-8020-06772e388f95\") " pod="openstack/cinder-api-0" Jan 22 14:04:56 crc kubenswrapper[4743]: I0122 14:04:56.379657 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1e36bbf-2ba3-4a7f-8020-06772e388f95-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b1e36bbf-2ba3-4a7f-8020-06772e388f95\") " pod="openstack/cinder-api-0" Jan 22 14:04:56 crc kubenswrapper[4743]: I0122 14:04:56.379682 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1e36bbf-2ba3-4a7f-8020-06772e388f95-logs\") pod \"cinder-api-0\" (UID: \"b1e36bbf-2ba3-4a7f-8020-06772e388f95\") " pod="openstack/cinder-api-0" Jan 22 14:04:56 crc kubenswrapper[4743]: I0122 14:04:56.379703 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b1e36bbf-2ba3-4a7f-8020-06772e388f95-config-data-custom\") pod \"cinder-api-0\" (UID: \"b1e36bbf-2ba3-4a7f-8020-06772e388f95\") " pod="openstack/cinder-api-0" Jan 22 14:04:56 crc kubenswrapper[4743]: I0122 14:04:56.379741 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1e36bbf-2ba3-4a7f-8020-06772e388f95-scripts\") pod \"cinder-api-0\" (UID: \"b1e36bbf-2ba3-4a7f-8020-06772e388f95\") " pod="openstack/cinder-api-0" Jan 22 14:04:56 crc kubenswrapper[4743]: I0122 14:04:56.379771 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvd67\" (UniqueName: \"kubernetes.io/projected/b1e36bbf-2ba3-4a7f-8020-06772e388f95-kube-api-access-xvd67\") pod \"cinder-api-0\" (UID: \"b1e36bbf-2ba3-4a7f-8020-06772e388f95\") " pod="openstack/cinder-api-0" Jan 22 14:04:56 crc kubenswrapper[4743]: I0122 14:04:56.379843 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b1e36bbf-2ba3-4a7f-8020-06772e388f95-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b1e36bbf-2ba3-4a7f-8020-06772e388f95\") " pod="openstack/cinder-api-0" Jan 22 14:04:56 crc kubenswrapper[4743]: I0122 14:04:56.379974 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b1e36bbf-2ba3-4a7f-8020-06772e388f95-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b1e36bbf-2ba3-4a7f-8020-06772e388f95\") " pod="openstack/cinder-api-0" Jan 22 14:04:56 crc kubenswrapper[4743]: I0122 14:04:56.383621 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1e36bbf-2ba3-4a7f-8020-06772e388f95-logs\") pod \"cinder-api-0\" (UID: \"b1e36bbf-2ba3-4a7f-8020-06772e388f95\") " pod="openstack/cinder-api-0" Jan 22 14:04:56 crc kubenswrapper[4743]: I0122 14:04:56.385241 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1e36bbf-2ba3-4a7f-8020-06772e388f95-config-data\") pod \"cinder-api-0\" (UID: \"b1e36bbf-2ba3-4a7f-8020-06772e388f95\") " 
pod="openstack/cinder-api-0" Jan 22 14:04:56 crc kubenswrapper[4743]: I0122 14:04:56.386548 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1e36bbf-2ba3-4a7f-8020-06772e388f95-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b1e36bbf-2ba3-4a7f-8020-06772e388f95\") " pod="openstack/cinder-api-0" Jan 22 14:04:56 crc kubenswrapper[4743]: I0122 14:04:56.386738 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1e36bbf-2ba3-4a7f-8020-06772e388f95-scripts\") pod \"cinder-api-0\" (UID: \"b1e36bbf-2ba3-4a7f-8020-06772e388f95\") " pod="openstack/cinder-api-0" Jan 22 14:04:56 crc kubenswrapper[4743]: I0122 14:04:56.387471 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b1e36bbf-2ba3-4a7f-8020-06772e388f95-config-data-custom\") pod \"cinder-api-0\" (UID: \"b1e36bbf-2ba3-4a7f-8020-06772e388f95\") " pod="openstack/cinder-api-0" Jan 22 14:04:56 crc kubenswrapper[4743]: I0122 14:04:56.407191 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvd67\" (UniqueName: \"kubernetes.io/projected/b1e36bbf-2ba3-4a7f-8020-06772e388f95-kube-api-access-xvd67\") pod \"cinder-api-0\" (UID: \"b1e36bbf-2ba3-4a7f-8020-06772e388f95\") " pod="openstack/cinder-api-0" Jan 22 14:04:56 crc kubenswrapper[4743]: I0122 14:04:56.602714 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 22 14:04:56 crc kubenswrapper[4743]: I0122 14:04:56.840590 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 22 14:04:56 crc kubenswrapper[4743]: I0122 14:04:56.936982 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-wr2mc"] Jan 22 14:04:57 crc kubenswrapper[4743]: I0122 14:04:57.091495 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 22 14:04:57 crc kubenswrapper[4743]: W0122 14:04:57.101940 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb1e36bbf_2ba3_4a7f_8020_06772e388f95.slice/crio-a3d5df7253923d072c6afbe6544afa5d36cfd863a5afcd13662b26872d5df83c WatchSource:0}: Error finding container a3d5df7253923d072c6afbe6544afa5d36cfd863a5afcd13662b26872d5df83c: Status 404 returned error can't find the container with id a3d5df7253923d072c6afbe6544afa5d36cfd863a5afcd13662b26872d5df83c Jan 22 14:04:57 crc kubenswrapper[4743]: I0122 14:04:57.826016 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"60ee22a5-d036-40fb-8bed-36f10db21bb5","Type":"ContainerStarted","Data":"093fec351eda663ea46fe864a834b7821d46106c12401a5ac2744417bc40113e"} Jan 22 14:04:57 crc kubenswrapper[4743]: I0122 14:04:57.829485 4743 generic.go:334] "Generic (PLEG): container finished" podID="081eea2f-bf2e-435b-bdfe-61b2311d7e10" containerID="baaf16cd914646de7851d96e105c3d8faf6de71f0175e51a881efbd69e037df1" exitCode=0 Jan 22 14:04:57 crc kubenswrapper[4743]: I0122 14:04:57.829560 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-wr2mc" event={"ID":"081eea2f-bf2e-435b-bdfe-61b2311d7e10","Type":"ContainerDied","Data":"baaf16cd914646de7851d96e105c3d8faf6de71f0175e51a881efbd69e037df1"} Jan 22 14:04:57 crc kubenswrapper[4743]: I0122 14:04:57.829585 4743 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-wr2mc" event={"ID":"081eea2f-bf2e-435b-bdfe-61b2311d7e10","Type":"ContainerStarted","Data":"cc18024b6e88e0a162afbfe4112fbaf16573a81a5939b62989207470a8e066c2"} Jan 22 14:04:57 crc kubenswrapper[4743]: I0122 14:04:57.833431 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b1e36bbf-2ba3-4a7f-8020-06772e388f95","Type":"ContainerStarted","Data":"685b31e46f8a70ce564b6c465f2b810b3e3f57c7be0df4355661bcbf6b7ab8b5"} Jan 22 14:04:57 crc kubenswrapper[4743]: I0122 14:04:57.833495 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b1e36bbf-2ba3-4a7f-8020-06772e388f95","Type":"ContainerStarted","Data":"a3d5df7253923d072c6afbe6544afa5d36cfd863a5afcd13662b26872d5df83c"} Jan 22 14:04:58 crc kubenswrapper[4743]: I0122 14:04:58.138273 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Jan 22 14:04:58 crc kubenswrapper[4743]: I0122 14:04:58.564712 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-59d596989d-xmgwp" podUID="1e9f28d1-fbb1-4c58-9a9b-3439b902505a" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.164:9311/healthcheck\": read tcp 10.217.0.2:42752->10.217.0.164:9311: read: connection reset by peer" Jan 22 14:04:58 crc kubenswrapper[4743]: I0122 14:04:58.564730 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-59d596989d-xmgwp" podUID="1e9f28d1-fbb1-4c58-9a9b-3439b902505a" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.164:9311/healthcheck\": read tcp 10.217.0.2:42746->10.217.0.164:9311: read: connection reset by peer" Jan 22 14:04:58 crc kubenswrapper[4743]: I0122 14:04:58.578038 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 22 14:04:58 crc kubenswrapper[4743]: I0122 14:04:58.641726 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/316dc631-a7ed-49db-9dad-305d246bf91a-log-httpd\") pod \"316dc631-a7ed-49db-9dad-305d246bf91a\" (UID: \"316dc631-a7ed-49db-9dad-305d246bf91a\") " Jan 22 14:04:58 crc kubenswrapper[4743]: I0122 14:04:58.641808 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/316dc631-a7ed-49db-9dad-305d246bf91a-combined-ca-bundle\") pod \"316dc631-a7ed-49db-9dad-305d246bf91a\" (UID: \"316dc631-a7ed-49db-9dad-305d246bf91a\") " Jan 22 14:04:58 crc kubenswrapper[4743]: I0122 14:04:58.641921 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vm42k\" (UniqueName: \"kubernetes.io/projected/316dc631-a7ed-49db-9dad-305d246bf91a-kube-api-access-vm42k\") pod \"316dc631-a7ed-49db-9dad-305d246bf91a\" (UID: \"316dc631-a7ed-49db-9dad-305d246bf91a\") " Jan 22 14:04:58 crc kubenswrapper[4743]: I0122 14:04:58.641994 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/316dc631-a7ed-49db-9dad-305d246bf91a-scripts\") pod \"316dc631-a7ed-49db-9dad-305d246bf91a\" (UID: \"316dc631-a7ed-49db-9dad-305d246bf91a\") " Jan 22 14:04:58 crc kubenswrapper[4743]: I0122 14:04:58.642034 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/316dc631-a7ed-49db-9dad-305d246bf91a-run-httpd\") pod \"316dc631-a7ed-49db-9dad-305d246bf91a\" (UID: \"316dc631-a7ed-49db-9dad-305d246bf91a\") " Jan 22 14:04:58 crc kubenswrapper[4743]: I0122 14:04:58.642074 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/316dc631-a7ed-49db-9dad-305d246bf91a-config-data\") pod \"316dc631-a7ed-49db-9dad-305d246bf91a\" (UID: \"316dc631-a7ed-49db-9dad-305d246bf91a\") " Jan 22 14:04:58 crc kubenswrapper[4743]: I0122 14:04:58.642102 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/316dc631-a7ed-49db-9dad-305d246bf91a-sg-core-conf-yaml\") pod \"316dc631-a7ed-49db-9dad-305d246bf91a\" (UID: \"316dc631-a7ed-49db-9dad-305d246bf91a\") " Jan 22 14:04:58 crc kubenswrapper[4743]: I0122 14:04:58.642592 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/316dc631-a7ed-49db-9dad-305d246bf91a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "316dc631-a7ed-49db-9dad-305d246bf91a" (UID: "316dc631-a7ed-49db-9dad-305d246bf91a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:04:58 crc kubenswrapper[4743]: I0122 14:04:58.644531 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/316dc631-a7ed-49db-9dad-305d246bf91a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "316dc631-a7ed-49db-9dad-305d246bf91a" (UID: "316dc631-a7ed-49db-9dad-305d246bf91a"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:04:58 crc kubenswrapper[4743]: I0122 14:04:58.656933 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/316dc631-a7ed-49db-9dad-305d246bf91a-scripts" (OuterVolumeSpecName: "scripts") pod "316dc631-a7ed-49db-9dad-305d246bf91a" (UID: "316dc631-a7ed-49db-9dad-305d246bf91a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:58 crc kubenswrapper[4743]: I0122 14:04:58.675988 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/316dc631-a7ed-49db-9dad-305d246bf91a-kube-api-access-vm42k" (OuterVolumeSpecName: "kube-api-access-vm42k") pod "316dc631-a7ed-49db-9dad-305d246bf91a" (UID: "316dc631-a7ed-49db-9dad-305d246bf91a"). InnerVolumeSpecName "kube-api-access-vm42k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:04:58 crc kubenswrapper[4743]: I0122 14:04:58.707260 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/316dc631-a7ed-49db-9dad-305d246bf91a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "316dc631-a7ed-49db-9dad-305d246bf91a" (UID: "316dc631-a7ed-49db-9dad-305d246bf91a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:58 crc kubenswrapper[4743]: I0122 14:04:58.727101 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/316dc631-a7ed-49db-9dad-305d246bf91a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "316dc631-a7ed-49db-9dad-305d246bf91a" (UID: "316dc631-a7ed-49db-9dad-305d246bf91a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:58 crc kubenswrapper[4743]: I0122 14:04:58.743867 4743 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/316dc631-a7ed-49db-9dad-305d246bf91a-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:58 crc kubenswrapper[4743]: I0122 14:04:58.743901 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/316dc631-a7ed-49db-9dad-305d246bf91a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:58 crc kubenswrapper[4743]: I0122 14:04:58.743911 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vm42k\" (UniqueName: \"kubernetes.io/projected/316dc631-a7ed-49db-9dad-305d246bf91a-kube-api-access-vm42k\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:58 crc kubenswrapper[4743]: I0122 14:04:58.743923 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/316dc631-a7ed-49db-9dad-305d246bf91a-scripts\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:58 crc kubenswrapper[4743]: I0122 14:04:58.743931 4743 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/316dc631-a7ed-49db-9dad-305d246bf91a-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:58 crc kubenswrapper[4743]: I0122 14:04:58.743939 4743 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/316dc631-a7ed-49db-9dad-305d246bf91a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:58 crc kubenswrapper[4743]: I0122 14:04:58.757463 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/316dc631-a7ed-49db-9dad-305d246bf91a-config-data" (OuterVolumeSpecName: "config-data") pod "316dc631-a7ed-49db-9dad-305d246bf91a" (UID: "316dc631-a7ed-49db-9dad-305d246bf91a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:58 crc kubenswrapper[4743]: I0122 14:04:58.846580 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/316dc631-a7ed-49db-9dad-305d246bf91a-config-data\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:58 crc kubenswrapper[4743]: I0122 14:04:58.865056 4743 generic.go:334] "Generic (PLEG): container finished" podID="1e9f28d1-fbb1-4c58-9a9b-3439b902505a" containerID="2306c268bbd8a02a9ba0e62cd3ea6575a384efe9651061abb1bfb85a27b2c6b6" exitCode=0 Jan 22 14:04:58 crc kubenswrapper[4743]: I0122 14:04:58.865196 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-59d596989d-xmgwp" event={"ID":"1e9f28d1-fbb1-4c58-9a9b-3439b902505a","Type":"ContainerDied","Data":"2306c268bbd8a02a9ba0e62cd3ea6575a384efe9651061abb1bfb85a27b2c6b6"} Jan 22 14:04:58 crc kubenswrapper[4743]: I0122 14:04:58.874426 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b1e36bbf-2ba3-4a7f-8020-06772e388f95","Type":"ContainerStarted","Data":"a185841d88341f1b1e2fb4bd216d2e6337a2b124e83161f3feea0a55e26f9aee"} Jan 22 14:04:58 crc kubenswrapper[4743]: I0122 14:04:58.874876 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="b1e36bbf-2ba3-4a7f-8020-06772e388f95" containerName="cinder-api-log" containerID="cri-o://685b31e46f8a70ce564b6c465f2b810b3e3f57c7be0df4355661bcbf6b7ab8b5" gracePeriod=30 Jan 22 14:04:58 crc kubenswrapper[4743]: I0122 14:04:58.875452 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Jan 22 14:04:58 crc kubenswrapper[4743]: I0122 14:04:58.875514 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="b1e36bbf-2ba3-4a7f-8020-06772e388f95" containerName="cinder-api" containerID="cri-o://a185841d88341f1b1e2fb4bd216d2e6337a2b124e83161f3feea0a55e26f9aee" gracePeriod=30 Jan 22 14:04:58 crc kubenswrapper[4743]: I0122 14:04:58.883952 4743 generic.go:334] "Generic (PLEG): container finished" podID="316dc631-a7ed-49db-9dad-305d246bf91a" containerID="ba8ff1d54e4f3c6866568508cdba261acf4b40803abbc347435a8e242304a667" exitCode=0 Jan 22 14:04:58 crc kubenswrapper[4743]: I0122 14:04:58.884098 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"316dc631-a7ed-49db-9dad-305d246bf91a","Type":"ContainerDied","Data":"ba8ff1d54e4f3c6866568508cdba261acf4b40803abbc347435a8e242304a667"} Jan 22 14:04:58 crc kubenswrapper[4743]: I0122 14:04:58.884163 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"316dc631-a7ed-49db-9dad-305d246bf91a","Type":"ContainerDied","Data":"f1a83ffb81d64069d982879b339bbda55df7ee602f62806aba4fd32f65c6f9a3"} Jan 22 14:04:58 crc kubenswrapper[4743]: I0122 14:04:58.884101 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 22 14:04:58 crc kubenswrapper[4743]: I0122 14:04:58.884191 4743 scope.go:117] "RemoveContainer" containerID="49c9d9f85656145073dafeb1dc763ae12471a846d64af47c9481ead92ab0eabc" Jan 22 14:04:58 crc kubenswrapper[4743]: I0122 14:04:58.890421 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"60ee22a5-d036-40fb-8bed-36f10db21bb5","Type":"ContainerStarted","Data":"13f86a89fb0a36d4cffccf06926cc17c5ce4bdd5eb06ce972c334d3f6e42d40d"} Jan 22 14:04:58 crc kubenswrapper[4743]: I0122 14:04:58.904317 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-wr2mc" event={"ID":"081eea2f-bf2e-435b-bdfe-61b2311d7e10","Type":"ContainerStarted","Data":"f683aef5f31dd4851d8dbb75c202dfbe1a7f3740660a38626bac089b531cd6b6"} Jan 22 14:04:58 crc kubenswrapper[4743]: I0122 14:04:58.904669 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=2.904625286 podStartE2EDuration="2.904625286s" podCreationTimestamp="2026-01-22 14:04:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:04:58.900027817 +0000 UTC m=+1135.455070980" watchObservedRunningTime="2026-01-22 14:04:58.904625286 +0000 UTC m=+1135.459668449" Jan 22 14:04:58 crc kubenswrapper[4743]: I0122 14:04:58.905074 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-wr2mc" Jan 22 14:04:58 crc kubenswrapper[4743]: I0122 14:04:58.944696 4743 scope.go:117] "RemoveContainer" containerID="47854176fb6a300bbd538181f0a9c90e70c288f01bad265b81c78fa697ed1dfe" Jan 22 14:04:58 crc kubenswrapper[4743]: I0122 14:04:58.945292 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-wr2mc" podStartSLOduration=3.9452689039999997 podStartE2EDuration="3.945268904s" podCreationTimestamp="2026-01-22 14:04:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:04:58.937278388 +0000 UTC m=+1135.492321561" watchObservedRunningTime="2026-01-22 14:04:58.945268904 +0000 UTC m=+1135.500312067" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.003122 4743 scope.go:117] "RemoveContainer" containerID="ba8ff1d54e4f3c6866568508cdba261acf4b40803abbc347435a8e242304a667" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.043632 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.064219 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.078927 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 22 14:04:59 crc kubenswrapper[4743]: E0122 14:04:59.079482 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="316dc631-a7ed-49db-9dad-305d246bf91a" containerName="proxy-httpd" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.079499 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="316dc631-a7ed-49db-9dad-305d246bf91a" containerName="proxy-httpd" Jan 22 14:04:59 crc kubenswrapper[4743]: E0122 14:04:59.079535 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="316dc631-a7ed-49db-9dad-305d246bf91a" 
containerName="ceilometer-notification-agent" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.079545 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="316dc631-a7ed-49db-9dad-305d246bf91a" containerName="ceilometer-notification-agent" Jan 22 14:04:59 crc kubenswrapper[4743]: E0122 14:04:59.079560 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="316dc631-a7ed-49db-9dad-305d246bf91a" containerName="sg-core" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.079566 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="316dc631-a7ed-49db-9dad-305d246bf91a" containerName="sg-core" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.079729 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="316dc631-a7ed-49db-9dad-305d246bf91a" containerName="sg-core" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.079762 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="316dc631-a7ed-49db-9dad-305d246bf91a" containerName="proxy-httpd" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.079770 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="316dc631-a7ed-49db-9dad-305d246bf91a" containerName="ceilometer-notification-agent" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.081387 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.083665 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.086519 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.089133 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.120966 4743 scope.go:117] "RemoveContainer" containerID="49c9d9f85656145073dafeb1dc763ae12471a846d64af47c9481ead92ab0eabc" Jan 22 14:04:59 crc kubenswrapper[4743]: E0122 14:04:59.125541 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49c9d9f85656145073dafeb1dc763ae12471a846d64af47c9481ead92ab0eabc\": container with ID starting with 49c9d9f85656145073dafeb1dc763ae12471a846d64af47c9481ead92ab0eabc not found: ID does not exist" containerID="49c9d9f85656145073dafeb1dc763ae12471a846d64af47c9481ead92ab0eabc" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.125579 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49c9d9f85656145073dafeb1dc763ae12471a846d64af47c9481ead92ab0eabc"} err="failed to get container status \"49c9d9f85656145073dafeb1dc763ae12471a846d64af47c9481ead92ab0eabc\": rpc error: code = NotFound desc = could not find container \"49c9d9f85656145073dafeb1dc763ae12471a846d64af47c9481ead92ab0eabc\": container with ID starting with 49c9d9f85656145073dafeb1dc763ae12471a846d64af47c9481ead92ab0eabc not found: ID does not exist" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.125608 4743 scope.go:117] "RemoveContainer" containerID="47854176fb6a300bbd538181f0a9c90e70c288f01bad265b81c78fa697ed1dfe" Jan 22 14:04:59 crc kubenswrapper[4743]: E0122 14:04:59.126035 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"47854176fb6a300bbd538181f0a9c90e70c288f01bad265b81c78fa697ed1dfe\": container with ID starting with 47854176fb6a300bbd538181f0a9c90e70c288f01bad265b81c78fa697ed1dfe not found: ID does not exist" containerID="47854176fb6a300bbd538181f0a9c90e70c288f01bad265b81c78fa697ed1dfe" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.126057 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47854176fb6a300bbd538181f0a9c90e70c288f01bad265b81c78fa697ed1dfe"} err="failed to get container status \"47854176fb6a300bbd538181f0a9c90e70c288f01bad265b81c78fa697ed1dfe\": rpc error: code = NotFound desc = could not find container \"47854176fb6a300bbd538181f0a9c90e70c288f01bad265b81c78fa697ed1dfe\": container with ID starting with 47854176fb6a300bbd538181f0a9c90e70c288f01bad265b81c78fa697ed1dfe not found: ID does not exist" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.126071 4743 scope.go:117] "RemoveContainer" containerID="ba8ff1d54e4f3c6866568508cdba261acf4b40803abbc347435a8e242304a667" Jan 22 14:04:59 crc kubenswrapper[4743]: E0122 14:04:59.126391 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba8ff1d54e4f3c6866568508cdba261acf4b40803abbc347435a8e242304a667\": container with ID starting with ba8ff1d54e4f3c6866568508cdba261acf4b40803abbc347435a8e242304a667 not found: ID does not exist" containerID="ba8ff1d54e4f3c6866568508cdba261acf4b40803abbc347435a8e242304a667" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.126409 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba8ff1d54e4f3c6866568508cdba261acf4b40803abbc347435a8e242304a667"} err="failed to get container status \"ba8ff1d54e4f3c6866568508cdba261acf4b40803abbc347435a8e242304a667\": rpc error: code = NotFound desc = could not find container \"ba8ff1d54e4f3c6866568508cdba261acf4b40803abbc347435a8e242304a667\": container with ID starting with ba8ff1d54e4f3c6866568508cdba261acf4b40803abbc347435a8e242304a667 not found: ID does not exist" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.153268 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e94b5f67-2b59-4f2a-bc90-20a355ddb79f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e94b5f67-2b59-4f2a-bc90-20a355ddb79f\") " pod="openstack/ceilometer-0" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.153314 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e94b5f67-2b59-4f2a-bc90-20a355ddb79f-log-httpd\") pod \"ceilometer-0\" (UID: \"e94b5f67-2b59-4f2a-bc90-20a355ddb79f\") " pod="openstack/ceilometer-0" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.153341 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e94b5f67-2b59-4f2a-bc90-20a355ddb79f-run-httpd\") pod \"ceilometer-0\" (UID: \"e94b5f67-2b59-4f2a-bc90-20a355ddb79f\") " pod="openstack/ceilometer-0" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.153374 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wk72v\" (UniqueName: \"kubernetes.io/projected/e94b5f67-2b59-4f2a-bc90-20a355ddb79f-kube-api-access-wk72v\") pod \"ceilometer-0\" (UID: 
\"e94b5f67-2b59-4f2a-bc90-20a355ddb79f\") " pod="openstack/ceilometer-0" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.153480 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e94b5f67-2b59-4f2a-bc90-20a355ddb79f-config-data\") pod \"ceilometer-0\" (UID: \"e94b5f67-2b59-4f2a-bc90-20a355ddb79f\") " pod="openstack/ceilometer-0" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.153534 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e94b5f67-2b59-4f2a-bc90-20a355ddb79f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e94b5f67-2b59-4f2a-bc90-20a355ddb79f\") " pod="openstack/ceilometer-0" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.153560 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e94b5f67-2b59-4f2a-bc90-20a355ddb79f-scripts\") pod \"ceilometer-0\" (UID: \"e94b5f67-2b59-4f2a-bc90-20a355ddb79f\") " pod="openstack/ceilometer-0" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.238448 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-59d596989d-xmgwp" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.257549 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e9f28d1-fbb1-4c58-9a9b-3439b902505a-logs\") pod \"1e9f28d1-fbb1-4c58-9a9b-3439b902505a\" (UID: \"1e9f28d1-fbb1-4c58-9a9b-3439b902505a\") " Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.257621 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e9f28d1-fbb1-4c58-9a9b-3439b902505a-config-data\") pod \"1e9f28d1-fbb1-4c58-9a9b-3439b902505a\" (UID: \"1e9f28d1-fbb1-4c58-9a9b-3439b902505a\") " Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.257810 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m67tq\" (UniqueName: \"kubernetes.io/projected/1e9f28d1-fbb1-4c58-9a9b-3439b902505a-kube-api-access-m67tq\") pod \"1e9f28d1-fbb1-4c58-9a9b-3439b902505a\" (UID: \"1e9f28d1-fbb1-4c58-9a9b-3439b902505a\") " Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.258386 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1e9f28d1-fbb1-4c58-9a9b-3439b902505a-config-data-custom\") pod \"1e9f28d1-fbb1-4c58-9a9b-3439b902505a\" (UID: \"1e9f28d1-fbb1-4c58-9a9b-3439b902505a\") " Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.258444 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e9f28d1-fbb1-4c58-9a9b-3439b902505a-combined-ca-bundle\") pod \"1e9f28d1-fbb1-4c58-9a9b-3439b902505a\" (UID: \"1e9f28d1-fbb1-4c58-9a9b-3439b902505a\") " Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.259194 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e94b5f67-2b59-4f2a-bc90-20a355ddb79f-config-data\") pod \"ceilometer-0\" (UID: \"e94b5f67-2b59-4f2a-bc90-20a355ddb79f\") " pod="openstack/ceilometer-0" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.259318 4743 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e94b5f67-2b59-4f2a-bc90-20a355ddb79f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e94b5f67-2b59-4f2a-bc90-20a355ddb79f\") " pod="openstack/ceilometer-0" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.259350 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e94b5f67-2b59-4f2a-bc90-20a355ddb79f-scripts\") pod \"ceilometer-0\" (UID: \"e94b5f67-2b59-4f2a-bc90-20a355ddb79f\") " pod="openstack/ceilometer-0" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.259394 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e94b5f67-2b59-4f2a-bc90-20a355ddb79f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e94b5f67-2b59-4f2a-bc90-20a355ddb79f\") " pod="openstack/ceilometer-0" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.259416 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e94b5f67-2b59-4f2a-bc90-20a355ddb79f-log-httpd\") pod \"ceilometer-0\" (UID: \"e94b5f67-2b59-4f2a-bc90-20a355ddb79f\") " pod="openstack/ceilometer-0" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.259442 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e94b5f67-2b59-4f2a-bc90-20a355ddb79f-run-httpd\") pod \"ceilometer-0\" (UID: \"e94b5f67-2b59-4f2a-bc90-20a355ddb79f\") " pod="openstack/ceilometer-0" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.259479 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wk72v\" (UniqueName: \"kubernetes.io/projected/e94b5f67-2b59-4f2a-bc90-20a355ddb79f-kube-api-access-wk72v\") pod \"ceilometer-0\" (UID: \"e94b5f67-2b59-4f2a-bc90-20a355ddb79f\") " pod="openstack/ceilometer-0" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.274704 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e9f28d1-fbb1-4c58-9a9b-3439b902505a-logs" (OuterVolumeSpecName: "logs") pod "1e9f28d1-fbb1-4c58-9a9b-3439b902505a" (UID: "1e9f28d1-fbb1-4c58-9a9b-3439b902505a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.274982 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e94b5f67-2b59-4f2a-bc90-20a355ddb79f-log-httpd\") pod \"ceilometer-0\" (UID: \"e94b5f67-2b59-4f2a-bc90-20a355ddb79f\") " pod="openstack/ceilometer-0" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.276813 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e94b5f67-2b59-4f2a-bc90-20a355ddb79f-run-httpd\") pod \"ceilometer-0\" (UID: \"e94b5f67-2b59-4f2a-bc90-20a355ddb79f\") " pod="openstack/ceilometer-0" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.277438 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e9f28d1-fbb1-4c58-9a9b-3439b902505a-kube-api-access-m67tq" (OuterVolumeSpecName: "kube-api-access-m67tq") pod "1e9f28d1-fbb1-4c58-9a9b-3439b902505a" (UID: "1e9f28d1-fbb1-4c58-9a9b-3439b902505a"). 
InnerVolumeSpecName "kube-api-access-m67tq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.278068 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e94b5f67-2b59-4f2a-bc90-20a355ddb79f-config-data\") pod \"ceilometer-0\" (UID: \"e94b5f67-2b59-4f2a-bc90-20a355ddb79f\") " pod="openstack/ceilometer-0" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.278590 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e94b5f67-2b59-4f2a-bc90-20a355ddb79f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e94b5f67-2b59-4f2a-bc90-20a355ddb79f\") " pod="openstack/ceilometer-0" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.281227 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e94b5f67-2b59-4f2a-bc90-20a355ddb79f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e94b5f67-2b59-4f2a-bc90-20a355ddb79f\") " pod="openstack/ceilometer-0" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.281677 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wk72v\" (UniqueName: \"kubernetes.io/projected/e94b5f67-2b59-4f2a-bc90-20a355ddb79f-kube-api-access-wk72v\") pod \"ceilometer-0\" (UID: \"e94b5f67-2b59-4f2a-bc90-20a355ddb79f\") " pod="openstack/ceilometer-0" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.283224 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e94b5f67-2b59-4f2a-bc90-20a355ddb79f-scripts\") pod \"ceilometer-0\" (UID: \"e94b5f67-2b59-4f2a-bc90-20a355ddb79f\") " pod="openstack/ceilometer-0" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.307957 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e9f28d1-fbb1-4c58-9a9b-3439b902505a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1e9f28d1-fbb1-4c58-9a9b-3439b902505a" (UID: "1e9f28d1-fbb1-4c58-9a9b-3439b902505a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.322102 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e9f28d1-fbb1-4c58-9a9b-3439b902505a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1e9f28d1-fbb1-4c58-9a9b-3439b902505a" (UID: "1e9f28d1-fbb1-4c58-9a9b-3439b902505a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.349708 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e9f28d1-fbb1-4c58-9a9b-3439b902505a-config-data" (OuterVolumeSpecName: "config-data") pod "1e9f28d1-fbb1-4c58-9a9b-3439b902505a" (UID: "1e9f28d1-fbb1-4c58-9a9b-3439b902505a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.370194 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e9f28d1-fbb1-4c58-9a9b-3439b902505a-logs\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.370239 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e9f28d1-fbb1-4c58-9a9b-3439b902505a-config-data\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.370253 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m67tq\" (UniqueName: \"kubernetes.io/projected/1e9f28d1-fbb1-4c58-9a9b-3439b902505a-kube-api-access-m67tq\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.370269 4743 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1e9f28d1-fbb1-4c58-9a9b-3439b902505a-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.370283 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e9f28d1-fbb1-4c58-9a9b-3439b902505a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.393309 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.406781 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.471547 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1e36bbf-2ba3-4a7f-8020-06772e388f95-scripts\") pod \"b1e36bbf-2ba3-4a7f-8020-06772e388f95\" (UID: \"b1e36bbf-2ba3-4a7f-8020-06772e388f95\") " Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.471709 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1e36bbf-2ba3-4a7f-8020-06772e388f95-combined-ca-bundle\") pod \"b1e36bbf-2ba3-4a7f-8020-06772e388f95\" (UID: \"b1e36bbf-2ba3-4a7f-8020-06772e388f95\") " Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.471779 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b1e36bbf-2ba3-4a7f-8020-06772e388f95-etc-machine-id\") pod \"b1e36bbf-2ba3-4a7f-8020-06772e388f95\" (UID: \"b1e36bbf-2ba3-4a7f-8020-06772e388f95\") " Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.471930 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1e36bbf-2ba3-4a7f-8020-06772e388f95-logs\") pod \"b1e36bbf-2ba3-4a7f-8020-06772e388f95\" (UID: \"b1e36bbf-2ba3-4a7f-8020-06772e388f95\") " Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.472015 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b1e36bbf-2ba3-4a7f-8020-06772e388f95-config-data-custom\") pod \"b1e36bbf-2ba3-4a7f-8020-06772e388f95\" (UID: \"b1e36bbf-2ba3-4a7f-8020-06772e388f95\") " Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.472040 4743 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvd67\" (UniqueName: \"kubernetes.io/projected/b1e36bbf-2ba3-4a7f-8020-06772e388f95-kube-api-access-xvd67\") pod \"b1e36bbf-2ba3-4a7f-8020-06772e388f95\" (UID: \"b1e36bbf-2ba3-4a7f-8020-06772e388f95\") " Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.472084 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1e36bbf-2ba3-4a7f-8020-06772e388f95-config-data\") pod \"b1e36bbf-2ba3-4a7f-8020-06772e388f95\" (UID: \"b1e36bbf-2ba3-4a7f-8020-06772e388f95\") " Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.473629 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1e36bbf-2ba3-4a7f-8020-06772e388f95-logs" (OuterVolumeSpecName: "logs") pod "b1e36bbf-2ba3-4a7f-8020-06772e388f95" (UID: "b1e36bbf-2ba3-4a7f-8020-06772e388f95"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.473690 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b1e36bbf-2ba3-4a7f-8020-06772e388f95-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "b1e36bbf-2ba3-4a7f-8020-06772e388f95" (UID: "b1e36bbf-2ba3-4a7f-8020-06772e388f95"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.485527 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1e36bbf-2ba3-4a7f-8020-06772e388f95-kube-api-access-xvd67" (OuterVolumeSpecName: "kube-api-access-xvd67") pod "b1e36bbf-2ba3-4a7f-8020-06772e388f95" (UID: "b1e36bbf-2ba3-4a7f-8020-06772e388f95"). InnerVolumeSpecName "kube-api-access-xvd67". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.493147 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1e36bbf-2ba3-4a7f-8020-06772e388f95-scripts" (OuterVolumeSpecName: "scripts") pod "b1e36bbf-2ba3-4a7f-8020-06772e388f95" (UID: "b1e36bbf-2ba3-4a7f-8020-06772e388f95"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.494638 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1e36bbf-2ba3-4a7f-8020-06772e388f95-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b1e36bbf-2ba3-4a7f-8020-06772e388f95" (UID: "b1e36bbf-2ba3-4a7f-8020-06772e388f95"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.536885 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1e36bbf-2ba3-4a7f-8020-06772e388f95-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b1e36bbf-2ba3-4a7f-8020-06772e388f95" (UID: "b1e36bbf-2ba3-4a7f-8020-06772e388f95"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.557067 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1e36bbf-2ba3-4a7f-8020-06772e388f95-config-data" (OuterVolumeSpecName: "config-data") pod "b1e36bbf-2ba3-4a7f-8020-06772e388f95" (UID: "b1e36bbf-2ba3-4a7f-8020-06772e388f95"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.574431 4743 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b1e36bbf-2ba3-4a7f-8020-06772e388f95-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.574459 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvd67\" (UniqueName: \"kubernetes.io/projected/b1e36bbf-2ba3-4a7f-8020-06772e388f95-kube-api-access-xvd67\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.574472 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1e36bbf-2ba3-4a7f-8020-06772e388f95-config-data\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.574480 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1e36bbf-2ba3-4a7f-8020-06772e388f95-scripts\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.574489 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1e36bbf-2ba3-4a7f-8020-06772e388f95-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.574497 4743 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b1e36bbf-2ba3-4a7f-8020-06772e388f95-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.574505 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1e36bbf-2ba3-4a7f-8020-06772e388f95-logs\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.762980 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="316dc631-a7ed-49db-9dad-305d246bf91a" path="/var/lib/kubelet/pods/316dc631-a7ed-49db-9dad-305d246bf91a/volumes" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.925634 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"60ee22a5-d036-40fb-8bed-36f10db21bb5","Type":"ContainerStarted","Data":"8c8376deff29240c341cc09dfdef769bdc935cf5168714e8ba37ae6d1b7d4cad"} Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.930907 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-59d596989d-xmgwp" event={"ID":"1e9f28d1-fbb1-4c58-9a9b-3439b902505a","Type":"ContainerDied","Data":"72b448bad62e3aebc5e2ef3477478535c7413d29992ed9b309560ebf787502b8"} Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.930966 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-59d596989d-xmgwp" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.930996 4743 scope.go:117] "RemoveContainer" containerID="2306c268bbd8a02a9ba0e62cd3ea6575a384efe9651061abb1bfb85a27b2c6b6" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.934422 4743 generic.go:334] "Generic (PLEG): container finished" podID="b1e36bbf-2ba3-4a7f-8020-06772e388f95" containerID="a185841d88341f1b1e2fb4bd216d2e6337a2b124e83161f3feea0a55e26f9aee" exitCode=0 Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.934449 4743 generic.go:334] "Generic (PLEG): container finished" podID="b1e36bbf-2ba3-4a7f-8020-06772e388f95" containerID="685b31e46f8a70ce564b6c465f2b810b3e3f57c7be0df4355661bcbf6b7ab8b5" exitCode=143 Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.934526 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.934554 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b1e36bbf-2ba3-4a7f-8020-06772e388f95","Type":"ContainerDied","Data":"a185841d88341f1b1e2fb4bd216d2e6337a2b124e83161f3feea0a55e26f9aee"} Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.934629 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b1e36bbf-2ba3-4a7f-8020-06772e388f95","Type":"ContainerDied","Data":"685b31e46f8a70ce564b6c465f2b810b3e3f57c7be0df4355661bcbf6b7ab8b5"} Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.934652 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b1e36bbf-2ba3-4a7f-8020-06772e388f95","Type":"ContainerDied","Data":"a3d5df7253923d072c6afbe6544afa5d36cfd863a5afcd13662b26872d5df83c"} Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.956189 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.239603754 podStartE2EDuration="4.956170079s" podCreationTimestamp="2026-01-22 14:04:55 +0000 UTC" firstStartedPulling="2026-01-22 14:04:56.835339973 +0000 UTC m=+1133.390383136" lastFinishedPulling="2026-01-22 14:04:57.551906298 +0000 UTC m=+1134.106949461" observedRunningTime="2026-01-22 14:04:59.954562847 +0000 UTC m=+1136.509606020" watchObservedRunningTime="2026-01-22 14:04:59.956170079 +0000 UTC m=+1136.511213242" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.965728 4743 scope.go:117] "RemoveContainer" containerID="1914d432dbf22013451fd6d692ceaf77a94a8b00223aa1c44f08e68d9446457e" Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.993464 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-59d596989d-xmgwp"] Jan 22 14:04:59 crc kubenswrapper[4743]: I0122 14:04:59.998152 4743 scope.go:117] "RemoveContainer" containerID="a185841d88341f1b1e2fb4bd216d2e6337a2b124e83161f3feea0a55e26f9aee" Jan 22 14:05:00 crc kubenswrapper[4743]: I0122 14:05:00.004814 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 22 14:05:00 crc kubenswrapper[4743]: I0122 14:05:00.019911 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-59d596989d-xmgwp"] Jan 22 14:05:00 crc kubenswrapper[4743]: I0122 14:05:00.033297 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Jan 22 14:05:00 crc kubenswrapper[4743]: I0122 14:05:00.045173 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/cinder-api-0"] Jan 22 14:05:00 crc kubenswrapper[4743]: I0122 14:05:00.057974 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Jan 22 14:05:00 crc kubenswrapper[4743]: E0122 14:05:00.058985 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1e36bbf-2ba3-4a7f-8020-06772e388f95" containerName="cinder-api" Jan 22 14:05:00 crc kubenswrapper[4743]: I0122 14:05:00.059010 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1e36bbf-2ba3-4a7f-8020-06772e388f95" containerName="cinder-api" Jan 22 14:05:00 crc kubenswrapper[4743]: E0122 14:05:00.059039 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1e36bbf-2ba3-4a7f-8020-06772e388f95" containerName="cinder-api-log" Jan 22 14:05:00 crc kubenswrapper[4743]: I0122 14:05:00.059047 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1e36bbf-2ba3-4a7f-8020-06772e388f95" containerName="cinder-api-log" Jan 22 14:05:00 crc kubenswrapper[4743]: E0122 14:05:00.059111 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e9f28d1-fbb1-4c58-9a9b-3439b902505a" containerName="barbican-api" Jan 22 14:05:00 crc kubenswrapper[4743]: I0122 14:05:00.059174 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e9f28d1-fbb1-4c58-9a9b-3439b902505a" containerName="barbican-api" Jan 22 14:05:00 crc kubenswrapper[4743]: E0122 14:05:00.059190 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e9f28d1-fbb1-4c58-9a9b-3439b902505a" containerName="barbican-api-log" Jan 22 14:05:00 crc kubenswrapper[4743]: I0122 14:05:00.059197 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e9f28d1-fbb1-4c58-9a9b-3439b902505a" containerName="barbican-api-log" Jan 22 14:05:00 crc kubenswrapper[4743]: I0122 14:05:00.059513 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e9f28d1-fbb1-4c58-9a9b-3439b902505a" containerName="barbican-api-log" Jan 22 14:05:00 crc kubenswrapper[4743]: I0122 14:05:00.059539 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e9f28d1-fbb1-4c58-9a9b-3439b902505a" containerName="barbican-api" Jan 22 14:05:00 crc kubenswrapper[4743]: I0122 14:05:00.059551 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1e36bbf-2ba3-4a7f-8020-06772e388f95" containerName="cinder-api" Jan 22 14:05:00 crc kubenswrapper[4743]: I0122 14:05:00.059560 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1e36bbf-2ba3-4a7f-8020-06772e388f95" containerName="cinder-api-log" Jan 22 14:05:00 crc kubenswrapper[4743]: I0122 14:05:00.061971 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 22 14:05:00 crc kubenswrapper[4743]: I0122 14:05:00.066176 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Jan 22 14:05:00 crc kubenswrapper[4743]: I0122 14:05:00.066400 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Jan 22 14:05:00 crc kubenswrapper[4743]: I0122 14:05:00.066872 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Jan 22 14:05:00 crc kubenswrapper[4743]: I0122 14:05:00.091112 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 22 14:05:00 crc kubenswrapper[4743]: I0122 14:05:00.093344 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5aa29c0-68de-446c-aafd-50080e4adb51-logs\") pod \"cinder-api-0\" (UID: \"d5aa29c0-68de-446c-aafd-50080e4adb51\") " pod="openstack/cinder-api-0" Jan 22 14:05:00 crc kubenswrapper[4743]: I0122 14:05:00.093529 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5aa29c0-68de-446c-aafd-50080e4adb51-public-tls-certs\") pod \"cinder-api-0\" (UID: \"d5aa29c0-68de-446c-aafd-50080e4adb51\") " pod="openstack/cinder-api-0" Jan 22 14:05:00 crc kubenswrapper[4743]: I0122 14:05:00.093654 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d5aa29c0-68de-446c-aafd-50080e4adb51-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d5aa29c0-68de-446c-aafd-50080e4adb51\") " pod="openstack/cinder-api-0" Jan 22 14:05:00 crc kubenswrapper[4743]: I0122 14:05:00.093767 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5aa29c0-68de-446c-aafd-50080e4adb51-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"d5aa29c0-68de-446c-aafd-50080e4adb51\") " pod="openstack/cinder-api-0" Jan 22 14:05:00 crc kubenswrapper[4743]: I0122 14:05:00.094002 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5aa29c0-68de-446c-aafd-50080e4adb51-scripts\") pod \"cinder-api-0\" (UID: \"d5aa29c0-68de-446c-aafd-50080e4adb51\") " pod="openstack/cinder-api-0" Jan 22 14:05:00 crc kubenswrapper[4743]: I0122 14:05:00.094746 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5aa29c0-68de-446c-aafd-50080e4adb51-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d5aa29c0-68de-446c-aafd-50080e4adb51\") " pod="openstack/cinder-api-0" Jan 22 14:05:00 crc kubenswrapper[4743]: I0122 14:05:00.094863 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-598d2\" (UniqueName: \"kubernetes.io/projected/d5aa29c0-68de-446c-aafd-50080e4adb51-kube-api-access-598d2\") pod \"cinder-api-0\" (UID: \"d5aa29c0-68de-446c-aafd-50080e4adb51\") " pod="openstack/cinder-api-0" Jan 22 14:05:00 crc kubenswrapper[4743]: I0122 14:05:00.096759 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/d5aa29c0-68de-446c-aafd-50080e4adb51-config-data-custom\") pod \"cinder-api-0\" (UID: \"d5aa29c0-68de-446c-aafd-50080e4adb51\") " pod="openstack/cinder-api-0" Jan 22 14:05:00 crc kubenswrapper[4743]: I0122 14:05:00.096926 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5aa29c0-68de-446c-aafd-50080e4adb51-config-data\") pod \"cinder-api-0\" (UID: \"d5aa29c0-68de-446c-aafd-50080e4adb51\") " pod="openstack/cinder-api-0" Jan 22 14:05:00 crc kubenswrapper[4743]: I0122 14:05:00.101576 4743 scope.go:117] "RemoveContainer" containerID="685b31e46f8a70ce564b6c465f2b810b3e3f57c7be0df4355661bcbf6b7ab8b5" Jan 22 14:05:00 crc kubenswrapper[4743]: I0122 14:05:00.144833 4743 scope.go:117] "RemoveContainer" containerID="a185841d88341f1b1e2fb4bd216d2e6337a2b124e83161f3feea0a55e26f9aee" Jan 22 14:05:00 crc kubenswrapper[4743]: E0122 14:05:00.145282 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a185841d88341f1b1e2fb4bd216d2e6337a2b124e83161f3feea0a55e26f9aee\": container with ID starting with a185841d88341f1b1e2fb4bd216d2e6337a2b124e83161f3feea0a55e26f9aee not found: ID does not exist" containerID="a185841d88341f1b1e2fb4bd216d2e6337a2b124e83161f3feea0a55e26f9aee" Jan 22 14:05:00 crc kubenswrapper[4743]: I0122 14:05:00.145315 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a185841d88341f1b1e2fb4bd216d2e6337a2b124e83161f3feea0a55e26f9aee"} err="failed to get container status \"a185841d88341f1b1e2fb4bd216d2e6337a2b124e83161f3feea0a55e26f9aee\": rpc error: code = NotFound desc = could not find container \"a185841d88341f1b1e2fb4bd216d2e6337a2b124e83161f3feea0a55e26f9aee\": container with ID starting with a185841d88341f1b1e2fb4bd216d2e6337a2b124e83161f3feea0a55e26f9aee not found: ID does not exist" Jan 22 14:05:00 crc kubenswrapper[4743]: I0122 14:05:00.145344 4743 scope.go:117] "RemoveContainer" containerID="685b31e46f8a70ce564b6c465f2b810b3e3f57c7be0df4355661bcbf6b7ab8b5" Jan 22 14:05:00 crc kubenswrapper[4743]: E0122 14:05:00.145585 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"685b31e46f8a70ce564b6c465f2b810b3e3f57c7be0df4355661bcbf6b7ab8b5\": container with ID starting with 685b31e46f8a70ce564b6c465f2b810b3e3f57c7be0df4355661bcbf6b7ab8b5 not found: ID does not exist" containerID="685b31e46f8a70ce564b6c465f2b810b3e3f57c7be0df4355661bcbf6b7ab8b5" Jan 22 14:05:00 crc kubenswrapper[4743]: I0122 14:05:00.145634 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"685b31e46f8a70ce564b6c465f2b810b3e3f57c7be0df4355661bcbf6b7ab8b5"} err="failed to get container status \"685b31e46f8a70ce564b6c465f2b810b3e3f57c7be0df4355661bcbf6b7ab8b5\": rpc error: code = NotFound desc = could not find container \"685b31e46f8a70ce564b6c465f2b810b3e3f57c7be0df4355661bcbf6b7ab8b5\": container with ID starting with 685b31e46f8a70ce564b6c465f2b810b3e3f57c7be0df4355661bcbf6b7ab8b5 not found: ID does not exist" Jan 22 14:05:00 crc kubenswrapper[4743]: I0122 14:05:00.145653 4743 scope.go:117] "RemoveContainer" containerID="a185841d88341f1b1e2fb4bd216d2e6337a2b124e83161f3feea0a55e26f9aee" Jan 22 14:05:00 crc kubenswrapper[4743]: I0122 14:05:00.145885 4743 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a185841d88341f1b1e2fb4bd216d2e6337a2b124e83161f3feea0a55e26f9aee"} err="failed to get container status \"a185841d88341f1b1e2fb4bd216d2e6337a2b124e83161f3feea0a55e26f9aee\": rpc error: code = NotFound desc = could not find container \"a185841d88341f1b1e2fb4bd216d2e6337a2b124e83161f3feea0a55e26f9aee\": container with ID starting with a185841d88341f1b1e2fb4bd216d2e6337a2b124e83161f3feea0a55e26f9aee not found: ID does not exist" Jan 22 14:05:00 crc kubenswrapper[4743]: I0122 14:05:00.145903 4743 scope.go:117] "RemoveContainer" containerID="685b31e46f8a70ce564b6c465f2b810b3e3f57c7be0df4355661bcbf6b7ab8b5" Jan 22 14:05:00 crc kubenswrapper[4743]: I0122 14:05:00.146112 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"685b31e46f8a70ce564b6c465f2b810b3e3f57c7be0df4355661bcbf6b7ab8b5"} err="failed to get container status \"685b31e46f8a70ce564b6c465f2b810b3e3f57c7be0df4355661bcbf6b7ab8b5\": rpc error: code = NotFound desc = could not find container \"685b31e46f8a70ce564b6c465f2b810b3e3f57c7be0df4355661bcbf6b7ab8b5\": container with ID starting with 685b31e46f8a70ce564b6c465f2b810b3e3f57c7be0df4355661bcbf6b7ab8b5 not found: ID does not exist" Jan 22 14:05:00 crc kubenswrapper[4743]: I0122 14:05:00.199036 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5aa29c0-68de-446c-aafd-50080e4adb51-public-tls-certs\") pod \"cinder-api-0\" (UID: \"d5aa29c0-68de-446c-aafd-50080e4adb51\") " pod="openstack/cinder-api-0" Jan 22 14:05:00 crc kubenswrapper[4743]: I0122 14:05:00.199084 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d5aa29c0-68de-446c-aafd-50080e4adb51-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d5aa29c0-68de-446c-aafd-50080e4adb51\") " pod="openstack/cinder-api-0" Jan 22 14:05:00 crc kubenswrapper[4743]: I0122 14:05:00.199118 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5aa29c0-68de-446c-aafd-50080e4adb51-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"d5aa29c0-68de-446c-aafd-50080e4adb51\") " pod="openstack/cinder-api-0" Jan 22 14:05:00 crc kubenswrapper[4743]: I0122 14:05:00.199171 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d5aa29c0-68de-446c-aafd-50080e4adb51-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d5aa29c0-68de-446c-aafd-50080e4adb51\") " pod="openstack/cinder-api-0" Jan 22 14:05:00 crc kubenswrapper[4743]: I0122 14:05:00.199178 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5aa29c0-68de-446c-aafd-50080e4adb51-scripts\") pod \"cinder-api-0\" (UID: \"d5aa29c0-68de-446c-aafd-50080e4adb51\") " pod="openstack/cinder-api-0" Jan 22 14:05:00 crc kubenswrapper[4743]: I0122 14:05:00.199294 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5aa29c0-68de-446c-aafd-50080e4adb51-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d5aa29c0-68de-446c-aafd-50080e4adb51\") " pod="openstack/cinder-api-0" Jan 22 14:05:00 crc kubenswrapper[4743]: I0122 14:05:00.199362 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-598d2\" (UniqueName: 
\"kubernetes.io/projected/d5aa29c0-68de-446c-aafd-50080e4adb51-kube-api-access-598d2\") pod \"cinder-api-0\" (UID: \"d5aa29c0-68de-446c-aafd-50080e4adb51\") " pod="openstack/cinder-api-0" Jan 22 14:05:00 crc kubenswrapper[4743]: I0122 14:05:00.199400 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d5aa29c0-68de-446c-aafd-50080e4adb51-config-data-custom\") pod \"cinder-api-0\" (UID: \"d5aa29c0-68de-446c-aafd-50080e4adb51\") " pod="openstack/cinder-api-0" Jan 22 14:05:00 crc kubenswrapper[4743]: I0122 14:05:00.199448 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5aa29c0-68de-446c-aafd-50080e4adb51-config-data\") pod \"cinder-api-0\" (UID: \"d5aa29c0-68de-446c-aafd-50080e4adb51\") " pod="openstack/cinder-api-0" Jan 22 14:05:00 crc kubenswrapper[4743]: I0122 14:05:00.199612 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5aa29c0-68de-446c-aafd-50080e4adb51-logs\") pod \"cinder-api-0\" (UID: \"d5aa29c0-68de-446c-aafd-50080e4adb51\") " pod="openstack/cinder-api-0" Jan 22 14:05:00 crc kubenswrapper[4743]: I0122 14:05:00.200667 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5aa29c0-68de-446c-aafd-50080e4adb51-logs\") pod \"cinder-api-0\" (UID: \"d5aa29c0-68de-446c-aafd-50080e4adb51\") " pod="openstack/cinder-api-0" Jan 22 14:05:00 crc kubenswrapper[4743]: I0122 14:05:00.203602 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5aa29c0-68de-446c-aafd-50080e4adb51-scripts\") pod \"cinder-api-0\" (UID: \"d5aa29c0-68de-446c-aafd-50080e4adb51\") " pod="openstack/cinder-api-0" Jan 22 14:05:00 crc kubenswrapper[4743]: I0122 14:05:00.204446 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5aa29c0-68de-446c-aafd-50080e4adb51-public-tls-certs\") pod \"cinder-api-0\" (UID: \"d5aa29c0-68de-446c-aafd-50080e4adb51\") " pod="openstack/cinder-api-0" Jan 22 14:05:00 crc kubenswrapper[4743]: I0122 14:05:00.204828 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5aa29c0-68de-446c-aafd-50080e4adb51-config-data\") pod \"cinder-api-0\" (UID: \"d5aa29c0-68de-446c-aafd-50080e4adb51\") " pod="openstack/cinder-api-0" Jan 22 14:05:00 crc kubenswrapper[4743]: I0122 14:05:00.204829 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d5aa29c0-68de-446c-aafd-50080e4adb51-config-data-custom\") pod \"cinder-api-0\" (UID: \"d5aa29c0-68de-446c-aafd-50080e4adb51\") " pod="openstack/cinder-api-0" Jan 22 14:05:00 crc kubenswrapper[4743]: I0122 14:05:00.205205 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5aa29c0-68de-446c-aafd-50080e4adb51-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d5aa29c0-68de-446c-aafd-50080e4adb51\") " pod="openstack/cinder-api-0" Jan 22 14:05:00 crc kubenswrapper[4743]: I0122 14:05:00.205351 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5aa29c0-68de-446c-aafd-50080e4adb51-internal-tls-certs\") pod 
\"cinder-api-0\" (UID: \"d5aa29c0-68de-446c-aafd-50080e4adb51\") " pod="openstack/cinder-api-0" Jan 22 14:05:00 crc kubenswrapper[4743]: I0122 14:05:00.223655 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-598d2\" (UniqueName: \"kubernetes.io/projected/d5aa29c0-68de-446c-aafd-50080e4adb51-kube-api-access-598d2\") pod \"cinder-api-0\" (UID: \"d5aa29c0-68de-446c-aafd-50080e4adb51\") " pod="openstack/cinder-api-0" Jan 22 14:05:00 crc kubenswrapper[4743]: I0122 14:05:00.404553 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 22 14:05:00 crc kubenswrapper[4743]: I0122 14:05:00.900128 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 22 14:05:00 crc kubenswrapper[4743]: W0122 14:05:00.902072 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5aa29c0_68de_446c_aafd_50080e4adb51.slice/crio-5d27e718f935b4563c30f9653b4dffadb6de9d9b00c19056e6fc698c26074efb WatchSource:0}: Error finding container 5d27e718f935b4563c30f9653b4dffadb6de9d9b00c19056e6fc698c26074efb: Status 404 returned error can't find the container with id 5d27e718f935b4563c30f9653b4dffadb6de9d9b00c19056e6fc698c26074efb Jan 22 14:05:00 crc kubenswrapper[4743]: I0122 14:05:00.950921 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e94b5f67-2b59-4f2a-bc90-20a355ddb79f","Type":"ContainerStarted","Data":"ee132547c0b6c54c83fb2577c65fda82adaed445f3841289117598de79cf80d0"} Jan 22 14:05:00 crc kubenswrapper[4743]: I0122 14:05:00.950959 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e94b5f67-2b59-4f2a-bc90-20a355ddb79f","Type":"ContainerStarted","Data":"040ba91cfda2cb68a89f8b1e4cdabbe00b1f0cda3639385325f38449a400175e"} Jan 22 14:05:00 crc kubenswrapper[4743]: I0122 14:05:00.954475 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d5aa29c0-68de-446c-aafd-50080e4adb51","Type":"ContainerStarted","Data":"5d27e718f935b4563c30f9653b4dffadb6de9d9b00c19056e6fc698c26074efb"} Jan 22 14:05:01 crc kubenswrapper[4743]: I0122 14:05:01.287468 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Jan 22 14:05:01 crc kubenswrapper[4743]: I0122 14:05:01.757751 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e9f28d1-fbb1-4c58-9a9b-3439b902505a" path="/var/lib/kubelet/pods/1e9f28d1-fbb1-4c58-9a9b-3439b902505a/volumes" Jan 22 14:05:01 crc kubenswrapper[4743]: I0122 14:05:01.758850 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1e36bbf-2ba3-4a7f-8020-06772e388f95" path="/var/lib/kubelet/pods/b1e36bbf-2ba3-4a7f-8020-06772e388f95/volumes" Jan 22 14:05:01 crc kubenswrapper[4743]: I0122 14:05:01.962941 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e94b5f67-2b59-4f2a-bc90-20a355ddb79f","Type":"ContainerStarted","Data":"8e297796679ee08d29bfef321f5c231e33f9052b8c1995bd3ca6d2987c284e6e"} Jan 22 14:05:01 crc kubenswrapper[4743]: I0122 14:05:01.966892 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d5aa29c0-68de-446c-aafd-50080e4adb51","Type":"ContainerStarted","Data":"fcbd7bfdeba3ccf2ee77cb4a21e7a07384c753b9d3089542f9190f8a99c7c2fd"} Jan 22 14:05:02 crc kubenswrapper[4743]: I0122 14:05:02.306544 4743 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7bd7ccdcfb-sx7wp" Jan 22 14:05:02 crc kubenswrapper[4743]: I0122 14:05:02.492274 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-b7fb54dc6-5q9jf" Jan 22 14:05:02 crc kubenswrapper[4743]: I0122 14:05:02.531008 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-999bfcdc8-ldzdp" Jan 22 14:05:02 crc kubenswrapper[4743]: I0122 14:05:02.576900 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-56bdc765bf-xnb2x"] Jan 22 14:05:02 crc kubenswrapper[4743]: I0122 14:05:02.577295 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-56bdc765bf-xnb2x" podUID="b8ee30f9-a6ed-4aa2-b834-facef3c284fe" containerName="neutron-api" containerID="cri-o://1ee883debc82e7cb2aaeee5362b21ea7a1cdd7d98662ff17df82b291aa89fe64" gracePeriod=30 Jan 22 14:05:02 crc kubenswrapper[4743]: I0122 14:05:02.577440 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-56bdc765bf-xnb2x" podUID="b8ee30f9-a6ed-4aa2-b834-facef3c284fe" containerName="neutron-httpd" containerID="cri-o://cbcaa6d5f04c83ed4dd21bb53079e388e02f2bff5fed9a9ac0baa4452e704d7b" gracePeriod=30 Jan 22 14:05:02 crc kubenswrapper[4743]: I0122 14:05:02.612864 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7dd566fb89-mgkw8"] Jan 22 14:05:02 crc kubenswrapper[4743]: I0122 14:05:02.614911 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7dd566fb89-mgkw8" Jan 22 14:05:02 crc kubenswrapper[4743]: I0122 14:05:02.619230 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7dd566fb89-mgkw8"] Jan 22 14:05:02 crc kubenswrapper[4743]: I0122 14:05:02.664730 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/36675c4f-99e7-4cfb-a9c4-22519e8e7d4c-public-tls-certs\") pod \"neutron-7dd566fb89-mgkw8\" (UID: \"36675c4f-99e7-4cfb-a9c4-22519e8e7d4c\") " pod="openstack/neutron-7dd566fb89-mgkw8" Jan 22 14:05:02 crc kubenswrapper[4743]: I0122 14:05:02.664806 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/36675c4f-99e7-4cfb-a9c4-22519e8e7d4c-internal-tls-certs\") pod \"neutron-7dd566fb89-mgkw8\" (UID: \"36675c4f-99e7-4cfb-a9c4-22519e8e7d4c\") " pod="openstack/neutron-7dd566fb89-mgkw8" Jan 22 14:05:02 crc kubenswrapper[4743]: I0122 14:05:02.665056 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/36675c4f-99e7-4cfb-a9c4-22519e8e7d4c-ovndb-tls-certs\") pod \"neutron-7dd566fb89-mgkw8\" (UID: \"36675c4f-99e7-4cfb-a9c4-22519e8e7d4c\") " pod="openstack/neutron-7dd566fb89-mgkw8" Jan 22 14:05:02 crc kubenswrapper[4743]: I0122 14:05:02.665098 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/36675c4f-99e7-4cfb-a9c4-22519e8e7d4c-httpd-config\") pod \"neutron-7dd566fb89-mgkw8\" (UID: \"36675c4f-99e7-4cfb-a9c4-22519e8e7d4c\") " pod="openstack/neutron-7dd566fb89-mgkw8" Jan 22 14:05:02 crc kubenswrapper[4743]: I0122 14:05:02.665115 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gb7w\" (UniqueName: \"kubernetes.io/projected/36675c4f-99e7-4cfb-a9c4-22519e8e7d4c-kube-api-access-7gb7w\") pod \"neutron-7dd566fb89-mgkw8\" (UID: \"36675c4f-99e7-4cfb-a9c4-22519e8e7d4c\") " pod="openstack/neutron-7dd566fb89-mgkw8" Jan 22 14:05:02 crc kubenswrapper[4743]: I0122 14:05:02.665596 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36675c4f-99e7-4cfb-a9c4-22519e8e7d4c-combined-ca-bundle\") pod \"neutron-7dd566fb89-mgkw8\" (UID: \"36675c4f-99e7-4cfb-a9c4-22519e8e7d4c\") " pod="openstack/neutron-7dd566fb89-mgkw8" Jan 22 14:05:02 crc kubenswrapper[4743]: I0122 14:05:02.665634 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/36675c4f-99e7-4cfb-a9c4-22519e8e7d4c-config\") pod \"neutron-7dd566fb89-mgkw8\" (UID: \"36675c4f-99e7-4cfb-a9c4-22519e8e7d4c\") " pod="openstack/neutron-7dd566fb89-mgkw8" Jan 22 14:05:02 crc kubenswrapper[4743]: I0122 14:05:02.679745 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-56bdc765bf-xnb2x" podUID="b8ee30f9-a6ed-4aa2-b834-facef3c284fe" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.154:9696/\": read tcp 10.217.0.2:34326->10.217.0.154:9696: read: connection reset by peer" Jan 22 14:05:02 crc kubenswrapper[4743]: I0122 14:05:02.767858 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/36675c4f-99e7-4cfb-a9c4-22519e8e7d4c-config\") pod \"neutron-7dd566fb89-mgkw8\" (UID: \"36675c4f-99e7-4cfb-a9c4-22519e8e7d4c\") " pod="openstack/neutron-7dd566fb89-mgkw8" Jan 22 14:05:02 crc kubenswrapper[4743]: I0122 14:05:02.769127 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/36675c4f-99e7-4cfb-a9c4-22519e8e7d4c-public-tls-certs\") pod \"neutron-7dd566fb89-mgkw8\" (UID: \"36675c4f-99e7-4cfb-a9c4-22519e8e7d4c\") " pod="openstack/neutron-7dd566fb89-mgkw8" Jan 22 14:05:02 crc kubenswrapper[4743]: I0122 14:05:02.769198 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/36675c4f-99e7-4cfb-a9c4-22519e8e7d4c-internal-tls-certs\") pod \"neutron-7dd566fb89-mgkw8\" (UID: \"36675c4f-99e7-4cfb-a9c4-22519e8e7d4c\") " pod="openstack/neutron-7dd566fb89-mgkw8" Jan 22 14:05:02 crc kubenswrapper[4743]: I0122 14:05:02.769223 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/36675c4f-99e7-4cfb-a9c4-22519e8e7d4c-ovndb-tls-certs\") pod \"neutron-7dd566fb89-mgkw8\" (UID: \"36675c4f-99e7-4cfb-a9c4-22519e8e7d4c\") " pod="openstack/neutron-7dd566fb89-mgkw8" Jan 22 14:05:02 crc kubenswrapper[4743]: I0122 14:05:02.769258 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/36675c4f-99e7-4cfb-a9c4-22519e8e7d4c-httpd-config\") pod \"neutron-7dd566fb89-mgkw8\" (UID: \"36675c4f-99e7-4cfb-a9c4-22519e8e7d4c\") " pod="openstack/neutron-7dd566fb89-mgkw8" Jan 22 14:05:02 crc kubenswrapper[4743]: I0122 14:05:02.769278 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gb7w\" (UniqueName: 
\"kubernetes.io/projected/36675c4f-99e7-4cfb-a9c4-22519e8e7d4c-kube-api-access-7gb7w\") pod \"neutron-7dd566fb89-mgkw8\" (UID: \"36675c4f-99e7-4cfb-a9c4-22519e8e7d4c\") " pod="openstack/neutron-7dd566fb89-mgkw8" Jan 22 14:05:02 crc kubenswrapper[4743]: I0122 14:05:02.769483 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36675c4f-99e7-4cfb-a9c4-22519e8e7d4c-combined-ca-bundle\") pod \"neutron-7dd566fb89-mgkw8\" (UID: \"36675c4f-99e7-4cfb-a9c4-22519e8e7d4c\") " pod="openstack/neutron-7dd566fb89-mgkw8" Jan 22 14:05:02 crc kubenswrapper[4743]: I0122 14:05:02.774826 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/36675c4f-99e7-4cfb-a9c4-22519e8e7d4c-config\") pod \"neutron-7dd566fb89-mgkw8\" (UID: \"36675c4f-99e7-4cfb-a9c4-22519e8e7d4c\") " pod="openstack/neutron-7dd566fb89-mgkw8" Jan 22 14:05:02 crc kubenswrapper[4743]: I0122 14:05:02.775103 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36675c4f-99e7-4cfb-a9c4-22519e8e7d4c-combined-ca-bundle\") pod \"neutron-7dd566fb89-mgkw8\" (UID: \"36675c4f-99e7-4cfb-a9c4-22519e8e7d4c\") " pod="openstack/neutron-7dd566fb89-mgkw8" Jan 22 14:05:02 crc kubenswrapper[4743]: I0122 14:05:02.778685 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/36675c4f-99e7-4cfb-a9c4-22519e8e7d4c-ovndb-tls-certs\") pod \"neutron-7dd566fb89-mgkw8\" (UID: \"36675c4f-99e7-4cfb-a9c4-22519e8e7d4c\") " pod="openstack/neutron-7dd566fb89-mgkw8" Jan 22 14:05:02 crc kubenswrapper[4743]: I0122 14:05:02.778849 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/36675c4f-99e7-4cfb-a9c4-22519e8e7d4c-httpd-config\") pod \"neutron-7dd566fb89-mgkw8\" (UID: \"36675c4f-99e7-4cfb-a9c4-22519e8e7d4c\") " pod="openstack/neutron-7dd566fb89-mgkw8" Jan 22 14:05:02 crc kubenswrapper[4743]: I0122 14:05:02.782465 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/36675c4f-99e7-4cfb-a9c4-22519e8e7d4c-internal-tls-certs\") pod \"neutron-7dd566fb89-mgkw8\" (UID: \"36675c4f-99e7-4cfb-a9c4-22519e8e7d4c\") " pod="openstack/neutron-7dd566fb89-mgkw8" Jan 22 14:05:02 crc kubenswrapper[4743]: I0122 14:05:02.784604 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/36675c4f-99e7-4cfb-a9c4-22519e8e7d4c-public-tls-certs\") pod \"neutron-7dd566fb89-mgkw8\" (UID: \"36675c4f-99e7-4cfb-a9c4-22519e8e7d4c\") " pod="openstack/neutron-7dd566fb89-mgkw8" Jan 22 14:05:02 crc kubenswrapper[4743]: I0122 14:05:02.795422 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gb7w\" (UniqueName: \"kubernetes.io/projected/36675c4f-99e7-4cfb-a9c4-22519e8e7d4c-kube-api-access-7gb7w\") pod \"neutron-7dd566fb89-mgkw8\" (UID: \"36675c4f-99e7-4cfb-a9c4-22519e8e7d4c\") " pod="openstack/neutron-7dd566fb89-mgkw8" Jan 22 14:05:02 crc kubenswrapper[4743]: I0122 14:05:02.937064 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7dd566fb89-mgkw8" Jan 22 14:05:02 crc kubenswrapper[4743]: I0122 14:05:02.979584 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d5aa29c0-68de-446c-aafd-50080e4adb51","Type":"ContainerStarted","Data":"eb9005d9904509297de5fa4cddd9708735acaf31e2384c766d0695e4fabca894"} Jan 22 14:05:02 crc kubenswrapper[4743]: I0122 14:05:02.980733 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Jan 22 14:05:02 crc kubenswrapper[4743]: I0122 14:05:02.982250 4743 generic.go:334] "Generic (PLEG): container finished" podID="b8ee30f9-a6ed-4aa2-b834-facef3c284fe" containerID="cbcaa6d5f04c83ed4dd21bb53079e388e02f2bff5fed9a9ac0baa4452e704d7b" exitCode=0 Jan 22 14:05:02 crc kubenswrapper[4743]: I0122 14:05:02.982294 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56bdc765bf-xnb2x" event={"ID":"b8ee30f9-a6ed-4aa2-b834-facef3c284fe","Type":"ContainerDied","Data":"cbcaa6d5f04c83ed4dd21bb53079e388e02f2bff5fed9a9ac0baa4452e704d7b"} Jan 22 14:05:02 crc kubenswrapper[4743]: I0122 14:05:02.985584 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e94b5f67-2b59-4f2a-bc90-20a355ddb79f","Type":"ContainerStarted","Data":"00e0f26654334c4922acea2c86a35f807f4e288663ff54601bd90009e725164f"} Jan 22 14:05:03 crc kubenswrapper[4743]: I0122 14:05:03.002395 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.002374079 podStartE2EDuration="3.002374079s" podCreationTimestamp="2026-01-22 14:05:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:05:02.999322281 +0000 UTC m=+1139.554365444" watchObservedRunningTime="2026-01-22 14:05:03.002374079 +0000 UTC m=+1139.557417252" Jan 22 14:05:03 crc kubenswrapper[4743]: I0122 14:05:03.581897 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7dd566fb89-mgkw8"] Jan 22 14:05:03 crc kubenswrapper[4743]: W0122 14:05:03.587464 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36675c4f_99e7_4cfb_a9c4_22519e8e7d4c.slice/crio-467e63007056996a6d1095b2410bc588ed2dad668b90eb944326cc231c33d0ac WatchSource:0}: Error finding container 467e63007056996a6d1095b2410bc588ed2dad668b90eb944326cc231c33d0ac: Status 404 returned error can't find the container with id 467e63007056996a6d1095b2410bc588ed2dad668b90eb944326cc231c33d0ac Jan 22 14:05:03 crc kubenswrapper[4743]: I0122 14:05:03.885815 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-56bdc765bf-xnb2x" podUID="b8ee30f9-a6ed-4aa2-b834-facef3c284fe" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.154:9696/\": dial tcp 10.217.0.154:9696: connect: connection refused" Jan 22 14:05:04 crc kubenswrapper[4743]: I0122 14:05:04.003241 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e94b5f67-2b59-4f2a-bc90-20a355ddb79f","Type":"ContainerStarted","Data":"945789873f0895ce51715990de02fe0f7f1ae149b4318cc184e66adf86945b54"} Jan 22 14:05:04 crc kubenswrapper[4743]: I0122 14:05:04.004455 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 22 14:05:04 crc kubenswrapper[4743]: I0122 14:05:04.005251 4743 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/neutron-7dd566fb89-mgkw8" event={"ID":"36675c4f-99e7-4cfb-a9c4-22519e8e7d4c","Type":"ContainerStarted","Data":"c6ca8d2ba4e1a301192508f16398364fc2406e679bc8f3ea1fe0adde399fe933"} Jan 22 14:05:04 crc kubenswrapper[4743]: I0122 14:05:04.005335 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7dd566fb89-mgkw8" event={"ID":"36675c4f-99e7-4cfb-a9c4-22519e8e7d4c","Type":"ContainerStarted","Data":"467e63007056996a6d1095b2410bc588ed2dad668b90eb944326cc231c33d0ac"} Jan 22 14:05:04 crc kubenswrapper[4743]: I0122 14:05:04.035992 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.691369594 podStartE2EDuration="6.035971858s" podCreationTimestamp="2026-01-22 14:04:58 +0000 UTC" firstStartedPulling="2026-01-22 14:05:00.024706656 +0000 UTC m=+1136.579749819" lastFinishedPulling="2026-01-22 14:05:03.3693089 +0000 UTC m=+1139.924352083" observedRunningTime="2026-01-22 14:05:04.023312562 +0000 UTC m=+1140.578355715" watchObservedRunningTime="2026-01-22 14:05:04.035971858 +0000 UTC m=+1140.591015021" Jan 22 14:05:04 crc kubenswrapper[4743]: I0122 14:05:04.769399 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-999bfcdc8-ldzdp" Jan 22 14:05:04 crc kubenswrapper[4743]: I0122 14:05:04.901033 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-b7fb54dc6-5q9jf" Jan 22 14:05:04 crc kubenswrapper[4743]: I0122 14:05:04.968890 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-999bfcdc8-ldzdp"] Jan 22 14:05:05 crc kubenswrapper[4743]: I0122 14:05:05.019014 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7dd566fb89-mgkw8" event={"ID":"36675c4f-99e7-4cfb-a9c4-22519e8e7d4c","Type":"ContainerStarted","Data":"9edf49409f96f73334e6bedf7d8c5259557bfa998fa290b3826c8f3a5c1460c9"} Jan 22 14:05:05 crc kubenswrapper[4743]: I0122 14:05:05.019128 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-999bfcdc8-ldzdp" podUID="dff52751-78f1-4c39-aa95-5d74a246151e" containerName="horizon-log" containerID="cri-o://ac18ac033baaefe887d069af8ebe8ff96c5cd0940424da66bfb23d7a56c5a2f7" gracePeriod=30 Jan 22 14:05:05 crc kubenswrapper[4743]: I0122 14:05:05.019218 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-999bfcdc8-ldzdp" podUID="dff52751-78f1-4c39-aa95-5d74a246151e" containerName="horizon" containerID="cri-o://63e33f82bc66858f7646d8d52106929147dca59a06e0ccf6f7838a6c4813eb9f" gracePeriod=30 Jan 22 14:05:05 crc kubenswrapper[4743]: I0122 14:05:05.019623 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7dd566fb89-mgkw8" Jan 22 14:05:05 crc kubenswrapper[4743]: I0122 14:05:05.059739 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7dd566fb89-mgkw8" podStartSLOduration=3.059715644 podStartE2EDuration="3.059715644s" podCreationTimestamp="2026-01-22 14:05:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:05:05.055596957 +0000 UTC m=+1141.610640120" watchObservedRunningTime="2026-01-22 14:05:05.059715644 +0000 UTC m=+1141.614758807" Jan 22 14:05:06 crc kubenswrapper[4743]: I0122 14:05:06.345932 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/dnsmasq-dns-5c9776ccc5-wr2mc" Jan 22 14:05:06 crc kubenswrapper[4743]: I0122 14:05:06.397188 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-8kx6w"] Jan 22 14:05:06 crc kubenswrapper[4743]: I0122 14:05:06.397460 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-85ff748b95-8kx6w" podUID="ddd12850-b0cc-4119-9ba4-bf5a893f41a7" containerName="dnsmasq-dns" containerID="cri-o://c02664d16f654a79c5e359a21022d4bccac4eb6257f210bf0f74ef7041f53bef" gracePeriod=10 Jan 22 14:05:06 crc kubenswrapper[4743]: I0122 14:05:06.566745 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Jan 22 14:05:06 crc kubenswrapper[4743]: I0122 14:05:06.615801 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 22 14:05:06 crc kubenswrapper[4743]: I0122 14:05:06.967841 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-8kx6w" Jan 22 14:05:07 crc kubenswrapper[4743]: I0122 14:05:07.039054 4743 generic.go:334] "Generic (PLEG): container finished" podID="ddd12850-b0cc-4119-9ba4-bf5a893f41a7" containerID="c02664d16f654a79c5e359a21022d4bccac4eb6257f210bf0f74ef7041f53bef" exitCode=0 Jan 22 14:05:07 crc kubenswrapper[4743]: I0122 14:05:07.039178 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-8kx6w" event={"ID":"ddd12850-b0cc-4119-9ba4-bf5a893f41a7","Type":"ContainerDied","Data":"c02664d16f654a79c5e359a21022d4bccac4eb6257f210bf0f74ef7041f53bef"} Jan 22 14:05:07 crc kubenswrapper[4743]: I0122 14:05:07.039214 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-8kx6w" event={"ID":"ddd12850-b0cc-4119-9ba4-bf5a893f41a7","Type":"ContainerDied","Data":"7f7cb657e0f6bf06518eaa7a3ca82447446c5bf67ab8fdf1eec55c57045171a1"} Jan 22 14:05:07 crc kubenswrapper[4743]: I0122 14:05:07.039231 4743 scope.go:117] "RemoveContainer" containerID="c02664d16f654a79c5e359a21022d4bccac4eb6257f210bf0f74ef7041f53bef" Jan 22 14:05:07 crc kubenswrapper[4743]: I0122 14:05:07.039268 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="60ee22a5-d036-40fb-8bed-36f10db21bb5" containerName="cinder-scheduler" containerID="cri-o://13f86a89fb0a36d4cffccf06926cc17c5ce4bdd5eb06ce972c334d3f6e42d40d" gracePeriod=30 Jan 22 14:05:07 crc kubenswrapper[4743]: I0122 14:05:07.039523 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-8kx6w" Jan 22 14:05:07 crc kubenswrapper[4743]: I0122 14:05:07.039507 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="60ee22a5-d036-40fb-8bed-36f10db21bb5" containerName="probe" containerID="cri-o://8c8376deff29240c341cc09dfdef769bdc935cf5168714e8ba37ae6d1b7d4cad" gracePeriod=30 Jan 22 14:05:07 crc kubenswrapper[4743]: I0122 14:05:07.081911 4743 scope.go:117] "RemoveContainer" containerID="d06cdf7eee3fe2827c4473a8c5402be79592c9f379b38a5d187088d4468b107f" Jan 22 14:05:07 crc kubenswrapper[4743]: I0122 14:05:07.085657 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddd12850-b0cc-4119-9ba4-bf5a893f41a7-config\") pod \"ddd12850-b0cc-4119-9ba4-bf5a893f41a7\" (UID: \"ddd12850-b0cc-4119-9ba4-bf5a893f41a7\") " Jan 22 14:05:07 crc kubenswrapper[4743]: I0122 14:05:07.085733 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ddd12850-b0cc-4119-9ba4-bf5a893f41a7-ovsdbserver-nb\") pod \"ddd12850-b0cc-4119-9ba4-bf5a893f41a7\" (UID: \"ddd12850-b0cc-4119-9ba4-bf5a893f41a7\") " Jan 22 14:05:07 crc kubenswrapper[4743]: I0122 14:05:07.085856 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ddd12850-b0cc-4119-9ba4-bf5a893f41a7-dns-swift-storage-0\") pod \"ddd12850-b0cc-4119-9ba4-bf5a893f41a7\" (UID: \"ddd12850-b0cc-4119-9ba4-bf5a893f41a7\") " Jan 22 14:05:07 crc kubenswrapper[4743]: I0122 14:05:07.085926 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ddd12850-b0cc-4119-9ba4-bf5a893f41a7-dns-svc\") pod \"ddd12850-b0cc-4119-9ba4-bf5a893f41a7\" (UID: \"ddd12850-b0cc-4119-9ba4-bf5a893f41a7\") " Jan 22 14:05:07 crc kubenswrapper[4743]: I0122 14:05:07.086006 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkdnx\" (UniqueName: \"kubernetes.io/projected/ddd12850-b0cc-4119-9ba4-bf5a893f41a7-kube-api-access-dkdnx\") pod \"ddd12850-b0cc-4119-9ba4-bf5a893f41a7\" (UID: \"ddd12850-b0cc-4119-9ba4-bf5a893f41a7\") " Jan 22 14:05:07 crc kubenswrapper[4743]: I0122 14:05:07.086043 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ddd12850-b0cc-4119-9ba4-bf5a893f41a7-ovsdbserver-sb\") pod \"ddd12850-b0cc-4119-9ba4-bf5a893f41a7\" (UID: \"ddd12850-b0cc-4119-9ba4-bf5a893f41a7\") " Jan 22 14:05:07 crc kubenswrapper[4743]: I0122 14:05:07.096085 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddd12850-b0cc-4119-9ba4-bf5a893f41a7-kube-api-access-dkdnx" (OuterVolumeSpecName: "kube-api-access-dkdnx") pod "ddd12850-b0cc-4119-9ba4-bf5a893f41a7" (UID: "ddd12850-b0cc-4119-9ba4-bf5a893f41a7"). InnerVolumeSpecName "kube-api-access-dkdnx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:05:07 crc kubenswrapper[4743]: I0122 14:05:07.141552 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddd12850-b0cc-4119-9ba4-bf5a893f41a7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ddd12850-b0cc-4119-9ba4-bf5a893f41a7" (UID: "ddd12850-b0cc-4119-9ba4-bf5a893f41a7"). 
InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:05:07 crc kubenswrapper[4743]: I0122 14:05:07.144010 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddd12850-b0cc-4119-9ba4-bf5a893f41a7-config" (OuterVolumeSpecName: "config") pod "ddd12850-b0cc-4119-9ba4-bf5a893f41a7" (UID: "ddd12850-b0cc-4119-9ba4-bf5a893f41a7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:05:07 crc kubenswrapper[4743]: I0122 14:05:07.144478 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddd12850-b0cc-4119-9ba4-bf5a893f41a7-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ddd12850-b0cc-4119-9ba4-bf5a893f41a7" (UID: "ddd12850-b0cc-4119-9ba4-bf5a893f41a7"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:05:07 crc kubenswrapper[4743]: I0122 14:05:07.154086 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddd12850-b0cc-4119-9ba4-bf5a893f41a7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ddd12850-b0cc-4119-9ba4-bf5a893f41a7" (UID: "ddd12850-b0cc-4119-9ba4-bf5a893f41a7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:05:07 crc kubenswrapper[4743]: I0122 14:05:07.161480 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddd12850-b0cc-4119-9ba4-bf5a893f41a7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ddd12850-b0cc-4119-9ba4-bf5a893f41a7" (UID: "ddd12850-b0cc-4119-9ba4-bf5a893f41a7"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:05:07 crc kubenswrapper[4743]: I0122 14:05:07.188116 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkdnx\" (UniqueName: \"kubernetes.io/projected/ddd12850-b0cc-4119-9ba4-bf5a893f41a7-kube-api-access-dkdnx\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:07 crc kubenswrapper[4743]: I0122 14:05:07.188156 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ddd12850-b0cc-4119-9ba4-bf5a893f41a7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:07 crc kubenswrapper[4743]: I0122 14:05:07.188169 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddd12850-b0cc-4119-9ba4-bf5a893f41a7-config\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:07 crc kubenswrapper[4743]: I0122 14:05:07.188181 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ddd12850-b0cc-4119-9ba4-bf5a893f41a7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:07 crc kubenswrapper[4743]: I0122 14:05:07.188191 4743 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ddd12850-b0cc-4119-9ba4-bf5a893f41a7-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:07 crc kubenswrapper[4743]: I0122 14:05:07.188202 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ddd12850-b0cc-4119-9ba4-bf5a893f41a7-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:07 crc kubenswrapper[4743]: I0122 14:05:07.217680 4743 scope.go:117] "RemoveContainer" 
containerID="c02664d16f654a79c5e359a21022d4bccac4eb6257f210bf0f74ef7041f53bef" Jan 22 14:05:07 crc kubenswrapper[4743]: E0122 14:05:07.218297 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c02664d16f654a79c5e359a21022d4bccac4eb6257f210bf0f74ef7041f53bef\": container with ID starting with c02664d16f654a79c5e359a21022d4bccac4eb6257f210bf0f74ef7041f53bef not found: ID does not exist" containerID="c02664d16f654a79c5e359a21022d4bccac4eb6257f210bf0f74ef7041f53bef" Jan 22 14:05:07 crc kubenswrapper[4743]: I0122 14:05:07.218361 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c02664d16f654a79c5e359a21022d4bccac4eb6257f210bf0f74ef7041f53bef"} err="failed to get container status \"c02664d16f654a79c5e359a21022d4bccac4eb6257f210bf0f74ef7041f53bef\": rpc error: code = NotFound desc = could not find container \"c02664d16f654a79c5e359a21022d4bccac4eb6257f210bf0f74ef7041f53bef\": container with ID starting with c02664d16f654a79c5e359a21022d4bccac4eb6257f210bf0f74ef7041f53bef not found: ID does not exist" Jan 22 14:05:07 crc kubenswrapper[4743]: I0122 14:05:07.218389 4743 scope.go:117] "RemoveContainer" containerID="d06cdf7eee3fe2827c4473a8c5402be79592c9f379b38a5d187088d4468b107f" Jan 22 14:05:07 crc kubenswrapper[4743]: E0122 14:05:07.218771 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d06cdf7eee3fe2827c4473a8c5402be79592c9f379b38a5d187088d4468b107f\": container with ID starting with d06cdf7eee3fe2827c4473a8c5402be79592c9f379b38a5d187088d4468b107f not found: ID does not exist" containerID="d06cdf7eee3fe2827c4473a8c5402be79592c9f379b38a5d187088d4468b107f" Jan 22 14:05:07 crc kubenswrapper[4743]: I0122 14:05:07.218824 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d06cdf7eee3fe2827c4473a8c5402be79592c9f379b38a5d187088d4468b107f"} err="failed to get container status \"d06cdf7eee3fe2827c4473a8c5402be79592c9f379b38a5d187088d4468b107f\": rpc error: code = NotFound desc = could not find container \"d06cdf7eee3fe2827c4473a8c5402be79592c9f379b38a5d187088d4468b107f\": container with ID starting with d06cdf7eee3fe2827c4473a8c5402be79592c9f379b38a5d187088d4468b107f not found: ID does not exist" Jan 22 14:05:07 crc kubenswrapper[4743]: I0122 14:05:07.377707 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-8kx6w"] Jan 22 14:05:07 crc kubenswrapper[4743]: I0122 14:05:07.388862 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-8kx6w"] Jan 22 14:05:07 crc kubenswrapper[4743]: I0122 14:05:07.759016 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddd12850-b0cc-4119-9ba4-bf5a893f41a7" path="/var/lib/kubelet/pods/ddd12850-b0cc-4119-9ba4-bf5a893f41a7/volumes" Jan 22 14:05:08 crc kubenswrapper[4743]: I0122 14:05:08.054074 4743 generic.go:334] "Generic (PLEG): container finished" podID="60ee22a5-d036-40fb-8bed-36f10db21bb5" containerID="8c8376deff29240c341cc09dfdef769bdc935cf5168714e8ba37ae6d1b7d4cad" exitCode=0 Jan 22 14:05:08 crc kubenswrapper[4743]: I0122 14:05:08.054419 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"60ee22a5-d036-40fb-8bed-36f10db21bb5","Type":"ContainerDied","Data":"8c8376deff29240c341cc09dfdef769bdc935cf5168714e8ba37ae6d1b7d4cad"} Jan 22 14:05:08 crc kubenswrapper[4743]: 
I0122 14:05:08.056759 4743 generic.go:334] "Generic (PLEG): container finished" podID="b8ee30f9-a6ed-4aa2-b834-facef3c284fe" containerID="1ee883debc82e7cb2aaeee5362b21ea7a1cdd7d98662ff17df82b291aa89fe64" exitCode=0 Jan 22 14:05:08 crc kubenswrapper[4743]: I0122 14:05:08.056837 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56bdc765bf-xnb2x" event={"ID":"b8ee30f9-a6ed-4aa2-b834-facef3c284fe","Type":"ContainerDied","Data":"1ee883debc82e7cb2aaeee5362b21ea7a1cdd7d98662ff17df82b291aa89fe64"} Jan 22 14:05:08 crc kubenswrapper[4743]: I0122 14:05:08.397260 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-56bdc765bf-xnb2x" Jan 22 14:05:08 crc kubenswrapper[4743]: I0122 14:05:08.515093 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8bsg\" (UniqueName: \"kubernetes.io/projected/b8ee30f9-a6ed-4aa2-b834-facef3c284fe-kube-api-access-d8bsg\") pod \"b8ee30f9-a6ed-4aa2-b834-facef3c284fe\" (UID: \"b8ee30f9-a6ed-4aa2-b834-facef3c284fe\") " Jan 22 14:05:08 crc kubenswrapper[4743]: I0122 14:05:08.515182 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b8ee30f9-a6ed-4aa2-b834-facef3c284fe-httpd-config\") pod \"b8ee30f9-a6ed-4aa2-b834-facef3c284fe\" (UID: \"b8ee30f9-a6ed-4aa2-b834-facef3c284fe\") " Jan 22 14:05:08 crc kubenswrapper[4743]: I0122 14:05:08.515325 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8ee30f9-a6ed-4aa2-b834-facef3c284fe-internal-tls-certs\") pod \"b8ee30f9-a6ed-4aa2-b834-facef3c284fe\" (UID: \"b8ee30f9-a6ed-4aa2-b834-facef3c284fe\") " Jan 22 14:05:08 crc kubenswrapper[4743]: I0122 14:05:08.515392 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8ee30f9-a6ed-4aa2-b834-facef3c284fe-public-tls-certs\") pod \"b8ee30f9-a6ed-4aa2-b834-facef3c284fe\" (UID: \"b8ee30f9-a6ed-4aa2-b834-facef3c284fe\") " Jan 22 14:05:08 crc kubenswrapper[4743]: I0122 14:05:08.515428 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8ee30f9-a6ed-4aa2-b834-facef3c284fe-combined-ca-bundle\") pod \"b8ee30f9-a6ed-4aa2-b834-facef3c284fe\" (UID: \"b8ee30f9-a6ed-4aa2-b834-facef3c284fe\") " Jan 22 14:05:08 crc kubenswrapper[4743]: I0122 14:05:08.515504 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b8ee30f9-a6ed-4aa2-b834-facef3c284fe-config\") pod \"b8ee30f9-a6ed-4aa2-b834-facef3c284fe\" (UID: \"b8ee30f9-a6ed-4aa2-b834-facef3c284fe\") " Jan 22 14:05:08 crc kubenswrapper[4743]: I0122 14:05:08.515573 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8ee30f9-a6ed-4aa2-b834-facef3c284fe-ovndb-tls-certs\") pod \"b8ee30f9-a6ed-4aa2-b834-facef3c284fe\" (UID: \"b8ee30f9-a6ed-4aa2-b834-facef3c284fe\") " Jan 22 14:05:08 crc kubenswrapper[4743]: I0122 14:05:08.534923 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8ee30f9-a6ed-4aa2-b834-facef3c284fe-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "b8ee30f9-a6ed-4aa2-b834-facef3c284fe" (UID: "b8ee30f9-a6ed-4aa2-b834-facef3c284fe"). 
InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:05:08 crc kubenswrapper[4743]: I0122 14:05:08.543845 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8ee30f9-a6ed-4aa2-b834-facef3c284fe-kube-api-access-d8bsg" (OuterVolumeSpecName: "kube-api-access-d8bsg") pod "b8ee30f9-a6ed-4aa2-b834-facef3c284fe" (UID: "b8ee30f9-a6ed-4aa2-b834-facef3c284fe"). InnerVolumeSpecName "kube-api-access-d8bsg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:05:08 crc kubenswrapper[4743]: I0122 14:05:08.608935 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8ee30f9-a6ed-4aa2-b834-facef3c284fe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b8ee30f9-a6ed-4aa2-b834-facef3c284fe" (UID: "b8ee30f9-a6ed-4aa2-b834-facef3c284fe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:05:08 crc kubenswrapper[4743]: I0122 14:05:08.609062 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8ee30f9-a6ed-4aa2-b834-facef3c284fe-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b8ee30f9-a6ed-4aa2-b834-facef3c284fe" (UID: "b8ee30f9-a6ed-4aa2-b834-facef3c284fe"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:05:08 crc kubenswrapper[4743]: I0122 14:05:08.618144 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8bsg\" (UniqueName: \"kubernetes.io/projected/b8ee30f9-a6ed-4aa2-b834-facef3c284fe-kube-api-access-d8bsg\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:08 crc kubenswrapper[4743]: I0122 14:05:08.618185 4743 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b8ee30f9-a6ed-4aa2-b834-facef3c284fe-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:08 crc kubenswrapper[4743]: I0122 14:05:08.618198 4743 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8ee30f9-a6ed-4aa2-b834-facef3c284fe-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:08 crc kubenswrapper[4743]: I0122 14:05:08.618209 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8ee30f9-a6ed-4aa2-b834-facef3c284fe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:08 crc kubenswrapper[4743]: I0122 14:05:08.620771 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8ee30f9-a6ed-4aa2-b834-facef3c284fe-config" (OuterVolumeSpecName: "config") pod "b8ee30f9-a6ed-4aa2-b834-facef3c284fe" (UID: "b8ee30f9-a6ed-4aa2-b834-facef3c284fe"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:05:08 crc kubenswrapper[4743]: I0122 14:05:08.676969 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8ee30f9-a6ed-4aa2-b834-facef3c284fe-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "b8ee30f9-a6ed-4aa2-b834-facef3c284fe" (UID: "b8ee30f9-a6ed-4aa2-b834-facef3c284fe"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:05:08 crc kubenswrapper[4743]: I0122 14:05:08.684824 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8ee30f9-a6ed-4aa2-b834-facef3c284fe-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b8ee30f9-a6ed-4aa2-b834-facef3c284fe" (UID: "b8ee30f9-a6ed-4aa2-b834-facef3c284fe"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:05:08 crc kubenswrapper[4743]: I0122 14:05:08.725294 4743 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8ee30f9-a6ed-4aa2-b834-facef3c284fe-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:08 crc kubenswrapper[4743]: I0122 14:05:08.725334 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/b8ee30f9-a6ed-4aa2-b834-facef3c284fe-config\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:08 crc kubenswrapper[4743]: I0122 14:05:08.725346 4743 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8ee30f9-a6ed-4aa2-b834-facef3c284fe-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:09 crc kubenswrapper[4743]: I0122 14:05:09.070090 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56bdc765bf-xnb2x" event={"ID":"b8ee30f9-a6ed-4aa2-b834-facef3c284fe","Type":"ContainerDied","Data":"afbba5dcab408eeada4700d4829489737116d02c99d0246230ad10478ef11325"} Jan 22 14:05:09 crc kubenswrapper[4743]: I0122 14:05:09.070569 4743 scope.go:117] "RemoveContainer" containerID="cbcaa6d5f04c83ed4dd21bb53079e388e02f2bff5fed9a9ac0baa4452e704d7b" Jan 22 14:05:09 crc kubenswrapper[4743]: I0122 14:05:09.070488 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-56bdc765bf-xnb2x" Jan 22 14:05:09 crc kubenswrapper[4743]: I0122 14:05:09.085920 4743 generic.go:334] "Generic (PLEG): container finished" podID="dff52751-78f1-4c39-aa95-5d74a246151e" containerID="63e33f82bc66858f7646d8d52106929147dca59a06e0ccf6f7838a6c4813eb9f" exitCode=0 Jan 22 14:05:09 crc kubenswrapper[4743]: I0122 14:05:09.085957 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-999bfcdc8-ldzdp" event={"ID":"dff52751-78f1-4c39-aa95-5d74a246151e","Type":"ContainerDied","Data":"63e33f82bc66858f7646d8d52106929147dca59a06e0ccf6f7838a6c4813eb9f"} Jan 22 14:05:09 crc kubenswrapper[4743]: I0122 14:05:09.114258 4743 scope.go:117] "RemoveContainer" containerID="1ee883debc82e7cb2aaeee5362b21ea7a1cdd7d98662ff17df82b291aa89fe64" Jan 22 14:05:09 crc kubenswrapper[4743]: I0122 14:05:09.118055 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-56bdc765bf-xnb2x"] Jan 22 14:05:09 crc kubenswrapper[4743]: I0122 14:05:09.130201 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-56bdc765bf-xnb2x"] Jan 22 14:05:09 crc kubenswrapper[4743]: I0122 14:05:09.450344 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-77bd86cd86-kqp9m" Jan 22 14:05:09 crc kubenswrapper[4743]: I0122 14:05:09.462329 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-77bd86cd86-kqp9m" Jan 22 14:05:09 crc kubenswrapper[4743]: I0122 14:05:09.735615 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-5f47b7b66b-mfhcg" Jan 22 14:05:09 crc kubenswrapper[4743]: I0122 14:05:09.782123 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8ee30f9-a6ed-4aa2-b834-facef3c284fe" path="/var/lib/kubelet/pods/b8ee30f9-a6ed-4aa2-b834-facef3c284fe/volumes" Jan 22 14:05:10 crc kubenswrapper[4743]: I0122 14:05:10.343772 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-999bfcdc8-ldzdp" podUID="dff52751-78f1-4c39-aa95-5d74a246151e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.145:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.145:8443: connect: connection refused" Jan 22 14:05:10 crc kubenswrapper[4743]: I0122 14:05:10.853556 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 22 14:05:10 crc kubenswrapper[4743]: I0122 14:05:10.973498 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60ee22a5-d036-40fb-8bed-36f10db21bb5-config-data\") pod \"60ee22a5-d036-40fb-8bed-36f10db21bb5\" (UID: \"60ee22a5-d036-40fb-8bed-36f10db21bb5\") " Jan 22 14:05:10 crc kubenswrapper[4743]: I0122 14:05:10.974160 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/60ee22a5-d036-40fb-8bed-36f10db21bb5-etc-machine-id\") pod \"60ee22a5-d036-40fb-8bed-36f10db21bb5\" (UID: \"60ee22a5-d036-40fb-8bed-36f10db21bb5\") " Jan 22 14:05:10 crc kubenswrapper[4743]: I0122 14:05:10.974278 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60ee22a5-d036-40fb-8bed-36f10db21bb5-config-data-custom\") pod \"60ee22a5-d036-40fb-8bed-36f10db21bb5\" (UID: \"60ee22a5-d036-40fb-8bed-36f10db21bb5\") " Jan 22 14:05:10 crc kubenswrapper[4743]: I0122 14:05:10.974210 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/60ee22a5-d036-40fb-8bed-36f10db21bb5-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "60ee22a5-d036-40fb-8bed-36f10db21bb5" (UID: "60ee22a5-d036-40fb-8bed-36f10db21bb5"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 14:05:10 crc kubenswrapper[4743]: I0122 14:05:10.974482 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vm2zg\" (UniqueName: \"kubernetes.io/projected/60ee22a5-d036-40fb-8bed-36f10db21bb5-kube-api-access-vm2zg\") pod \"60ee22a5-d036-40fb-8bed-36f10db21bb5\" (UID: \"60ee22a5-d036-40fb-8bed-36f10db21bb5\") " Jan 22 14:05:10 crc kubenswrapper[4743]: I0122 14:05:10.974716 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60ee22a5-d036-40fb-8bed-36f10db21bb5-scripts\") pod \"60ee22a5-d036-40fb-8bed-36f10db21bb5\" (UID: \"60ee22a5-d036-40fb-8bed-36f10db21bb5\") " Jan 22 14:05:10 crc kubenswrapper[4743]: I0122 14:05:10.975135 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60ee22a5-d036-40fb-8bed-36f10db21bb5-combined-ca-bundle\") pod \"60ee22a5-d036-40fb-8bed-36f10db21bb5\" (UID: \"60ee22a5-d036-40fb-8bed-36f10db21bb5\") " Jan 22 14:05:10 crc kubenswrapper[4743]: I0122 14:05:10.976225 4743 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/60ee22a5-d036-40fb-8bed-36f10db21bb5-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:10 crc kubenswrapper[4743]: I0122 14:05:10.981173 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60ee22a5-d036-40fb-8bed-36f10db21bb5-kube-api-access-vm2zg" (OuterVolumeSpecName: "kube-api-access-vm2zg") pod "60ee22a5-d036-40fb-8bed-36f10db21bb5" (UID: "60ee22a5-d036-40fb-8bed-36f10db21bb5"). InnerVolumeSpecName "kube-api-access-vm2zg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:05:10 crc kubenswrapper[4743]: I0122 14:05:10.987610 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60ee22a5-d036-40fb-8bed-36f10db21bb5-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "60ee22a5-d036-40fb-8bed-36f10db21bb5" (UID: "60ee22a5-d036-40fb-8bed-36f10db21bb5"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:05:10 crc kubenswrapper[4743]: I0122 14:05:10.996378 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60ee22a5-d036-40fb-8bed-36f10db21bb5-scripts" (OuterVolumeSpecName: "scripts") pod "60ee22a5-d036-40fb-8bed-36f10db21bb5" (UID: "60ee22a5-d036-40fb-8bed-36f10db21bb5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:05:11 crc kubenswrapper[4743]: I0122 14:05:11.047521 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60ee22a5-d036-40fb-8bed-36f10db21bb5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "60ee22a5-d036-40fb-8bed-36f10db21bb5" (UID: "60ee22a5-d036-40fb-8bed-36f10db21bb5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:05:11 crc kubenswrapper[4743]: I0122 14:05:11.078857 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vm2zg\" (UniqueName: \"kubernetes.io/projected/60ee22a5-d036-40fb-8bed-36f10db21bb5-kube-api-access-vm2zg\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:11 crc kubenswrapper[4743]: I0122 14:05:11.078911 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60ee22a5-d036-40fb-8bed-36f10db21bb5-scripts\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:11 crc kubenswrapper[4743]: I0122 14:05:11.078925 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60ee22a5-d036-40fb-8bed-36f10db21bb5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:11 crc kubenswrapper[4743]: I0122 14:05:11.078938 4743 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60ee22a5-d036-40fb-8bed-36f10db21bb5-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:11 crc kubenswrapper[4743]: I0122 14:05:11.108018 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60ee22a5-d036-40fb-8bed-36f10db21bb5-config-data" (OuterVolumeSpecName: "config-data") pod "60ee22a5-d036-40fb-8bed-36f10db21bb5" (UID: "60ee22a5-d036-40fb-8bed-36f10db21bb5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:05:11 crc kubenswrapper[4743]: I0122 14:05:11.108647 4743 generic.go:334] "Generic (PLEG): container finished" podID="60ee22a5-d036-40fb-8bed-36f10db21bb5" containerID="13f86a89fb0a36d4cffccf06926cc17c5ce4bdd5eb06ce972c334d3f6e42d40d" exitCode=0 Jan 22 14:05:11 crc kubenswrapper[4743]: I0122 14:05:11.108684 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"60ee22a5-d036-40fb-8bed-36f10db21bb5","Type":"ContainerDied","Data":"13f86a89fb0a36d4cffccf06926cc17c5ce4bdd5eb06ce972c334d3f6e42d40d"} Jan 22 14:05:11 crc kubenswrapper[4743]: I0122 14:05:11.108709 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"60ee22a5-d036-40fb-8bed-36f10db21bb5","Type":"ContainerDied","Data":"093fec351eda663ea46fe864a834b7821d46106c12401a5ac2744417bc40113e"} Jan 22 14:05:11 crc kubenswrapper[4743]: I0122 14:05:11.108708 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 22 14:05:11 crc kubenswrapper[4743]: I0122 14:05:11.108878 4743 scope.go:117] "RemoveContainer" containerID="8c8376deff29240c341cc09dfdef769bdc935cf5168714e8ba37ae6d1b7d4cad" Jan 22 14:05:11 crc kubenswrapper[4743]: I0122 14:05:11.180690 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60ee22a5-d036-40fb-8bed-36f10db21bb5-config-data\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:11 crc kubenswrapper[4743]: I0122 14:05:11.188634 4743 scope.go:117] "RemoveContainer" containerID="13f86a89fb0a36d4cffccf06926cc17c5ce4bdd5eb06ce972c334d3f6e42d40d" Jan 22 14:05:11 crc kubenswrapper[4743]: I0122 14:05:11.192978 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 22 14:05:11 crc kubenswrapper[4743]: I0122 14:05:11.203527 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 22 14:05:11 crc kubenswrapper[4743]: I0122 14:05:11.215513 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Jan 22 14:05:11 crc kubenswrapper[4743]: E0122 14:05:11.221535 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60ee22a5-d036-40fb-8bed-36f10db21bb5" containerName="cinder-scheduler" Jan 22 14:05:11 crc kubenswrapper[4743]: I0122 14:05:11.221578 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="60ee22a5-d036-40fb-8bed-36f10db21bb5" containerName="cinder-scheduler" Jan 22 14:05:11 crc kubenswrapper[4743]: E0122 14:05:11.221592 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ee30f9-a6ed-4aa2-b834-facef3c284fe" containerName="neutron-api" Jan 22 14:05:11 crc kubenswrapper[4743]: I0122 14:05:11.221599 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ee30f9-a6ed-4aa2-b834-facef3c284fe" containerName="neutron-api" Jan 22 14:05:11 crc kubenswrapper[4743]: E0122 14:05:11.221620 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ee30f9-a6ed-4aa2-b834-facef3c284fe" containerName="neutron-httpd" Jan 22 14:05:11 crc kubenswrapper[4743]: I0122 14:05:11.221630 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ee30f9-a6ed-4aa2-b834-facef3c284fe" containerName="neutron-httpd" Jan 22 14:05:11 crc kubenswrapper[4743]: E0122 14:05:11.221656 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddd12850-b0cc-4119-9ba4-bf5a893f41a7" containerName="dnsmasq-dns" Jan 
22 14:05:11 crc kubenswrapper[4743]: I0122 14:05:11.221664 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddd12850-b0cc-4119-9ba4-bf5a893f41a7" containerName="dnsmasq-dns" Jan 22 14:05:11 crc kubenswrapper[4743]: E0122 14:05:11.221682 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60ee22a5-d036-40fb-8bed-36f10db21bb5" containerName="probe" Jan 22 14:05:11 crc kubenswrapper[4743]: I0122 14:05:11.221689 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="60ee22a5-d036-40fb-8bed-36f10db21bb5" containerName="probe" Jan 22 14:05:11 crc kubenswrapper[4743]: E0122 14:05:11.221710 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddd12850-b0cc-4119-9ba4-bf5a893f41a7" containerName="init" Jan 22 14:05:11 crc kubenswrapper[4743]: I0122 14:05:11.221715 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddd12850-b0cc-4119-9ba4-bf5a893f41a7" containerName="init" Jan 22 14:05:11 crc kubenswrapper[4743]: I0122 14:05:11.221983 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8ee30f9-a6ed-4aa2-b834-facef3c284fe" containerName="neutron-api" Jan 22 14:05:11 crc kubenswrapper[4743]: I0122 14:05:11.221996 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddd12850-b0cc-4119-9ba4-bf5a893f41a7" containerName="dnsmasq-dns" Jan 22 14:05:11 crc kubenswrapper[4743]: I0122 14:05:11.222010 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="60ee22a5-d036-40fb-8bed-36f10db21bb5" containerName="cinder-scheduler" Jan 22 14:05:11 crc kubenswrapper[4743]: I0122 14:05:11.222025 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="60ee22a5-d036-40fb-8bed-36f10db21bb5" containerName="probe" Jan 22 14:05:11 crc kubenswrapper[4743]: I0122 14:05:11.222035 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8ee30f9-a6ed-4aa2-b834-facef3c284fe" containerName="neutron-httpd" Jan 22 14:05:11 crc kubenswrapper[4743]: I0122 14:05:11.222922 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 22 14:05:11 crc kubenswrapper[4743]: I0122 14:05:11.229958 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Jan 22 14:05:11 crc kubenswrapper[4743]: I0122 14:05:11.230249 4743 scope.go:117] "RemoveContainer" containerID="8c8376deff29240c341cc09dfdef769bdc935cf5168714e8ba37ae6d1b7d4cad" Jan 22 14:05:11 crc kubenswrapper[4743]: E0122 14:05:11.231024 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c8376deff29240c341cc09dfdef769bdc935cf5168714e8ba37ae6d1b7d4cad\": container with ID starting with 8c8376deff29240c341cc09dfdef769bdc935cf5168714e8ba37ae6d1b7d4cad not found: ID does not exist" containerID="8c8376deff29240c341cc09dfdef769bdc935cf5168714e8ba37ae6d1b7d4cad" Jan 22 14:05:11 crc kubenswrapper[4743]: I0122 14:05:11.231062 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c8376deff29240c341cc09dfdef769bdc935cf5168714e8ba37ae6d1b7d4cad"} err="failed to get container status \"8c8376deff29240c341cc09dfdef769bdc935cf5168714e8ba37ae6d1b7d4cad\": rpc error: code = NotFound desc = could not find container \"8c8376deff29240c341cc09dfdef769bdc935cf5168714e8ba37ae6d1b7d4cad\": container with ID starting with 8c8376deff29240c341cc09dfdef769bdc935cf5168714e8ba37ae6d1b7d4cad not found: ID does not exist" Jan 22 14:05:11 crc kubenswrapper[4743]: I0122 14:05:11.231089 4743 scope.go:117] "RemoveContainer" containerID="13f86a89fb0a36d4cffccf06926cc17c5ce4bdd5eb06ce972c334d3f6e42d40d" Jan 22 14:05:11 crc kubenswrapper[4743]: E0122 14:05:11.231590 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13f86a89fb0a36d4cffccf06926cc17c5ce4bdd5eb06ce972c334d3f6e42d40d\": container with ID starting with 13f86a89fb0a36d4cffccf06926cc17c5ce4bdd5eb06ce972c334d3f6e42d40d not found: ID does not exist" containerID="13f86a89fb0a36d4cffccf06926cc17c5ce4bdd5eb06ce972c334d3f6e42d40d" Jan 22 14:05:11 crc kubenswrapper[4743]: I0122 14:05:11.231618 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13f86a89fb0a36d4cffccf06926cc17c5ce4bdd5eb06ce972c334d3f6e42d40d"} err="failed to get container status \"13f86a89fb0a36d4cffccf06926cc17c5ce4bdd5eb06ce972c334d3f6e42d40d\": rpc error: code = NotFound desc = could not find container \"13f86a89fb0a36d4cffccf06926cc17c5ce4bdd5eb06ce972c334d3f6e42d40d\": container with ID starting with 13f86a89fb0a36d4cffccf06926cc17c5ce4bdd5eb06ce972c334d3f6e42d40d not found: ID does not exist" Jan 22 14:05:11 crc kubenswrapper[4743]: I0122 14:05:11.247002 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 22 14:05:11 crc kubenswrapper[4743]: I0122 14:05:11.283902 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ec6e51f6-2808-404d-8cf0-8c8b44c86cb9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"ec6e51f6-2808-404d-8cf0-8c8b44c86cb9\") " pod="openstack/cinder-scheduler-0" Jan 22 14:05:11 crc kubenswrapper[4743]: I0122 14:05:11.284029 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec6e51f6-2808-404d-8cf0-8c8b44c86cb9-config-data\") pod \"cinder-scheduler-0\" (UID: 
\"ec6e51f6-2808-404d-8cf0-8c8b44c86cb9\") " pod="openstack/cinder-scheduler-0" Jan 22 14:05:11 crc kubenswrapper[4743]: I0122 14:05:11.284062 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ec6e51f6-2808-404d-8cf0-8c8b44c86cb9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"ec6e51f6-2808-404d-8cf0-8c8b44c86cb9\") " pod="openstack/cinder-scheduler-0" Jan 22 14:05:11 crc kubenswrapper[4743]: I0122 14:05:11.284084 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec6e51f6-2808-404d-8cf0-8c8b44c86cb9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"ec6e51f6-2808-404d-8cf0-8c8b44c86cb9\") " pod="openstack/cinder-scheduler-0" Jan 22 14:05:11 crc kubenswrapper[4743]: I0122 14:05:11.284121 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec6e51f6-2808-404d-8cf0-8c8b44c86cb9-scripts\") pod \"cinder-scheduler-0\" (UID: \"ec6e51f6-2808-404d-8cf0-8c8b44c86cb9\") " pod="openstack/cinder-scheduler-0" Jan 22 14:05:11 crc kubenswrapper[4743]: I0122 14:05:11.284175 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp4b5\" (UniqueName: \"kubernetes.io/projected/ec6e51f6-2808-404d-8cf0-8c8b44c86cb9-kube-api-access-qp4b5\") pod \"cinder-scheduler-0\" (UID: \"ec6e51f6-2808-404d-8cf0-8c8b44c86cb9\") " pod="openstack/cinder-scheduler-0" Jan 22 14:05:11 crc kubenswrapper[4743]: I0122 14:05:11.387104 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ec6e51f6-2808-404d-8cf0-8c8b44c86cb9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"ec6e51f6-2808-404d-8cf0-8c8b44c86cb9\") " pod="openstack/cinder-scheduler-0" Jan 22 14:05:11 crc kubenswrapper[4743]: I0122 14:05:11.387234 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec6e51f6-2808-404d-8cf0-8c8b44c86cb9-config-data\") pod \"cinder-scheduler-0\" (UID: \"ec6e51f6-2808-404d-8cf0-8c8b44c86cb9\") " pod="openstack/cinder-scheduler-0" Jan 22 14:05:11 crc kubenswrapper[4743]: I0122 14:05:11.387240 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ec6e51f6-2808-404d-8cf0-8c8b44c86cb9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"ec6e51f6-2808-404d-8cf0-8c8b44c86cb9\") " pod="openstack/cinder-scheduler-0" Jan 22 14:05:11 crc kubenswrapper[4743]: I0122 14:05:11.387270 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ec6e51f6-2808-404d-8cf0-8c8b44c86cb9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"ec6e51f6-2808-404d-8cf0-8c8b44c86cb9\") " pod="openstack/cinder-scheduler-0" Jan 22 14:05:11 crc kubenswrapper[4743]: I0122 14:05:11.387380 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec6e51f6-2808-404d-8cf0-8c8b44c86cb9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"ec6e51f6-2808-404d-8cf0-8c8b44c86cb9\") " pod="openstack/cinder-scheduler-0" Jan 22 14:05:11 crc kubenswrapper[4743]: I0122 14:05:11.388058 4743 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec6e51f6-2808-404d-8cf0-8c8b44c86cb9-scripts\") pod \"cinder-scheduler-0\" (UID: \"ec6e51f6-2808-404d-8cf0-8c8b44c86cb9\") " pod="openstack/cinder-scheduler-0" Jan 22 14:05:11 crc kubenswrapper[4743]: I0122 14:05:11.388296 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qp4b5\" (UniqueName: \"kubernetes.io/projected/ec6e51f6-2808-404d-8cf0-8c8b44c86cb9-kube-api-access-qp4b5\") pod \"cinder-scheduler-0\" (UID: \"ec6e51f6-2808-404d-8cf0-8c8b44c86cb9\") " pod="openstack/cinder-scheduler-0" Jan 22 14:05:11 crc kubenswrapper[4743]: I0122 14:05:11.396169 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec6e51f6-2808-404d-8cf0-8c8b44c86cb9-scripts\") pod \"cinder-scheduler-0\" (UID: \"ec6e51f6-2808-404d-8cf0-8c8b44c86cb9\") " pod="openstack/cinder-scheduler-0" Jan 22 14:05:11 crc kubenswrapper[4743]: I0122 14:05:11.398687 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec6e51f6-2808-404d-8cf0-8c8b44c86cb9-config-data\") pod \"cinder-scheduler-0\" (UID: \"ec6e51f6-2808-404d-8cf0-8c8b44c86cb9\") " pod="openstack/cinder-scheduler-0" Jan 22 14:05:11 crc kubenswrapper[4743]: I0122 14:05:11.399450 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec6e51f6-2808-404d-8cf0-8c8b44c86cb9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"ec6e51f6-2808-404d-8cf0-8c8b44c86cb9\") " pod="openstack/cinder-scheduler-0" Jan 22 14:05:11 crc kubenswrapper[4743]: I0122 14:05:11.399456 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ec6e51f6-2808-404d-8cf0-8c8b44c86cb9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"ec6e51f6-2808-404d-8cf0-8c8b44c86cb9\") " pod="openstack/cinder-scheduler-0" Jan 22 14:05:11 crc kubenswrapper[4743]: I0122 14:05:11.410326 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qp4b5\" (UniqueName: \"kubernetes.io/projected/ec6e51f6-2808-404d-8cf0-8c8b44c86cb9-kube-api-access-qp4b5\") pod \"cinder-scheduler-0\" (UID: \"ec6e51f6-2808-404d-8cf0-8c8b44c86cb9\") " pod="openstack/cinder-scheduler-0" Jan 22 14:05:11 crc kubenswrapper[4743]: I0122 14:05:11.544559 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 22 14:05:11 crc kubenswrapper[4743]: I0122 14:05:11.781740 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60ee22a5-d036-40fb-8bed-36f10db21bb5" path="/var/lib/kubelet/pods/60ee22a5-d036-40fb-8bed-36f10db21bb5/volumes" Jan 22 14:05:11 crc kubenswrapper[4743]: I0122 14:05:11.976201 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 22 14:05:12 crc kubenswrapper[4743]: I0122 14:05:12.131217 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ec6e51f6-2808-404d-8cf0-8c8b44c86cb9","Type":"ContainerStarted","Data":"a5fe75da470e3101b86738d2e9bc1aba92a6012695cfb31d8e4c846eeb9b0a3c"} Jan 22 14:05:12 crc kubenswrapper[4743]: I0122 14:05:12.685653 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Jan 22 14:05:13 crc kubenswrapper[4743]: I0122 14:05:13.292100 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Jan 22 14:05:13 crc kubenswrapper[4743]: I0122 14:05:13.293110 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Jan 22 14:05:13 crc kubenswrapper[4743]: I0122 14:05:13.295694 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Jan 22 14:05:13 crc kubenswrapper[4743]: I0122 14:05:13.296097 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Jan 22 14:05:13 crc kubenswrapper[4743]: I0122 14:05:13.298021 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-26cp6" Jan 22 14:05:13 crc kubenswrapper[4743]: I0122 14:05:13.323761 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 22 14:05:13 crc kubenswrapper[4743]: I0122 14:05:13.437951 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41abc04c-e711-4e34-a0b0-085b7b09d94d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"41abc04c-e711-4e34-a0b0-085b7b09d94d\") " pod="openstack/openstackclient" Jan 22 14:05:13 crc kubenswrapper[4743]: I0122 14:05:13.438104 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psgqk\" (UniqueName: \"kubernetes.io/projected/41abc04c-e711-4e34-a0b0-085b7b09d94d-kube-api-access-psgqk\") pod \"openstackclient\" (UID: \"41abc04c-e711-4e34-a0b0-085b7b09d94d\") " pod="openstack/openstackclient" Jan 22 14:05:13 crc kubenswrapper[4743]: I0122 14:05:13.438215 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/41abc04c-e711-4e34-a0b0-085b7b09d94d-openstack-config-secret\") pod \"openstackclient\" (UID: \"41abc04c-e711-4e34-a0b0-085b7b09d94d\") " pod="openstack/openstackclient" Jan 22 14:05:13 crc kubenswrapper[4743]: I0122 14:05:13.438428 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/41abc04c-e711-4e34-a0b0-085b7b09d94d-openstack-config\") pod \"openstackclient\" (UID: \"41abc04c-e711-4e34-a0b0-085b7b09d94d\") " pod="openstack/openstackclient" Jan 22 14:05:13 crc kubenswrapper[4743]: I0122 
14:05:13.540437 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/41abc04c-e711-4e34-a0b0-085b7b09d94d-openstack-config-secret\") pod \"openstackclient\" (UID: \"41abc04c-e711-4e34-a0b0-085b7b09d94d\") " pod="openstack/openstackclient" Jan 22 14:05:13 crc kubenswrapper[4743]: I0122 14:05:13.540547 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/41abc04c-e711-4e34-a0b0-085b7b09d94d-openstack-config\") pod \"openstackclient\" (UID: \"41abc04c-e711-4e34-a0b0-085b7b09d94d\") " pod="openstack/openstackclient" Jan 22 14:05:13 crc kubenswrapper[4743]: I0122 14:05:13.540590 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41abc04c-e711-4e34-a0b0-085b7b09d94d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"41abc04c-e711-4e34-a0b0-085b7b09d94d\") " pod="openstack/openstackclient" Jan 22 14:05:13 crc kubenswrapper[4743]: I0122 14:05:13.540627 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psgqk\" (UniqueName: \"kubernetes.io/projected/41abc04c-e711-4e34-a0b0-085b7b09d94d-kube-api-access-psgqk\") pod \"openstackclient\" (UID: \"41abc04c-e711-4e34-a0b0-085b7b09d94d\") " pod="openstack/openstackclient" Jan 22 14:05:13 crc kubenswrapper[4743]: I0122 14:05:13.541580 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/41abc04c-e711-4e34-a0b0-085b7b09d94d-openstack-config\") pod \"openstackclient\" (UID: \"41abc04c-e711-4e34-a0b0-085b7b09d94d\") " pod="openstack/openstackclient" Jan 22 14:05:13 crc kubenswrapper[4743]: I0122 14:05:13.546721 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41abc04c-e711-4e34-a0b0-085b7b09d94d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"41abc04c-e711-4e34-a0b0-085b7b09d94d\") " pod="openstack/openstackclient" Jan 22 14:05:13 crc kubenswrapper[4743]: I0122 14:05:13.557673 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/41abc04c-e711-4e34-a0b0-085b7b09d94d-openstack-config-secret\") pod \"openstackclient\" (UID: \"41abc04c-e711-4e34-a0b0-085b7b09d94d\") " pod="openstack/openstackclient" Jan 22 14:05:13 crc kubenswrapper[4743]: I0122 14:05:13.565527 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psgqk\" (UniqueName: \"kubernetes.io/projected/41abc04c-e711-4e34-a0b0-085b7b09d94d-kube-api-access-psgqk\") pod \"openstackclient\" (UID: \"41abc04c-e711-4e34-a0b0-085b7b09d94d\") " pod="openstack/openstackclient" Jan 22 14:05:13 crc kubenswrapper[4743]: I0122 14:05:13.621012 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Jan 22 14:05:14 crc kubenswrapper[4743]: I0122 14:05:14.158833 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 22 14:05:14 crc kubenswrapper[4743]: I0122 14:05:14.173583 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ec6e51f6-2808-404d-8cf0-8c8b44c86cb9","Type":"ContainerStarted","Data":"ea2108029a0c1081ba13f83fb8d037a7abe627241d43e18f670aa5551253f5f8"} Jan 22 14:05:15 crc kubenswrapper[4743]: I0122 14:05:15.184595 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"41abc04c-e711-4e34-a0b0-085b7b09d94d","Type":"ContainerStarted","Data":"7ea9d67ad132e55ee1c044444e025ee5405f963ded5e9ac788bb6081954c465c"} Jan 22 14:05:15 crc kubenswrapper[4743]: I0122 14:05:15.187080 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ec6e51f6-2808-404d-8cf0-8c8b44c86cb9","Type":"ContainerStarted","Data":"50a447613b70c266ccfdae74f4624edc2cf3bfb2f9f488a0b8133438d1cccb51"} Jan 22 14:05:15 crc kubenswrapper[4743]: I0122 14:05:15.212667 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.212645448 podStartE2EDuration="4.212645448s" podCreationTimestamp="2026-01-22 14:05:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:05:15.201439949 +0000 UTC m=+1151.756483112" watchObservedRunningTime="2026-01-22 14:05:15.212645448 +0000 UTC m=+1151.767688611" Jan 22 14:05:16 crc kubenswrapper[4743]: I0122 14:05:16.544715 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Jan 22 14:05:18 crc kubenswrapper[4743]: I0122 14:05:18.091509 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-6d6df4ffc5-49vw4"] Jan 22 14:05:18 crc kubenswrapper[4743]: I0122 14:05:18.093747 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-6d6df4ffc5-49vw4" Jan 22 14:05:18 crc kubenswrapper[4743]: I0122 14:05:18.095451 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Jan 22 14:05:18 crc kubenswrapper[4743]: I0122 14:05:18.095878 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Jan 22 14:05:18 crc kubenswrapper[4743]: I0122 14:05:18.096130 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Jan 22 14:05:18 crc kubenswrapper[4743]: I0122 14:05:18.107868 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6d6df4ffc5-49vw4"] Jan 22 14:05:18 crc kubenswrapper[4743]: I0122 14:05:18.135856 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33cc55b1-3375-4b12-9cd9-c8f34ed7c0f5-run-httpd\") pod \"swift-proxy-6d6df4ffc5-49vw4\" (UID: \"33cc55b1-3375-4b12-9cd9-c8f34ed7c0f5\") " pod="openstack/swift-proxy-6d6df4ffc5-49vw4" Jan 22 14:05:18 crc kubenswrapper[4743]: I0122 14:05:18.135896 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33cc55b1-3375-4b12-9cd9-c8f34ed7c0f5-combined-ca-bundle\") pod \"swift-proxy-6d6df4ffc5-49vw4\" (UID: \"33cc55b1-3375-4b12-9cd9-c8f34ed7c0f5\") " pod="openstack/swift-proxy-6d6df4ffc5-49vw4" Jan 22 14:05:18 crc kubenswrapper[4743]: I0122 14:05:18.135934 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33cc55b1-3375-4b12-9cd9-c8f34ed7c0f5-config-data\") pod \"swift-proxy-6d6df4ffc5-49vw4\" (UID: \"33cc55b1-3375-4b12-9cd9-c8f34ed7c0f5\") " pod="openstack/swift-proxy-6d6df4ffc5-49vw4" Jan 22 14:05:18 crc kubenswrapper[4743]: I0122 14:05:18.135949 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/33cc55b1-3375-4b12-9cd9-c8f34ed7c0f5-internal-tls-certs\") pod \"swift-proxy-6d6df4ffc5-49vw4\" (UID: \"33cc55b1-3375-4b12-9cd9-c8f34ed7c0f5\") " pod="openstack/swift-proxy-6d6df4ffc5-49vw4" Jan 22 14:05:18 crc kubenswrapper[4743]: I0122 14:05:18.135971 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33cc55b1-3375-4b12-9cd9-c8f34ed7c0f5-log-httpd\") pod \"swift-proxy-6d6df4ffc5-49vw4\" (UID: \"33cc55b1-3375-4b12-9cd9-c8f34ed7c0f5\") " pod="openstack/swift-proxy-6d6df4ffc5-49vw4" Jan 22 14:05:18 crc kubenswrapper[4743]: I0122 14:05:18.136133 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/33cc55b1-3375-4b12-9cd9-c8f34ed7c0f5-public-tls-certs\") pod \"swift-proxy-6d6df4ffc5-49vw4\" (UID: \"33cc55b1-3375-4b12-9cd9-c8f34ed7c0f5\") " pod="openstack/swift-proxy-6d6df4ffc5-49vw4" Jan 22 14:05:18 crc kubenswrapper[4743]: I0122 14:05:18.136199 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tldmt\" (UniqueName: \"kubernetes.io/projected/33cc55b1-3375-4b12-9cd9-c8f34ed7c0f5-kube-api-access-tldmt\") pod \"swift-proxy-6d6df4ffc5-49vw4\" (UID: \"33cc55b1-3375-4b12-9cd9-c8f34ed7c0f5\") " 
pod="openstack/swift-proxy-6d6df4ffc5-49vw4" Jan 22 14:05:18 crc kubenswrapper[4743]: I0122 14:05:18.136390 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/33cc55b1-3375-4b12-9cd9-c8f34ed7c0f5-etc-swift\") pod \"swift-proxy-6d6df4ffc5-49vw4\" (UID: \"33cc55b1-3375-4b12-9cd9-c8f34ed7c0f5\") " pod="openstack/swift-proxy-6d6df4ffc5-49vw4" Jan 22 14:05:18 crc kubenswrapper[4743]: I0122 14:05:18.237736 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/33cc55b1-3375-4b12-9cd9-c8f34ed7c0f5-etc-swift\") pod \"swift-proxy-6d6df4ffc5-49vw4\" (UID: \"33cc55b1-3375-4b12-9cd9-c8f34ed7c0f5\") " pod="openstack/swift-proxy-6d6df4ffc5-49vw4" Jan 22 14:05:18 crc kubenswrapper[4743]: I0122 14:05:18.237883 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33cc55b1-3375-4b12-9cd9-c8f34ed7c0f5-run-httpd\") pod \"swift-proxy-6d6df4ffc5-49vw4\" (UID: \"33cc55b1-3375-4b12-9cd9-c8f34ed7c0f5\") " pod="openstack/swift-proxy-6d6df4ffc5-49vw4" Jan 22 14:05:18 crc kubenswrapper[4743]: I0122 14:05:18.237911 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33cc55b1-3375-4b12-9cd9-c8f34ed7c0f5-combined-ca-bundle\") pod \"swift-proxy-6d6df4ffc5-49vw4\" (UID: \"33cc55b1-3375-4b12-9cd9-c8f34ed7c0f5\") " pod="openstack/swift-proxy-6d6df4ffc5-49vw4" Jan 22 14:05:18 crc kubenswrapper[4743]: I0122 14:05:18.237957 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33cc55b1-3375-4b12-9cd9-c8f34ed7c0f5-config-data\") pod \"swift-proxy-6d6df4ffc5-49vw4\" (UID: \"33cc55b1-3375-4b12-9cd9-c8f34ed7c0f5\") " pod="openstack/swift-proxy-6d6df4ffc5-49vw4" Jan 22 14:05:18 crc kubenswrapper[4743]: I0122 14:05:18.237973 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/33cc55b1-3375-4b12-9cd9-c8f34ed7c0f5-internal-tls-certs\") pod \"swift-proxy-6d6df4ffc5-49vw4\" (UID: \"33cc55b1-3375-4b12-9cd9-c8f34ed7c0f5\") " pod="openstack/swift-proxy-6d6df4ffc5-49vw4" Jan 22 14:05:18 crc kubenswrapper[4743]: I0122 14:05:18.237993 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33cc55b1-3375-4b12-9cd9-c8f34ed7c0f5-log-httpd\") pod \"swift-proxy-6d6df4ffc5-49vw4\" (UID: \"33cc55b1-3375-4b12-9cd9-c8f34ed7c0f5\") " pod="openstack/swift-proxy-6d6df4ffc5-49vw4" Jan 22 14:05:18 crc kubenswrapper[4743]: I0122 14:05:18.238045 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/33cc55b1-3375-4b12-9cd9-c8f34ed7c0f5-public-tls-certs\") pod \"swift-proxy-6d6df4ffc5-49vw4\" (UID: \"33cc55b1-3375-4b12-9cd9-c8f34ed7c0f5\") " pod="openstack/swift-proxy-6d6df4ffc5-49vw4" Jan 22 14:05:18 crc kubenswrapper[4743]: I0122 14:05:18.238074 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tldmt\" (UniqueName: \"kubernetes.io/projected/33cc55b1-3375-4b12-9cd9-c8f34ed7c0f5-kube-api-access-tldmt\") pod \"swift-proxy-6d6df4ffc5-49vw4\" (UID: \"33cc55b1-3375-4b12-9cd9-c8f34ed7c0f5\") " pod="openstack/swift-proxy-6d6df4ffc5-49vw4" Jan 22 
14:05:18 crc kubenswrapper[4743]: I0122 14:05:18.239281 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33cc55b1-3375-4b12-9cd9-c8f34ed7c0f5-run-httpd\") pod \"swift-proxy-6d6df4ffc5-49vw4\" (UID: \"33cc55b1-3375-4b12-9cd9-c8f34ed7c0f5\") " pod="openstack/swift-proxy-6d6df4ffc5-49vw4" Jan 22 14:05:18 crc kubenswrapper[4743]: I0122 14:05:18.239953 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33cc55b1-3375-4b12-9cd9-c8f34ed7c0f5-log-httpd\") pod \"swift-proxy-6d6df4ffc5-49vw4\" (UID: \"33cc55b1-3375-4b12-9cd9-c8f34ed7c0f5\") " pod="openstack/swift-proxy-6d6df4ffc5-49vw4" Jan 22 14:05:18 crc kubenswrapper[4743]: I0122 14:05:18.244396 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/33cc55b1-3375-4b12-9cd9-c8f34ed7c0f5-etc-swift\") pod \"swift-proxy-6d6df4ffc5-49vw4\" (UID: \"33cc55b1-3375-4b12-9cd9-c8f34ed7c0f5\") " pod="openstack/swift-proxy-6d6df4ffc5-49vw4" Jan 22 14:05:18 crc kubenswrapper[4743]: I0122 14:05:18.244544 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/33cc55b1-3375-4b12-9cd9-c8f34ed7c0f5-internal-tls-certs\") pod \"swift-proxy-6d6df4ffc5-49vw4\" (UID: \"33cc55b1-3375-4b12-9cd9-c8f34ed7c0f5\") " pod="openstack/swift-proxy-6d6df4ffc5-49vw4" Jan 22 14:05:18 crc kubenswrapper[4743]: I0122 14:05:18.244615 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/33cc55b1-3375-4b12-9cd9-c8f34ed7c0f5-public-tls-certs\") pod \"swift-proxy-6d6df4ffc5-49vw4\" (UID: \"33cc55b1-3375-4b12-9cd9-c8f34ed7c0f5\") " pod="openstack/swift-proxy-6d6df4ffc5-49vw4" Jan 22 14:05:18 crc kubenswrapper[4743]: I0122 14:05:18.249188 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33cc55b1-3375-4b12-9cd9-c8f34ed7c0f5-config-data\") pod \"swift-proxy-6d6df4ffc5-49vw4\" (UID: \"33cc55b1-3375-4b12-9cd9-c8f34ed7c0f5\") " pod="openstack/swift-proxy-6d6df4ffc5-49vw4" Jan 22 14:05:18 crc kubenswrapper[4743]: I0122 14:05:18.251448 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33cc55b1-3375-4b12-9cd9-c8f34ed7c0f5-combined-ca-bundle\") pod \"swift-proxy-6d6df4ffc5-49vw4\" (UID: \"33cc55b1-3375-4b12-9cd9-c8f34ed7c0f5\") " pod="openstack/swift-proxy-6d6df4ffc5-49vw4" Jan 22 14:05:18 crc kubenswrapper[4743]: I0122 14:05:18.267198 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tldmt\" (UniqueName: \"kubernetes.io/projected/33cc55b1-3375-4b12-9cd9-c8f34ed7c0f5-kube-api-access-tldmt\") pod \"swift-proxy-6d6df4ffc5-49vw4\" (UID: \"33cc55b1-3375-4b12-9cd9-c8f34ed7c0f5\") " pod="openstack/swift-proxy-6d6df4ffc5-49vw4" Jan 22 14:05:18 crc kubenswrapper[4743]: I0122 14:05:18.421037 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-6d6df4ffc5-49vw4" Jan 22 14:05:20 crc kubenswrapper[4743]: I0122 14:05:20.344055 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-999bfcdc8-ldzdp" podUID="dff52751-78f1-4c39-aa95-5d74a246151e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.145:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.145:8443: connect: connection refused" Jan 22 14:05:21 crc kubenswrapper[4743]: I0122 14:05:21.648906 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 22 14:05:21 crc kubenswrapper[4743]: I0122 14:05:21.649450 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e94b5f67-2b59-4f2a-bc90-20a355ddb79f" containerName="ceilometer-central-agent" containerID="cri-o://ee132547c0b6c54c83fb2577c65fda82adaed445f3841289117598de79cf80d0" gracePeriod=30 Jan 22 14:05:21 crc kubenswrapper[4743]: I0122 14:05:21.649504 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e94b5f67-2b59-4f2a-bc90-20a355ddb79f" containerName="sg-core" containerID="cri-o://00e0f26654334c4922acea2c86a35f807f4e288663ff54601bd90009e725164f" gracePeriod=30 Jan 22 14:05:21 crc kubenswrapper[4743]: I0122 14:05:21.649562 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e94b5f67-2b59-4f2a-bc90-20a355ddb79f" containerName="ceilometer-notification-agent" containerID="cri-o://8e297796679ee08d29bfef321f5c231e33f9052b8c1995bd3ca6d2987c284e6e" gracePeriod=30 Jan 22 14:05:21 crc kubenswrapper[4743]: I0122 14:05:21.649504 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e94b5f67-2b59-4f2a-bc90-20a355ddb79f" containerName="proxy-httpd" containerID="cri-o://945789873f0895ce51715990de02fe0f7f1ae149b4318cc184e66adf86945b54" gracePeriod=30 Jan 22 14:05:21 crc kubenswrapper[4743]: I0122 14:05:21.661115 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="e94b5f67-2b59-4f2a-bc90-20a355ddb79f" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.169:3000/\": EOF" Jan 22 14:05:21 crc kubenswrapper[4743]: I0122 14:05:21.784554 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Jan 22 14:05:22 crc kubenswrapper[4743]: I0122 14:05:22.261685 4743 generic.go:334] "Generic (PLEG): container finished" podID="e94b5f67-2b59-4f2a-bc90-20a355ddb79f" containerID="945789873f0895ce51715990de02fe0f7f1ae149b4318cc184e66adf86945b54" exitCode=0 Jan 22 14:05:22 crc kubenswrapper[4743]: I0122 14:05:22.261724 4743 generic.go:334] "Generic (PLEG): container finished" podID="e94b5f67-2b59-4f2a-bc90-20a355ddb79f" containerID="00e0f26654334c4922acea2c86a35f807f4e288663ff54601bd90009e725164f" exitCode=2 Jan 22 14:05:22 crc kubenswrapper[4743]: I0122 14:05:22.261736 4743 generic.go:334] "Generic (PLEG): container finished" podID="e94b5f67-2b59-4f2a-bc90-20a355ddb79f" containerID="ee132547c0b6c54c83fb2577c65fda82adaed445f3841289117598de79cf80d0" exitCode=0 Jan 22 14:05:22 crc kubenswrapper[4743]: I0122 14:05:22.261764 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e94b5f67-2b59-4f2a-bc90-20a355ddb79f","Type":"ContainerDied","Data":"945789873f0895ce51715990de02fe0f7f1ae149b4318cc184e66adf86945b54"} Jan 22 14:05:22 
crc kubenswrapper[4743]: I0122 14:05:22.261836 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e94b5f67-2b59-4f2a-bc90-20a355ddb79f","Type":"ContainerDied","Data":"00e0f26654334c4922acea2c86a35f807f4e288663ff54601bd90009e725164f"} Jan 22 14:05:22 crc kubenswrapper[4743]: I0122 14:05:22.261852 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e94b5f67-2b59-4f2a-bc90-20a355ddb79f","Type":"ContainerDied","Data":"ee132547c0b6c54c83fb2577c65fda82adaed445f3841289117598de79cf80d0"} Jan 22 14:05:23 crc kubenswrapper[4743]: I0122 14:05:23.894699 4743 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podec33de7c-5eab-46d0-a702-af5fbd2ebe50"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podec33de7c-5eab-46d0-a702-af5fbd2ebe50] : Timed out while waiting for systemd to remove kubepods-besteffort-podec33de7c_5eab_46d0_a702_af5fbd2ebe50.slice" Jan 22 14:05:23 crc kubenswrapper[4743]: E0122 14:05:23.895123 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort podec33de7c-5eab-46d0-a702-af5fbd2ebe50] : unable to destroy cgroup paths for cgroup [kubepods besteffort podec33de7c-5eab-46d0-a702-af5fbd2ebe50] : Timed out while waiting for systemd to remove kubepods-besteffort-podec33de7c_5eab_46d0_a702_af5fbd2ebe50.slice" pod="openstack/barbican-keystone-listener-5774494bd8-6dt7x" podUID="ec33de7c-5eab-46d0-a702-af5fbd2ebe50" Jan 22 14:05:24 crc kubenswrapper[4743]: I0122 14:05:24.323720 4743 generic.go:334] "Generic (PLEG): container finished" podID="e94b5f67-2b59-4f2a-bc90-20a355ddb79f" containerID="8e297796679ee08d29bfef321f5c231e33f9052b8c1995bd3ca6d2987c284e6e" exitCode=0 Jan 22 14:05:24 crc kubenswrapper[4743]: I0122 14:05:24.324145 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5774494bd8-6dt7x" Jan 22 14:05:24 crc kubenswrapper[4743]: I0122 14:05:24.325047 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e94b5f67-2b59-4f2a-bc90-20a355ddb79f","Type":"ContainerDied","Data":"8e297796679ee08d29bfef321f5c231e33f9052b8c1995bd3ca6d2987c284e6e"} Jan 22 14:05:24 crc kubenswrapper[4743]: I0122 14:05:24.366411 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-5774494bd8-6dt7x"] Jan 22 14:05:24 crc kubenswrapper[4743]: I0122 14:05:24.374249 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-5774494bd8-6dt7x"] Jan 22 14:05:24 crc kubenswrapper[4743]: I0122 14:05:24.509132 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 22 14:05:24 crc kubenswrapper[4743]: I0122 14:05:24.558556 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wk72v\" (UniqueName: \"kubernetes.io/projected/e94b5f67-2b59-4f2a-bc90-20a355ddb79f-kube-api-access-wk72v\") pod \"e94b5f67-2b59-4f2a-bc90-20a355ddb79f\" (UID: \"e94b5f67-2b59-4f2a-bc90-20a355ddb79f\") " Jan 22 14:05:24 crc kubenswrapper[4743]: I0122 14:05:24.558667 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e94b5f67-2b59-4f2a-bc90-20a355ddb79f-config-data\") pod \"e94b5f67-2b59-4f2a-bc90-20a355ddb79f\" (UID: \"e94b5f67-2b59-4f2a-bc90-20a355ddb79f\") " Jan 22 14:05:24 crc kubenswrapper[4743]: I0122 14:05:24.558836 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e94b5f67-2b59-4f2a-bc90-20a355ddb79f-run-httpd\") pod \"e94b5f67-2b59-4f2a-bc90-20a355ddb79f\" (UID: \"e94b5f67-2b59-4f2a-bc90-20a355ddb79f\") " Jan 22 14:05:24 crc kubenswrapper[4743]: I0122 14:05:24.558897 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e94b5f67-2b59-4f2a-bc90-20a355ddb79f-scripts\") pod \"e94b5f67-2b59-4f2a-bc90-20a355ddb79f\" (UID: \"e94b5f67-2b59-4f2a-bc90-20a355ddb79f\") " Jan 22 14:05:24 crc kubenswrapper[4743]: I0122 14:05:24.558967 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e94b5f67-2b59-4f2a-bc90-20a355ddb79f-combined-ca-bundle\") pod \"e94b5f67-2b59-4f2a-bc90-20a355ddb79f\" (UID: \"e94b5f67-2b59-4f2a-bc90-20a355ddb79f\") " Jan 22 14:05:24 crc kubenswrapper[4743]: I0122 14:05:24.558998 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e94b5f67-2b59-4f2a-bc90-20a355ddb79f-sg-core-conf-yaml\") pod \"e94b5f67-2b59-4f2a-bc90-20a355ddb79f\" (UID: \"e94b5f67-2b59-4f2a-bc90-20a355ddb79f\") " Jan 22 14:05:24 crc kubenswrapper[4743]: I0122 14:05:24.559015 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e94b5f67-2b59-4f2a-bc90-20a355ddb79f-log-httpd\") pod \"e94b5f67-2b59-4f2a-bc90-20a355ddb79f\" (UID: \"e94b5f67-2b59-4f2a-bc90-20a355ddb79f\") " Jan 22 14:05:24 crc kubenswrapper[4743]: I0122 14:05:24.559820 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e94b5f67-2b59-4f2a-bc90-20a355ddb79f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e94b5f67-2b59-4f2a-bc90-20a355ddb79f" (UID: "e94b5f67-2b59-4f2a-bc90-20a355ddb79f"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:05:24 crc kubenswrapper[4743]: I0122 14:05:24.559953 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e94b5f67-2b59-4f2a-bc90-20a355ddb79f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e94b5f67-2b59-4f2a-bc90-20a355ddb79f" (UID: "e94b5f67-2b59-4f2a-bc90-20a355ddb79f"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:05:24 crc kubenswrapper[4743]: I0122 14:05:24.566017 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e94b5f67-2b59-4f2a-bc90-20a355ddb79f-scripts" (OuterVolumeSpecName: "scripts") pod "e94b5f67-2b59-4f2a-bc90-20a355ddb79f" (UID: "e94b5f67-2b59-4f2a-bc90-20a355ddb79f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:05:24 crc kubenswrapper[4743]: I0122 14:05:24.577366 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e94b5f67-2b59-4f2a-bc90-20a355ddb79f-kube-api-access-wk72v" (OuterVolumeSpecName: "kube-api-access-wk72v") pod "e94b5f67-2b59-4f2a-bc90-20a355ddb79f" (UID: "e94b5f67-2b59-4f2a-bc90-20a355ddb79f"). InnerVolumeSpecName "kube-api-access-wk72v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:05:24 crc kubenswrapper[4743]: I0122 14:05:24.596953 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e94b5f67-2b59-4f2a-bc90-20a355ddb79f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e94b5f67-2b59-4f2a-bc90-20a355ddb79f" (UID: "e94b5f67-2b59-4f2a-bc90-20a355ddb79f"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:05:24 crc kubenswrapper[4743]: I0122 14:05:24.634675 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e94b5f67-2b59-4f2a-bc90-20a355ddb79f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e94b5f67-2b59-4f2a-bc90-20a355ddb79f" (UID: "e94b5f67-2b59-4f2a-bc90-20a355ddb79f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:05:24 crc kubenswrapper[4743]: I0122 14:05:24.657332 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6d6df4ffc5-49vw4"] Jan 22 14:05:24 crc kubenswrapper[4743]: I0122 14:05:24.660736 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wk72v\" (UniqueName: \"kubernetes.io/projected/e94b5f67-2b59-4f2a-bc90-20a355ddb79f-kube-api-access-wk72v\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:24 crc kubenswrapper[4743]: I0122 14:05:24.660765 4743 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e94b5f67-2b59-4f2a-bc90-20a355ddb79f-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:24 crc kubenswrapper[4743]: I0122 14:05:24.660776 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e94b5f67-2b59-4f2a-bc90-20a355ddb79f-scripts\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:24 crc kubenswrapper[4743]: I0122 14:05:24.660900 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e94b5f67-2b59-4f2a-bc90-20a355ddb79f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:24 crc kubenswrapper[4743]: I0122 14:05:24.660921 4743 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e94b5f67-2b59-4f2a-bc90-20a355ddb79f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:24 crc kubenswrapper[4743]: I0122 14:05:24.660930 4743 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e94b5f67-2b59-4f2a-bc90-20a355ddb79f-log-httpd\") on 
node \"crc\" DevicePath \"\"" Jan 22 14:05:24 crc kubenswrapper[4743]: W0122 14:05:24.661069 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33cc55b1_3375_4b12_9cd9_c8f34ed7c0f5.slice/crio-3e48461685f4d11b439dbc3d2383aa5d6276bafc62eb60ca3e9f856252439432 WatchSource:0}: Error finding container 3e48461685f4d11b439dbc3d2383aa5d6276bafc62eb60ca3e9f856252439432: Status 404 returned error can't find the container with id 3e48461685f4d11b439dbc3d2383aa5d6276bafc62eb60ca3e9f856252439432 Jan 22 14:05:24 crc kubenswrapper[4743]: I0122 14:05:24.687062 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e94b5f67-2b59-4f2a-bc90-20a355ddb79f-config-data" (OuterVolumeSpecName: "config-data") pod "e94b5f67-2b59-4f2a-bc90-20a355ddb79f" (UID: "e94b5f67-2b59-4f2a-bc90-20a355ddb79f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:05:24 crc kubenswrapper[4743]: I0122 14:05:24.764074 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e94b5f67-2b59-4f2a-bc90-20a355ddb79f-config-data\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:25 crc kubenswrapper[4743]: I0122 14:05:25.336292 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e94b5f67-2b59-4f2a-bc90-20a355ddb79f","Type":"ContainerDied","Data":"040ba91cfda2cb68a89f8b1e4cdabbe00b1f0cda3639385325f38449a400175e"} Jan 22 14:05:25 crc kubenswrapper[4743]: I0122 14:05:25.336591 4743 scope.go:117] "RemoveContainer" containerID="945789873f0895ce51715990de02fe0f7f1ae149b4318cc184e66adf86945b54" Jan 22 14:05:25 crc kubenswrapper[4743]: I0122 14:05:25.336366 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 22 14:05:25 crc kubenswrapper[4743]: I0122 14:05:25.341762 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"41abc04c-e711-4e34-a0b0-085b7b09d94d","Type":"ContainerStarted","Data":"bc08ab349e001efec9dd7722fdf9f1ab5d4a71a964f7b8c8029530fec76cf85d"} Jan 22 14:05:25 crc kubenswrapper[4743]: I0122 14:05:25.343734 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6d6df4ffc5-49vw4" event={"ID":"33cc55b1-3375-4b12-9cd9-c8f34ed7c0f5","Type":"ContainerStarted","Data":"fb489ef837051da82189b930353b3baf47540c9be9400bf416f61cec96b6a94b"} Jan 22 14:05:25 crc kubenswrapper[4743]: I0122 14:05:25.343775 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6d6df4ffc5-49vw4" event={"ID":"33cc55b1-3375-4b12-9cd9-c8f34ed7c0f5","Type":"ContainerStarted","Data":"3e48461685f4d11b439dbc3d2383aa5d6276bafc62eb60ca3e9f856252439432"} Jan 22 14:05:25 crc kubenswrapper[4743]: I0122 14:05:25.359918 4743 scope.go:117] "RemoveContainer" containerID="00e0f26654334c4922acea2c86a35f807f4e288663ff54601bd90009e725164f" Jan 22 14:05:25 crc kubenswrapper[4743]: I0122 14:05:25.366151 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.546176928 podStartE2EDuration="12.366132549s" podCreationTimestamp="2026-01-22 14:05:13 +0000 UTC" firstStartedPulling="2026-01-22 14:05:14.180641431 +0000 UTC m=+1150.735684594" lastFinishedPulling="2026-01-22 14:05:24.000597042 +0000 UTC m=+1160.555640215" observedRunningTime="2026-01-22 14:05:25.360125015 +0000 UTC m=+1161.915168178" watchObservedRunningTime="2026-01-22 14:05:25.366132549 +0000 UTC m=+1161.921175712" Jan 22 14:05:25 crc kubenswrapper[4743]: I0122 14:05:25.387866 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 22 14:05:25 crc kubenswrapper[4743]: I0122 14:05:25.391016 4743 scope.go:117] "RemoveContainer" containerID="8e297796679ee08d29bfef321f5c231e33f9052b8c1995bd3ca6d2987c284e6e" Jan 22 14:05:25 crc kubenswrapper[4743]: I0122 14:05:25.394722 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 22 14:05:25 crc kubenswrapper[4743]: I0122 14:05:25.427020 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 22 14:05:25 crc kubenswrapper[4743]: E0122 14:05:25.427751 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e94b5f67-2b59-4f2a-bc90-20a355ddb79f" containerName="ceilometer-notification-agent" Jan 22 14:05:25 crc kubenswrapper[4743]: I0122 14:05:25.427771 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e94b5f67-2b59-4f2a-bc90-20a355ddb79f" containerName="ceilometer-notification-agent" Jan 22 14:05:25 crc kubenswrapper[4743]: E0122 14:05:25.427802 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e94b5f67-2b59-4f2a-bc90-20a355ddb79f" containerName="proxy-httpd" Jan 22 14:05:25 crc kubenswrapper[4743]: I0122 14:05:25.427811 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e94b5f67-2b59-4f2a-bc90-20a355ddb79f" containerName="proxy-httpd" Jan 22 14:05:25 crc kubenswrapper[4743]: E0122 14:05:25.427842 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e94b5f67-2b59-4f2a-bc90-20a355ddb79f" containerName="sg-core" Jan 22 14:05:25 crc kubenswrapper[4743]: I0122 14:05:25.427850 4743 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e94b5f67-2b59-4f2a-bc90-20a355ddb79f" containerName="sg-core" Jan 22 14:05:25 crc kubenswrapper[4743]: E0122 14:05:25.427869 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e94b5f67-2b59-4f2a-bc90-20a355ddb79f" containerName="ceilometer-central-agent" Jan 22 14:05:25 crc kubenswrapper[4743]: I0122 14:05:25.427876 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e94b5f67-2b59-4f2a-bc90-20a355ddb79f" containerName="ceilometer-central-agent" Jan 22 14:05:25 crc kubenswrapper[4743]: I0122 14:05:25.428106 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="e94b5f67-2b59-4f2a-bc90-20a355ddb79f" containerName="sg-core" Jan 22 14:05:25 crc kubenswrapper[4743]: I0122 14:05:25.428121 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="e94b5f67-2b59-4f2a-bc90-20a355ddb79f" containerName="ceilometer-notification-agent" Jan 22 14:05:25 crc kubenswrapper[4743]: I0122 14:05:25.428136 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="e94b5f67-2b59-4f2a-bc90-20a355ddb79f" containerName="proxy-httpd" Jan 22 14:05:25 crc kubenswrapper[4743]: I0122 14:05:25.428147 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="e94b5f67-2b59-4f2a-bc90-20a355ddb79f" containerName="ceilometer-central-agent" Jan 22 14:05:25 crc kubenswrapper[4743]: I0122 14:05:25.430097 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 22 14:05:25 crc kubenswrapper[4743]: I0122 14:05:25.434988 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 22 14:05:25 crc kubenswrapper[4743]: I0122 14:05:25.435552 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 22 14:05:25 crc kubenswrapper[4743]: I0122 14:05:25.436040 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 22 14:05:25 crc kubenswrapper[4743]: I0122 14:05:25.437646 4743 scope.go:117] "RemoveContainer" containerID="ee132547c0b6c54c83fb2577c65fda82adaed445f3841289117598de79cf80d0" Jan 22 14:05:25 crc kubenswrapper[4743]: I0122 14:05:25.481034 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9884817b-7cf0-4495-8333-c5ee23796e4c-config-data\") pod \"ceilometer-0\" (UID: \"9884817b-7cf0-4495-8333-c5ee23796e4c\") " pod="openstack/ceilometer-0" Jan 22 14:05:25 crc kubenswrapper[4743]: I0122 14:05:25.481084 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9884817b-7cf0-4495-8333-c5ee23796e4c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9884817b-7cf0-4495-8333-c5ee23796e4c\") " pod="openstack/ceilometer-0" Jan 22 14:05:25 crc kubenswrapper[4743]: I0122 14:05:25.481118 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9884817b-7cf0-4495-8333-c5ee23796e4c-scripts\") pod \"ceilometer-0\" (UID: \"9884817b-7cf0-4495-8333-c5ee23796e4c\") " pod="openstack/ceilometer-0" Jan 22 14:05:25 crc kubenswrapper[4743]: I0122 14:05:25.481154 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qgzh\" (UniqueName: \"kubernetes.io/projected/9884817b-7cf0-4495-8333-c5ee23796e4c-kube-api-access-7qgzh\") pod \"ceilometer-0\" (UID: 
\"9884817b-7cf0-4495-8333-c5ee23796e4c\") " pod="openstack/ceilometer-0" Jan 22 14:05:25 crc kubenswrapper[4743]: I0122 14:05:25.481182 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9884817b-7cf0-4495-8333-c5ee23796e4c-run-httpd\") pod \"ceilometer-0\" (UID: \"9884817b-7cf0-4495-8333-c5ee23796e4c\") " pod="openstack/ceilometer-0" Jan 22 14:05:25 crc kubenswrapper[4743]: I0122 14:05:25.481255 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9884817b-7cf0-4495-8333-c5ee23796e4c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9884817b-7cf0-4495-8333-c5ee23796e4c\") " pod="openstack/ceilometer-0" Jan 22 14:05:25 crc kubenswrapper[4743]: I0122 14:05:25.481302 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9884817b-7cf0-4495-8333-c5ee23796e4c-log-httpd\") pod \"ceilometer-0\" (UID: \"9884817b-7cf0-4495-8333-c5ee23796e4c\") " pod="openstack/ceilometer-0" Jan 22 14:05:25 crc kubenswrapper[4743]: I0122 14:05:25.583288 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9884817b-7cf0-4495-8333-c5ee23796e4c-run-httpd\") pod \"ceilometer-0\" (UID: \"9884817b-7cf0-4495-8333-c5ee23796e4c\") " pod="openstack/ceilometer-0" Jan 22 14:05:25 crc kubenswrapper[4743]: I0122 14:05:25.583525 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9884817b-7cf0-4495-8333-c5ee23796e4c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9884817b-7cf0-4495-8333-c5ee23796e4c\") " pod="openstack/ceilometer-0" Jan 22 14:05:25 crc kubenswrapper[4743]: I0122 14:05:25.583557 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9884817b-7cf0-4495-8333-c5ee23796e4c-log-httpd\") pod \"ceilometer-0\" (UID: \"9884817b-7cf0-4495-8333-c5ee23796e4c\") " pod="openstack/ceilometer-0" Jan 22 14:05:25 crc kubenswrapper[4743]: I0122 14:05:25.583620 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9884817b-7cf0-4495-8333-c5ee23796e4c-config-data\") pod \"ceilometer-0\" (UID: \"9884817b-7cf0-4495-8333-c5ee23796e4c\") " pod="openstack/ceilometer-0" Jan 22 14:05:25 crc kubenswrapper[4743]: I0122 14:05:25.583654 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9884817b-7cf0-4495-8333-c5ee23796e4c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9884817b-7cf0-4495-8333-c5ee23796e4c\") " pod="openstack/ceilometer-0" Jan 22 14:05:25 crc kubenswrapper[4743]: I0122 14:05:25.583685 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9884817b-7cf0-4495-8333-c5ee23796e4c-scripts\") pod \"ceilometer-0\" (UID: \"9884817b-7cf0-4495-8333-c5ee23796e4c\") " pod="openstack/ceilometer-0" Jan 22 14:05:25 crc kubenswrapper[4743]: I0122 14:05:25.583729 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qgzh\" (UniqueName: 
\"kubernetes.io/projected/9884817b-7cf0-4495-8333-c5ee23796e4c-kube-api-access-7qgzh\") pod \"ceilometer-0\" (UID: \"9884817b-7cf0-4495-8333-c5ee23796e4c\") " pod="openstack/ceilometer-0" Jan 22 14:05:25 crc kubenswrapper[4743]: I0122 14:05:25.584323 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9884817b-7cf0-4495-8333-c5ee23796e4c-log-httpd\") pod \"ceilometer-0\" (UID: \"9884817b-7cf0-4495-8333-c5ee23796e4c\") " pod="openstack/ceilometer-0" Jan 22 14:05:25 crc kubenswrapper[4743]: I0122 14:05:25.584671 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9884817b-7cf0-4495-8333-c5ee23796e4c-run-httpd\") pod \"ceilometer-0\" (UID: \"9884817b-7cf0-4495-8333-c5ee23796e4c\") " pod="openstack/ceilometer-0" Jan 22 14:05:25 crc kubenswrapper[4743]: I0122 14:05:25.587808 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9884817b-7cf0-4495-8333-c5ee23796e4c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9884817b-7cf0-4495-8333-c5ee23796e4c\") " pod="openstack/ceilometer-0" Jan 22 14:05:25 crc kubenswrapper[4743]: I0122 14:05:25.588701 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9884817b-7cf0-4495-8333-c5ee23796e4c-config-data\") pod \"ceilometer-0\" (UID: \"9884817b-7cf0-4495-8333-c5ee23796e4c\") " pod="openstack/ceilometer-0" Jan 22 14:05:25 crc kubenswrapper[4743]: I0122 14:05:25.589661 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9884817b-7cf0-4495-8333-c5ee23796e4c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9884817b-7cf0-4495-8333-c5ee23796e4c\") " pod="openstack/ceilometer-0" Jan 22 14:05:25 crc kubenswrapper[4743]: I0122 14:05:25.599973 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9884817b-7cf0-4495-8333-c5ee23796e4c-scripts\") pod \"ceilometer-0\" (UID: \"9884817b-7cf0-4495-8333-c5ee23796e4c\") " pod="openstack/ceilometer-0" Jan 22 14:05:25 crc kubenswrapper[4743]: I0122 14:05:25.608674 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qgzh\" (UniqueName: \"kubernetes.io/projected/9884817b-7cf0-4495-8333-c5ee23796e4c-kube-api-access-7qgzh\") pod \"ceilometer-0\" (UID: \"9884817b-7cf0-4495-8333-c5ee23796e4c\") " pod="openstack/ceilometer-0" Jan 22 14:05:25 crc kubenswrapper[4743]: I0122 14:05:25.722660 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 22 14:05:25 crc kubenswrapper[4743]: I0122 14:05:25.723601 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 22 14:05:25 crc kubenswrapper[4743]: I0122 14:05:25.770672 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e94b5f67-2b59-4f2a-bc90-20a355ddb79f" path="/var/lib/kubelet/pods/e94b5f67-2b59-4f2a-bc90-20a355ddb79f/volumes" Jan 22 14:05:25 crc kubenswrapper[4743]: I0122 14:05:25.771473 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec33de7c-5eab-46d0-a702-af5fbd2ebe50" path="/var/lib/kubelet/pods/ec33de7c-5eab-46d0-a702-af5fbd2ebe50/volumes" Jan 22 14:05:26 crc kubenswrapper[4743]: I0122 14:05:26.221139 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 22 14:05:26 crc kubenswrapper[4743]: W0122 14:05:26.233063 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9884817b_7cf0_4495_8333_c5ee23796e4c.slice/crio-c4bd3d14416cbbd98f4aed63304262505fa274aceb412912bfe527d8324a2450 WatchSource:0}: Error finding container c4bd3d14416cbbd98f4aed63304262505fa274aceb412912bfe527d8324a2450: Status 404 returned error can't find the container with id c4bd3d14416cbbd98f4aed63304262505fa274aceb412912bfe527d8324a2450 Jan 22 14:05:26 crc kubenswrapper[4743]: I0122 14:05:26.354648 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6d6df4ffc5-49vw4" event={"ID":"33cc55b1-3375-4b12-9cd9-c8f34ed7c0f5","Type":"ContainerStarted","Data":"07f0862946db39b509215eb8edac80385a561b6c08e8f256dd5d7ad3f6f417c4"} Jan 22 14:05:26 crc kubenswrapper[4743]: I0122 14:05:26.356605 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6d6df4ffc5-49vw4" Jan 22 14:05:26 crc kubenswrapper[4743]: I0122 14:05:26.356655 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6d6df4ffc5-49vw4" Jan 22 14:05:26 crc kubenswrapper[4743]: I0122 14:05:26.361829 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9884817b-7cf0-4495-8333-c5ee23796e4c","Type":"ContainerStarted","Data":"c4bd3d14416cbbd98f4aed63304262505fa274aceb412912bfe527d8324a2450"} Jan 22 14:05:26 crc kubenswrapper[4743]: I0122 14:05:26.397955 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-6d6df4ffc5-49vw4" podStartSLOduration=8.397923423 podStartE2EDuration="8.397923423s" podCreationTimestamp="2026-01-22 14:05:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:05:26.390367758 +0000 UTC m=+1162.945410931" watchObservedRunningTime="2026-01-22 14:05:26.397923423 +0000 UTC m=+1162.952966586" Jan 22 14:05:27 crc kubenswrapper[4743]: I0122 14:05:27.371276 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9884817b-7cf0-4495-8333-c5ee23796e4c","Type":"ContainerStarted","Data":"713af1b4ec230c16cdb3eddd4bfc35785c2e3378fb5fcaeea472eaec4659461f"} Jan 22 14:05:28 crc kubenswrapper[4743]: I0122 14:05:28.389389 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9884817b-7cf0-4495-8333-c5ee23796e4c","Type":"ContainerStarted","Data":"078ac27394f9fff68513124d88e902e9727547466628960c8cdc35f2212382bb"} Jan 22 14:05:29 crc kubenswrapper[4743]: I0122 14:05:29.400053 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"9884817b-7cf0-4495-8333-c5ee23796e4c","Type":"ContainerStarted","Data":"10a1272b52eb776562992f9eb0adb59661b4af7634fab61a0530e84987e9ef64"} Jan 22 14:05:30 crc kubenswrapper[4743]: I0122 14:05:30.343540 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-999bfcdc8-ldzdp" podUID="dff52751-78f1-4c39-aa95-5d74a246151e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.145:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.145:8443: connect: connection refused" Jan 22 14:05:30 crc kubenswrapper[4743]: I0122 14:05:30.343659 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-999bfcdc8-ldzdp" Jan 22 14:05:30 crc kubenswrapper[4743]: I0122 14:05:30.413362 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9884817b-7cf0-4495-8333-c5ee23796e4c","Type":"ContainerStarted","Data":"1f32d65f2422ae8296d9d22400c8ea0dfa0b052566f6b0c341c7ae5d0920ad7c"} Jan 22 14:05:30 crc kubenswrapper[4743]: I0122 14:05:30.413525 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9884817b-7cf0-4495-8333-c5ee23796e4c" containerName="ceilometer-central-agent" containerID="cri-o://713af1b4ec230c16cdb3eddd4bfc35785c2e3378fb5fcaeea472eaec4659461f" gracePeriod=30 Jan 22 14:05:30 crc kubenswrapper[4743]: I0122 14:05:30.413774 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 22 14:05:30 crc kubenswrapper[4743]: I0122 14:05:30.413556 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9884817b-7cf0-4495-8333-c5ee23796e4c" containerName="proxy-httpd" containerID="cri-o://1f32d65f2422ae8296d9d22400c8ea0dfa0b052566f6b0c341c7ae5d0920ad7c" gracePeriod=30 Jan 22 14:05:30 crc kubenswrapper[4743]: I0122 14:05:30.413575 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9884817b-7cf0-4495-8333-c5ee23796e4c" containerName="ceilometer-notification-agent" containerID="cri-o://078ac27394f9fff68513124d88e902e9727547466628960c8cdc35f2212382bb" gracePeriod=30 Jan 22 14:05:30 crc kubenswrapper[4743]: I0122 14:05:30.413561 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9884817b-7cf0-4495-8333-c5ee23796e4c" containerName="sg-core" containerID="cri-o://10a1272b52eb776562992f9eb0adb59661b4af7634fab61a0530e84987e9ef64" gracePeriod=30 Jan 22 14:05:30 crc kubenswrapper[4743]: I0122 14:05:30.445233 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.029166049 podStartE2EDuration="5.445208225s" podCreationTimestamp="2026-01-22 14:05:25 +0000 UTC" firstStartedPulling="2026-01-22 14:05:26.238447331 +0000 UTC m=+1162.793490494" lastFinishedPulling="2026-01-22 14:05:29.654489497 +0000 UTC m=+1166.209532670" observedRunningTime="2026-01-22 14:05:30.43727376 +0000 UTC m=+1166.992316923" watchObservedRunningTime="2026-01-22 14:05:30.445208225 +0000 UTC m=+1167.000251388" Jan 22 14:05:31 crc kubenswrapper[4743]: I0122 14:05:31.466267 4743 generic.go:334] "Generic (PLEG): container finished" podID="9884817b-7cf0-4495-8333-c5ee23796e4c" containerID="1f32d65f2422ae8296d9d22400c8ea0dfa0b052566f6b0c341c7ae5d0920ad7c" exitCode=0 Jan 22 14:05:31 crc kubenswrapper[4743]: I0122 14:05:31.467210 4743 generic.go:334] "Generic (PLEG): container 
finished" podID="9884817b-7cf0-4495-8333-c5ee23796e4c" containerID="10a1272b52eb776562992f9eb0adb59661b4af7634fab61a0530e84987e9ef64" exitCode=2 Jan 22 14:05:31 crc kubenswrapper[4743]: I0122 14:05:31.467299 4743 generic.go:334] "Generic (PLEG): container finished" podID="9884817b-7cf0-4495-8333-c5ee23796e4c" containerID="078ac27394f9fff68513124d88e902e9727547466628960c8cdc35f2212382bb" exitCode=0 Jan 22 14:05:31 crc kubenswrapper[4743]: I0122 14:05:31.467383 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9884817b-7cf0-4495-8333-c5ee23796e4c","Type":"ContainerDied","Data":"1f32d65f2422ae8296d9d22400c8ea0dfa0b052566f6b0c341c7ae5d0920ad7c"} Jan 22 14:05:31 crc kubenswrapper[4743]: I0122 14:05:31.467482 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9884817b-7cf0-4495-8333-c5ee23796e4c","Type":"ContainerDied","Data":"10a1272b52eb776562992f9eb0adb59661b4af7634fab61a0530e84987e9ef64"} Jan 22 14:05:31 crc kubenswrapper[4743]: I0122 14:05:31.467569 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9884817b-7cf0-4495-8333-c5ee23796e4c","Type":"ContainerDied","Data":"078ac27394f9fff68513124d88e902e9727547466628960c8cdc35f2212382bb"} Jan 22 14:05:32 crc kubenswrapper[4743]: I0122 14:05:32.954520 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7dd566fb89-mgkw8" Jan 22 14:05:33 crc kubenswrapper[4743]: I0122 14:05:33.016579 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7bd7ccdcfb-sx7wp"] Jan 22 14:05:33 crc kubenswrapper[4743]: I0122 14:05:33.017039 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7bd7ccdcfb-sx7wp" podUID="de2c5f93-9ee3-4723-8123-bd48d5385423" containerName="neutron-api" containerID="cri-o://1ae6a9f97c3b7fc665caf057bbe7901477be00bed6bc5e379b53f6b4f6925f62" gracePeriod=30 Jan 22 14:05:33 crc kubenswrapper[4743]: I0122 14:05:33.017204 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7bd7ccdcfb-sx7wp" podUID="de2c5f93-9ee3-4723-8123-bd48d5385423" containerName="neutron-httpd" containerID="cri-o://ef9ffc5cb4c24c27363689c60af953e2196889a603ec04ab66b8826de1d12f2b" gracePeriod=30 Jan 22 14:05:33 crc kubenswrapper[4743]: I0122 14:05:33.434443 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6d6df4ffc5-49vw4" Jan 22 14:05:33 crc kubenswrapper[4743]: I0122 14:05:33.441698 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6d6df4ffc5-49vw4" Jan 22 14:05:33 crc kubenswrapper[4743]: I0122 14:05:33.483352 4743 generic.go:334] "Generic (PLEG): container finished" podID="de2c5f93-9ee3-4723-8123-bd48d5385423" containerID="ef9ffc5cb4c24c27363689c60af953e2196889a603ec04ab66b8826de1d12f2b" exitCode=0 Jan 22 14:05:33 crc kubenswrapper[4743]: I0122 14:05:33.483428 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7bd7ccdcfb-sx7wp" event={"ID":"de2c5f93-9ee3-4723-8123-bd48d5385423","Type":"ContainerDied","Data":"ef9ffc5cb4c24c27363689c60af953e2196889a603ec04ab66b8826de1d12f2b"} Jan 22 14:05:34 crc kubenswrapper[4743]: I0122 14:05:34.262343 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 22 14:05:34 crc kubenswrapper[4743]: I0122 14:05:34.262866 4743 kuberuntime_container.go:808] "Killing container with 
a grace period" pod="openstack/glance-default-external-api-0" podUID="126e7829-d6f7-4443-b4f6-02669ff5fbc7" containerName="glance-log" containerID="cri-o://b21504d9ca462caba9d729f866e841df9037b75bb1f78b2711e8c4e122634d6b" gracePeriod=30 Jan 22 14:05:34 crc kubenswrapper[4743]: I0122 14:05:34.263047 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="126e7829-d6f7-4443-b4f6-02669ff5fbc7" containerName="glance-httpd" containerID="cri-o://29c1e5e36d9d852d81dd4508b41dfb0eff77adb4114dae21485004a359a2f0f7" gracePeriod=30 Jan 22 14:05:34 crc kubenswrapper[4743]: I0122 14:05:34.491886 4743 generic.go:334] "Generic (PLEG): container finished" podID="126e7829-d6f7-4443-b4f6-02669ff5fbc7" containerID="b21504d9ca462caba9d729f866e841df9037b75bb1f78b2711e8c4e122634d6b" exitCode=143 Jan 22 14:05:34 crc kubenswrapper[4743]: I0122 14:05:34.491926 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"126e7829-d6f7-4443-b4f6-02669ff5fbc7","Type":"ContainerDied","Data":"b21504d9ca462caba9d729f866e841df9037b75bb1f78b2711e8c4e122634d6b"} Jan 22 14:05:34 crc kubenswrapper[4743]: I0122 14:05:34.793491 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-bkr2b"] Jan 22 14:05:34 crc kubenswrapper[4743]: I0122 14:05:34.801284 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-bkr2b" Jan 22 14:05:34 crc kubenswrapper[4743]: I0122 14:05:34.816517 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-bkr2b"] Jan 22 14:05:34 crc kubenswrapper[4743]: I0122 14:05:34.900209 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dx56q\" (UniqueName: \"kubernetes.io/projected/da15c36a-41ce-424b-a5bf-e6fec2d1b4c7-kube-api-access-dx56q\") pod \"nova-api-db-create-bkr2b\" (UID: \"da15c36a-41ce-424b-a5bf-e6fec2d1b4c7\") " pod="openstack/nova-api-db-create-bkr2b" Jan 22 14:05:34 crc kubenswrapper[4743]: I0122 14:05:34.900460 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da15c36a-41ce-424b-a5bf-e6fec2d1b4c7-operator-scripts\") pod \"nova-api-db-create-bkr2b\" (UID: \"da15c36a-41ce-424b-a5bf-e6fec2d1b4c7\") " pod="openstack/nova-api-db-create-bkr2b" Jan 22 14:05:34 crc kubenswrapper[4743]: I0122 14:05:34.930994 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-hpj2q"] Jan 22 14:05:34 crc kubenswrapper[4743]: I0122 14:05:34.932482 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-hpj2q" Jan 22 14:05:34 crc kubenswrapper[4743]: I0122 14:05:34.945273 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-hpj2q"] Jan 22 14:05:34 crc kubenswrapper[4743]: I0122 14:05:34.981451 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-586d-account-create-update-nl5vc"] Jan 22 14:05:34 crc kubenswrapper[4743]: I0122 14:05:34.992766 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-586d-account-create-update-nl5vc" Jan 22 14:05:34 crc kubenswrapper[4743]: I0122 14:05:34.995593 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Jan 22 14:05:35 crc kubenswrapper[4743]: I0122 14:05:35.000853 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-586d-account-create-update-nl5vc"] Jan 22 14:05:35 crc kubenswrapper[4743]: I0122 14:05:35.002151 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qr76f\" (UniqueName: \"kubernetes.io/projected/f55bf894-5195-430c-acc8-06875cadcdff-kube-api-access-qr76f\") pod \"nova-cell0-db-create-hpj2q\" (UID: \"f55bf894-5195-430c-acc8-06875cadcdff\") " pod="openstack/nova-cell0-db-create-hpj2q" Jan 22 14:05:35 crc kubenswrapper[4743]: I0122 14:05:35.002324 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dx56q\" (UniqueName: \"kubernetes.io/projected/da15c36a-41ce-424b-a5bf-e6fec2d1b4c7-kube-api-access-dx56q\") pod \"nova-api-db-create-bkr2b\" (UID: \"da15c36a-41ce-424b-a5bf-e6fec2d1b4c7\") " pod="openstack/nova-api-db-create-bkr2b" Jan 22 14:05:35 crc kubenswrapper[4743]: I0122 14:05:35.002461 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da15c36a-41ce-424b-a5bf-e6fec2d1b4c7-operator-scripts\") pod \"nova-api-db-create-bkr2b\" (UID: \"da15c36a-41ce-424b-a5bf-e6fec2d1b4c7\") " pod="openstack/nova-api-db-create-bkr2b" Jan 22 14:05:35 crc kubenswrapper[4743]: I0122 14:05:35.002642 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f55bf894-5195-430c-acc8-06875cadcdff-operator-scripts\") pod \"nova-cell0-db-create-hpj2q\" (UID: \"f55bf894-5195-430c-acc8-06875cadcdff\") " pod="openstack/nova-cell0-db-create-hpj2q" Jan 22 14:05:35 crc kubenswrapper[4743]: I0122 14:05:35.003399 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da15c36a-41ce-424b-a5bf-e6fec2d1b4c7-operator-scripts\") pod \"nova-api-db-create-bkr2b\" (UID: \"da15c36a-41ce-424b-a5bf-e6fec2d1b4c7\") " pod="openstack/nova-api-db-create-bkr2b" Jan 22 14:05:35 crc kubenswrapper[4743]: I0122 14:05:35.077092 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dx56q\" (UniqueName: \"kubernetes.io/projected/da15c36a-41ce-424b-a5bf-e6fec2d1b4c7-kube-api-access-dx56q\") pod \"nova-api-db-create-bkr2b\" (UID: \"da15c36a-41ce-424b-a5bf-e6fec2d1b4c7\") " pod="openstack/nova-api-db-create-bkr2b" Jan 22 14:05:35 crc kubenswrapper[4743]: I0122 14:05:35.105468 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f55bf894-5195-430c-acc8-06875cadcdff-operator-scripts\") pod \"nova-cell0-db-create-hpj2q\" (UID: \"f55bf894-5195-430c-acc8-06875cadcdff\") " pod="openstack/nova-cell0-db-create-hpj2q" Jan 22 14:05:35 crc kubenswrapper[4743]: I0122 14:05:35.105845 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qr76f\" (UniqueName: \"kubernetes.io/projected/f55bf894-5195-430c-acc8-06875cadcdff-kube-api-access-qr76f\") pod \"nova-cell0-db-create-hpj2q\" (UID: \"f55bf894-5195-430c-acc8-06875cadcdff\") " 
pod="openstack/nova-cell0-db-create-hpj2q" Jan 22 14:05:35 crc kubenswrapper[4743]: I0122 14:05:35.105998 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43e926de-d57c-4d5b-82a6-2a28f645f18d-operator-scripts\") pod \"nova-api-586d-account-create-update-nl5vc\" (UID: \"43e926de-d57c-4d5b-82a6-2a28f645f18d\") " pod="openstack/nova-api-586d-account-create-update-nl5vc" Jan 22 14:05:35 crc kubenswrapper[4743]: I0122 14:05:35.106160 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rl9z\" (UniqueName: \"kubernetes.io/projected/43e926de-d57c-4d5b-82a6-2a28f645f18d-kube-api-access-4rl9z\") pod \"nova-api-586d-account-create-update-nl5vc\" (UID: \"43e926de-d57c-4d5b-82a6-2a28f645f18d\") " pod="openstack/nova-api-586d-account-create-update-nl5vc" Jan 22 14:05:35 crc kubenswrapper[4743]: I0122 14:05:35.107104 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f55bf894-5195-430c-acc8-06875cadcdff-operator-scripts\") pod \"nova-cell0-db-create-hpj2q\" (UID: \"f55bf894-5195-430c-acc8-06875cadcdff\") " pod="openstack/nova-cell0-db-create-hpj2q" Jan 22 14:05:35 crc kubenswrapper[4743]: I0122 14:05:35.130178 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-bkr2b" Jan 22 14:05:35 crc kubenswrapper[4743]: I0122 14:05:35.143381 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qr76f\" (UniqueName: \"kubernetes.io/projected/f55bf894-5195-430c-acc8-06875cadcdff-kube-api-access-qr76f\") pod \"nova-cell0-db-create-hpj2q\" (UID: \"f55bf894-5195-430c-acc8-06875cadcdff\") " pod="openstack/nova-cell0-db-create-hpj2q" Jan 22 14:05:35 crc kubenswrapper[4743]: I0122 14:05:35.166975 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-tbj22"] Jan 22 14:05:35 crc kubenswrapper[4743]: I0122 14:05:35.168349 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-tbj22" Jan 22 14:05:35 crc kubenswrapper[4743]: I0122 14:05:35.188660 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-tbj22"] Jan 22 14:05:35 crc kubenswrapper[4743]: I0122 14:05:35.213654 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43e926de-d57c-4d5b-82a6-2a28f645f18d-operator-scripts\") pod \"nova-api-586d-account-create-update-nl5vc\" (UID: \"43e926de-d57c-4d5b-82a6-2a28f645f18d\") " pod="openstack/nova-api-586d-account-create-update-nl5vc" Jan 22 14:05:35 crc kubenswrapper[4743]: I0122 14:05:35.214884 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rl9z\" (UniqueName: \"kubernetes.io/projected/43e926de-d57c-4d5b-82a6-2a28f645f18d-kube-api-access-4rl9z\") pod \"nova-api-586d-account-create-update-nl5vc\" (UID: \"43e926de-d57c-4d5b-82a6-2a28f645f18d\") " pod="openstack/nova-api-586d-account-create-update-nl5vc" Jan 22 14:05:35 crc kubenswrapper[4743]: I0122 14:05:35.214727 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43e926de-d57c-4d5b-82a6-2a28f645f18d-operator-scripts\") pod \"nova-api-586d-account-create-update-nl5vc\" (UID: \"43e926de-d57c-4d5b-82a6-2a28f645f18d\") " pod="openstack/nova-api-586d-account-create-update-nl5vc" Jan 22 14:05:35 crc kubenswrapper[4743]: I0122 14:05:35.245635 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-ec5d-account-create-update-p4vn8"] Jan 22 14:05:35 crc kubenswrapper[4743]: I0122 14:05:35.247433 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rl9z\" (UniqueName: \"kubernetes.io/projected/43e926de-d57c-4d5b-82a6-2a28f645f18d-kube-api-access-4rl9z\") pod \"nova-api-586d-account-create-update-nl5vc\" (UID: \"43e926de-d57c-4d5b-82a6-2a28f645f18d\") " pod="openstack/nova-api-586d-account-create-update-nl5vc" Jan 22 14:05:35 crc kubenswrapper[4743]: I0122 14:05:35.247543 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-ec5d-account-create-update-p4vn8" Jan 22 14:05:35 crc kubenswrapper[4743]: I0122 14:05:35.251025 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Jan 22 14:05:35 crc kubenswrapper[4743]: I0122 14:05:35.272235 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-ec5d-account-create-update-p4vn8"] Jan 22 14:05:35 crc kubenswrapper[4743]: I0122 14:05:35.277632 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-hpj2q" Jan 22 14:05:35 crc kubenswrapper[4743]: I0122 14:05:35.319233 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25cb1f70-dfe0-422d-95f5-b4e7ea4a8004-operator-scripts\") pod \"nova-cell0-ec5d-account-create-update-p4vn8\" (UID: \"25cb1f70-dfe0-422d-95f5-b4e7ea4a8004\") " pod="openstack/nova-cell0-ec5d-account-create-update-p4vn8" Jan 22 14:05:35 crc kubenswrapper[4743]: I0122 14:05:35.319668 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75b5z\" (UniqueName: \"kubernetes.io/projected/1adbc0d6-0108-4060-9299-7d71187ac9e9-kube-api-access-75b5z\") pod \"nova-cell1-db-create-tbj22\" (UID: \"1adbc0d6-0108-4060-9299-7d71187ac9e9\") " pod="openstack/nova-cell1-db-create-tbj22" Jan 22 14:05:35 crc kubenswrapper[4743]: I0122 14:05:35.319779 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1adbc0d6-0108-4060-9299-7d71187ac9e9-operator-scripts\") pod \"nova-cell1-db-create-tbj22\" (UID: \"1adbc0d6-0108-4060-9299-7d71187ac9e9\") " pod="openstack/nova-cell1-db-create-tbj22" Jan 22 14:05:35 crc kubenswrapper[4743]: I0122 14:05:35.319856 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9lqh\" (UniqueName: \"kubernetes.io/projected/25cb1f70-dfe0-422d-95f5-b4e7ea4a8004-kube-api-access-q9lqh\") pod \"nova-cell0-ec5d-account-create-update-p4vn8\" (UID: \"25cb1f70-dfe0-422d-95f5-b4e7ea4a8004\") " pod="openstack/nova-cell0-ec5d-account-create-update-p4vn8" Jan 22 14:05:35 crc kubenswrapper[4743]: I0122 14:05:35.408145 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-7a10-account-create-update-7hg8n"] Jan 22 14:05:35 crc kubenswrapper[4743]: I0122 14:05:35.409896 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-7a10-account-create-update-7hg8n" Jan 22 14:05:35 crc kubenswrapper[4743]: I0122 14:05:35.412451 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Jan 22 14:05:35 crc kubenswrapper[4743]: I0122 14:05:35.416955 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-7a10-account-create-update-7hg8n"] Jan 22 14:05:35 crc kubenswrapper[4743]: I0122 14:05:35.432547 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75b5z\" (UniqueName: \"kubernetes.io/projected/1adbc0d6-0108-4060-9299-7d71187ac9e9-kube-api-access-75b5z\") pod \"nova-cell1-db-create-tbj22\" (UID: \"1adbc0d6-0108-4060-9299-7d71187ac9e9\") " pod="openstack/nova-cell1-db-create-tbj22" Jan 22 14:05:35 crc kubenswrapper[4743]: I0122 14:05:35.432657 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1adbc0d6-0108-4060-9299-7d71187ac9e9-operator-scripts\") pod \"nova-cell1-db-create-tbj22\" (UID: \"1adbc0d6-0108-4060-9299-7d71187ac9e9\") " pod="openstack/nova-cell1-db-create-tbj22" Jan 22 14:05:35 crc kubenswrapper[4743]: I0122 14:05:35.432709 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9lqh\" (UniqueName: \"kubernetes.io/projected/25cb1f70-dfe0-422d-95f5-b4e7ea4a8004-kube-api-access-q9lqh\") pod \"nova-cell0-ec5d-account-create-update-p4vn8\" (UID: \"25cb1f70-dfe0-422d-95f5-b4e7ea4a8004\") " pod="openstack/nova-cell0-ec5d-account-create-update-p4vn8" Jan 22 14:05:35 crc kubenswrapper[4743]: I0122 14:05:35.432759 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25cb1f70-dfe0-422d-95f5-b4e7ea4a8004-operator-scripts\") pod \"nova-cell0-ec5d-account-create-update-p4vn8\" (UID: \"25cb1f70-dfe0-422d-95f5-b4e7ea4a8004\") " pod="openstack/nova-cell0-ec5d-account-create-update-p4vn8" Jan 22 14:05:35 crc kubenswrapper[4743]: I0122 14:05:35.433499 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25cb1f70-dfe0-422d-95f5-b4e7ea4a8004-operator-scripts\") pod \"nova-cell0-ec5d-account-create-update-p4vn8\" (UID: \"25cb1f70-dfe0-422d-95f5-b4e7ea4a8004\") " pod="openstack/nova-cell0-ec5d-account-create-update-p4vn8" Jan 22 14:05:35 crc kubenswrapper[4743]: I0122 14:05:35.433675 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1adbc0d6-0108-4060-9299-7d71187ac9e9-operator-scripts\") pod \"nova-cell1-db-create-tbj22\" (UID: \"1adbc0d6-0108-4060-9299-7d71187ac9e9\") " pod="openstack/nova-cell1-db-create-tbj22" Jan 22 14:05:35 crc kubenswrapper[4743]: I0122 14:05:35.451115 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9lqh\" (UniqueName: \"kubernetes.io/projected/25cb1f70-dfe0-422d-95f5-b4e7ea4a8004-kube-api-access-q9lqh\") pod \"nova-cell0-ec5d-account-create-update-p4vn8\" (UID: \"25cb1f70-dfe0-422d-95f5-b4e7ea4a8004\") " pod="openstack/nova-cell0-ec5d-account-create-update-p4vn8" Jan 22 14:05:35 crc kubenswrapper[4743]: I0122 14:05:35.451547 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75b5z\" (UniqueName: 
\"kubernetes.io/projected/1adbc0d6-0108-4060-9299-7d71187ac9e9-kube-api-access-75b5z\") pod \"nova-cell1-db-create-tbj22\" (UID: \"1adbc0d6-0108-4060-9299-7d71187ac9e9\") " pod="openstack/nova-cell1-db-create-tbj22" Jan 22 14:05:35 crc kubenswrapper[4743]: I0122 14:05:35.501817 4743 generic.go:334] "Generic (PLEG): container finished" podID="dff52751-78f1-4c39-aa95-5d74a246151e" containerID="ac18ac033baaefe887d069af8ebe8ff96c5cd0940424da66bfb23d7a56c5a2f7" exitCode=137 Jan 22 14:05:35 crc kubenswrapper[4743]: I0122 14:05:35.501871 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-999bfcdc8-ldzdp" event={"ID":"dff52751-78f1-4c39-aa95-5d74a246151e","Type":"ContainerDied","Data":"ac18ac033baaefe887d069af8ebe8ff96c5cd0940424da66bfb23d7a56c5a2f7"} Jan 22 14:05:35 crc kubenswrapper[4743]: I0122 14:05:35.505290 4743 generic.go:334] "Generic (PLEG): container finished" podID="9884817b-7cf0-4495-8333-c5ee23796e4c" containerID="713af1b4ec230c16cdb3eddd4bfc35785c2e3378fb5fcaeea472eaec4659461f" exitCode=0 Jan 22 14:05:35 crc kubenswrapper[4743]: I0122 14:05:35.505326 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9884817b-7cf0-4495-8333-c5ee23796e4c","Type":"ContainerDied","Data":"713af1b4ec230c16cdb3eddd4bfc35785c2e3378fb5fcaeea472eaec4659461f"} Jan 22 14:05:35 crc kubenswrapper[4743]: I0122 14:05:35.534809 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsg5t\" (UniqueName: \"kubernetes.io/projected/07bcd65e-dcf5-4778-b34d-fba3728e8616-kube-api-access-xsg5t\") pod \"nova-cell1-7a10-account-create-update-7hg8n\" (UID: \"07bcd65e-dcf5-4778-b34d-fba3728e8616\") " pod="openstack/nova-cell1-7a10-account-create-update-7hg8n" Jan 22 14:05:35 crc kubenswrapper[4743]: I0122 14:05:35.534908 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/07bcd65e-dcf5-4778-b34d-fba3728e8616-operator-scripts\") pod \"nova-cell1-7a10-account-create-update-7hg8n\" (UID: \"07bcd65e-dcf5-4778-b34d-fba3728e8616\") " pod="openstack/nova-cell1-7a10-account-create-update-7hg8n" Jan 22 14:05:35 crc kubenswrapper[4743]: I0122 14:05:35.536109 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-586d-account-create-update-nl5vc" Jan 22 14:05:35 crc kubenswrapper[4743]: I0122 14:05:35.545246 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-tbj22" Jan 22 14:05:35 crc kubenswrapper[4743]: I0122 14:05:35.564874 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-ec5d-account-create-update-p4vn8" Jan 22 14:05:35 crc kubenswrapper[4743]: I0122 14:05:35.636586 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/07bcd65e-dcf5-4778-b34d-fba3728e8616-operator-scripts\") pod \"nova-cell1-7a10-account-create-update-7hg8n\" (UID: \"07bcd65e-dcf5-4778-b34d-fba3728e8616\") " pod="openstack/nova-cell1-7a10-account-create-update-7hg8n" Jan 22 14:05:35 crc kubenswrapper[4743]: I0122 14:05:35.637069 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsg5t\" (UniqueName: \"kubernetes.io/projected/07bcd65e-dcf5-4778-b34d-fba3728e8616-kube-api-access-xsg5t\") pod \"nova-cell1-7a10-account-create-update-7hg8n\" (UID: \"07bcd65e-dcf5-4778-b34d-fba3728e8616\") " pod="openstack/nova-cell1-7a10-account-create-update-7hg8n" Jan 22 14:05:35 crc kubenswrapper[4743]: I0122 14:05:35.637337 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/07bcd65e-dcf5-4778-b34d-fba3728e8616-operator-scripts\") pod \"nova-cell1-7a10-account-create-update-7hg8n\" (UID: \"07bcd65e-dcf5-4778-b34d-fba3728e8616\") " pod="openstack/nova-cell1-7a10-account-create-update-7hg8n" Jan 22 14:05:35 crc kubenswrapper[4743]: I0122 14:05:35.662565 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsg5t\" (UniqueName: \"kubernetes.io/projected/07bcd65e-dcf5-4778-b34d-fba3728e8616-kube-api-access-xsg5t\") pod \"nova-cell1-7a10-account-create-update-7hg8n\" (UID: \"07bcd65e-dcf5-4778-b34d-fba3728e8616\") " pod="openstack/nova-cell1-7a10-account-create-update-7hg8n" Jan 22 14:05:35 crc kubenswrapper[4743]: I0122 14:05:35.741025 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-bkr2b"] Jan 22 14:05:35 crc kubenswrapper[4743]: I0122 14:05:35.824732 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-7a10-account-create-update-7hg8n" Jan 22 14:05:35 crc kubenswrapper[4743]: I0122 14:05:35.855630 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-hpj2q"] Jan 22 14:05:35 crc kubenswrapper[4743]: W0122 14:05:35.882293 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf55bf894_5195_430c_acc8_06875cadcdff.slice/crio-9c0d303e62b7e0d1741e939667972d6b1c50de24ecd3b707801c541b63837571 WatchSource:0}: Error finding container 9c0d303e62b7e0d1741e939667972d6b1c50de24ecd3b707801c541b63837571: Status 404 returned error can't find the container with id 9c0d303e62b7e0d1741e939667972d6b1c50de24ecd3b707801c541b63837571 Jan 22 14:05:36 crc kubenswrapper[4743]: I0122 14:05:36.130499 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-586d-account-create-update-nl5vc"] Jan 22 14:05:36 crc kubenswrapper[4743]: I0122 14:05:36.144345 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-tbj22"] Jan 22 14:05:36 crc kubenswrapper[4743]: I0122 14:05:36.319649 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-ec5d-account-create-update-p4vn8"] Jan 22 14:05:36 crc kubenswrapper[4743]: I0122 14:05:36.349204 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-999bfcdc8-ldzdp" Jan 22 14:05:36 crc kubenswrapper[4743]: I0122 14:05:36.453527 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dff52751-78f1-4c39-aa95-5d74a246151e-scripts\") pod \"dff52751-78f1-4c39-aa95-5d74a246151e\" (UID: \"dff52751-78f1-4c39-aa95-5d74a246151e\") " Jan 22 14:05:36 crc kubenswrapper[4743]: I0122 14:05:36.453872 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dff52751-78f1-4c39-aa95-5d74a246151e-horizon-secret-key\") pod \"dff52751-78f1-4c39-aa95-5d74a246151e\" (UID: \"dff52751-78f1-4c39-aa95-5d74a246151e\") " Jan 22 14:05:36 crc kubenswrapper[4743]: I0122 14:05:36.453905 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/dff52751-78f1-4c39-aa95-5d74a246151e-horizon-tls-certs\") pod \"dff52751-78f1-4c39-aa95-5d74a246151e\" (UID: \"dff52751-78f1-4c39-aa95-5d74a246151e\") " Jan 22 14:05:36 crc kubenswrapper[4743]: I0122 14:05:36.453957 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgjfg\" (UniqueName: \"kubernetes.io/projected/dff52751-78f1-4c39-aa95-5d74a246151e-kube-api-access-tgjfg\") pod \"dff52751-78f1-4c39-aa95-5d74a246151e\" (UID: \"dff52751-78f1-4c39-aa95-5d74a246151e\") " Jan 22 14:05:36 crc kubenswrapper[4743]: I0122 14:05:36.454020 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dff52751-78f1-4c39-aa95-5d74a246151e-config-data\") pod \"dff52751-78f1-4c39-aa95-5d74a246151e\" (UID: \"dff52751-78f1-4c39-aa95-5d74a246151e\") " Jan 22 14:05:36 crc kubenswrapper[4743]: I0122 14:05:36.454071 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dff52751-78f1-4c39-aa95-5d74a246151e-logs\") pod \"dff52751-78f1-4c39-aa95-5d74a246151e\" (UID: \"dff52751-78f1-4c39-aa95-5d74a246151e\") " Jan 22 14:05:36 crc kubenswrapper[4743]: I0122 14:05:36.454116 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dff52751-78f1-4c39-aa95-5d74a246151e-combined-ca-bundle\") pod \"dff52751-78f1-4c39-aa95-5d74a246151e\" (UID: \"dff52751-78f1-4c39-aa95-5d74a246151e\") " Jan 22 14:05:36 crc kubenswrapper[4743]: I0122 14:05:36.455669 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dff52751-78f1-4c39-aa95-5d74a246151e-logs" (OuterVolumeSpecName: "logs") pod "dff52751-78f1-4c39-aa95-5d74a246151e" (UID: "dff52751-78f1-4c39-aa95-5d74a246151e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:05:36 crc kubenswrapper[4743]: I0122 14:05:36.498471 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dff52751-78f1-4c39-aa95-5d74a246151e-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "dff52751-78f1-4c39-aa95-5d74a246151e" (UID: "dff52751-78f1-4c39-aa95-5d74a246151e"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:05:36 crc kubenswrapper[4743]: I0122 14:05:36.503717 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dff52751-78f1-4c39-aa95-5d74a246151e-kube-api-access-tgjfg" (OuterVolumeSpecName: "kube-api-access-tgjfg") pod "dff52751-78f1-4c39-aa95-5d74a246151e" (UID: "dff52751-78f1-4c39-aa95-5d74a246151e"). InnerVolumeSpecName "kube-api-access-tgjfg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:05:36 crc kubenswrapper[4743]: I0122 14:05:36.508926 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dff52751-78f1-4c39-aa95-5d74a246151e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dff52751-78f1-4c39-aa95-5d74a246151e" (UID: "dff52751-78f1-4c39-aa95-5d74a246151e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:05:36 crc kubenswrapper[4743]: I0122 14:05:36.518558 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dff52751-78f1-4c39-aa95-5d74a246151e-scripts" (OuterVolumeSpecName: "scripts") pod "dff52751-78f1-4c39-aa95-5d74a246151e" (UID: "dff52751-78f1-4c39-aa95-5d74a246151e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:05:36 crc kubenswrapper[4743]: I0122 14:05:36.518978 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dff52751-78f1-4c39-aa95-5d74a246151e-config-data" (OuterVolumeSpecName: "config-data") pod "dff52751-78f1-4c39-aa95-5d74a246151e" (UID: "dff52751-78f1-4c39-aa95-5d74a246151e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:05:36 crc kubenswrapper[4743]: I0122 14:05:36.519169 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-tbj22" event={"ID":"1adbc0d6-0108-4060-9299-7d71187ac9e9","Type":"ContainerStarted","Data":"4f4591e555f7f6a3c29d9c280f31b43d719868a263282eac92c0b014205a3b70"} Jan 22 14:05:36 crc kubenswrapper[4743]: I0122 14:05:36.526945 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9884817b-7cf0-4495-8333-c5ee23796e4c","Type":"ContainerDied","Data":"c4bd3d14416cbbd98f4aed63304262505fa274aceb412912bfe527d8324a2450"} Jan 22 14:05:36 crc kubenswrapper[4743]: I0122 14:05:36.526983 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4bd3d14416cbbd98f4aed63304262505fa274aceb412912bfe527d8324a2450" Jan 22 14:05:36 crc kubenswrapper[4743]: I0122 14:05:36.530318 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-ec5d-account-create-update-p4vn8" event={"ID":"25cb1f70-dfe0-422d-95f5-b4e7ea4a8004","Type":"ContainerStarted","Data":"0c893ee0a0003e4ee822dfcebc0af5b534c1f98e96f11259026505182617dc4a"} Jan 22 14:05:36 crc kubenswrapper[4743]: I0122 14:05:36.535610 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-7a10-account-create-update-7hg8n"] Jan 22 14:05:36 crc kubenswrapper[4743]: I0122 14:05:36.552127 4743 generic.go:334] "Generic (PLEG): container finished" podID="f55bf894-5195-430c-acc8-06875cadcdff" containerID="24fea48753ce9db7d55094d0a3530e1a63382dafd66143b0b797c8331de6408a" exitCode=0 Jan 22 14:05:36 crc kubenswrapper[4743]: I0122 14:05:36.552269 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-db-create-hpj2q" event={"ID":"f55bf894-5195-430c-acc8-06875cadcdff","Type":"ContainerDied","Data":"24fea48753ce9db7d55094d0a3530e1a63382dafd66143b0b797c8331de6408a"} Jan 22 14:05:36 crc kubenswrapper[4743]: I0122 14:05:36.552303 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-hpj2q" event={"ID":"f55bf894-5195-430c-acc8-06875cadcdff","Type":"ContainerStarted","Data":"9c0d303e62b7e0d1741e939667972d6b1c50de24ecd3b707801c541b63837571"} Jan 22 14:05:36 crc kubenswrapper[4743]: I0122 14:05:36.553313 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dff52751-78f1-4c39-aa95-5d74a246151e-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "dff52751-78f1-4c39-aa95-5d74a246151e" (UID: "dff52751-78f1-4c39-aa95-5d74a246151e"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:05:36 crc kubenswrapper[4743]: I0122 14:05:36.555864 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dff52751-78f1-4c39-aa95-5d74a246151e-scripts\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:36 crc kubenswrapper[4743]: I0122 14:05:36.555893 4743 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dff52751-78f1-4c39-aa95-5d74a246151e-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:36 crc kubenswrapper[4743]: I0122 14:05:36.555906 4743 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/dff52751-78f1-4c39-aa95-5d74a246151e-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:36 crc kubenswrapper[4743]: I0122 14:05:36.555915 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgjfg\" (UniqueName: \"kubernetes.io/projected/dff52751-78f1-4c39-aa95-5d74a246151e-kube-api-access-tgjfg\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:36 crc kubenswrapper[4743]: I0122 14:05:36.555924 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dff52751-78f1-4c39-aa95-5d74a246151e-config-data\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:36 crc kubenswrapper[4743]: I0122 14:05:36.555934 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dff52751-78f1-4c39-aa95-5d74a246151e-logs\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:36 crc kubenswrapper[4743]: I0122 14:05:36.555942 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dff52751-78f1-4c39-aa95-5d74a246151e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:36 crc kubenswrapper[4743]: I0122 14:05:36.556287 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-586d-account-create-update-nl5vc" event={"ID":"43e926de-d57c-4d5b-82a6-2a28f645f18d","Type":"ContainerStarted","Data":"1775d43f2df17c12debd2b3a4f097de720575e178b7c0b0e70ca3490e39d6a9c"} Jan 22 14:05:36 crc kubenswrapper[4743]: I0122 14:05:36.558732 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-999bfcdc8-ldzdp" event={"ID":"dff52751-78f1-4c39-aa95-5d74a246151e","Type":"ContainerDied","Data":"269ef51223792898fc237f2544de07afb4f8e70db37f8fd29799954631ede201"} Jan 22 14:05:36 crc kubenswrapper[4743]: I0122 14:05:36.558846 4743 scope.go:117] "RemoveContainer" 
containerID="63e33f82bc66858f7646d8d52106929147dca59a06e0ccf6f7838a6c4813eb9f" Jan 22 14:05:36 crc kubenswrapper[4743]: I0122 14:05:36.559054 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-999bfcdc8-ldzdp" Jan 22 14:05:36 crc kubenswrapper[4743]: I0122 14:05:36.591026 4743 generic.go:334] "Generic (PLEG): container finished" podID="da15c36a-41ce-424b-a5bf-e6fec2d1b4c7" containerID="eb5eddb366431a8736a9dbf21c3b9f61087925ac7db940db6d13163edfb617a1" exitCode=0 Jan 22 14:05:36 crc kubenswrapper[4743]: I0122 14:05:36.591075 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-bkr2b" event={"ID":"da15c36a-41ce-424b-a5bf-e6fec2d1b4c7","Type":"ContainerDied","Data":"eb5eddb366431a8736a9dbf21c3b9f61087925ac7db940db6d13163edfb617a1"} Jan 22 14:05:36 crc kubenswrapper[4743]: I0122 14:05:36.591102 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-bkr2b" event={"ID":"da15c36a-41ce-424b-a5bf-e6fec2d1b4c7","Type":"ContainerStarted","Data":"1f69389e929a96c5371c6ab5626a4081f177df864e6649af4863f878c8ce5ca3"} Jan 22 14:05:36 crc kubenswrapper[4743]: I0122 14:05:36.621809 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 22 14:05:36 crc kubenswrapper[4743]: I0122 14:05:36.622082 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="eefba4cb-766b-45a4-b832-83c9ef83a30b" containerName="glance-log" containerID="cri-o://03296458679a7c6b8b657b085da182ca7c385ecb2b24f18d062f776e2e9d1913" gracePeriod=30 Jan 22 14:05:36 crc kubenswrapper[4743]: I0122 14:05:36.622623 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="eefba4cb-766b-45a4-b832-83c9ef83a30b" containerName="glance-httpd" containerID="cri-o://13483d1ae113b97ff4fc3135d303101705bda4cbbb52409c6e7360413f249d02" gracePeriod=30 Jan 22 14:05:36 crc kubenswrapper[4743]: I0122 14:05:36.798717 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 22 14:05:36 crc kubenswrapper[4743]: I0122 14:05:36.861995 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9884817b-7cf0-4495-8333-c5ee23796e4c-sg-core-conf-yaml\") pod \"9884817b-7cf0-4495-8333-c5ee23796e4c\" (UID: \"9884817b-7cf0-4495-8333-c5ee23796e4c\") " Jan 22 14:05:36 crc kubenswrapper[4743]: I0122 14:05:36.862075 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9884817b-7cf0-4495-8333-c5ee23796e4c-run-httpd\") pod \"9884817b-7cf0-4495-8333-c5ee23796e4c\" (UID: \"9884817b-7cf0-4495-8333-c5ee23796e4c\") " Jan 22 14:05:36 crc kubenswrapper[4743]: I0122 14:05:36.862148 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9884817b-7cf0-4495-8333-c5ee23796e4c-combined-ca-bundle\") pod \"9884817b-7cf0-4495-8333-c5ee23796e4c\" (UID: \"9884817b-7cf0-4495-8333-c5ee23796e4c\") " Jan 22 14:05:36 crc kubenswrapper[4743]: I0122 14:05:36.862198 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9884817b-7cf0-4495-8333-c5ee23796e4c-log-httpd\") pod \"9884817b-7cf0-4495-8333-c5ee23796e4c\" (UID: \"9884817b-7cf0-4495-8333-c5ee23796e4c\") " Jan 22 14:05:36 crc kubenswrapper[4743]: I0122 14:05:36.862219 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9884817b-7cf0-4495-8333-c5ee23796e4c-scripts\") pod \"9884817b-7cf0-4495-8333-c5ee23796e4c\" (UID: \"9884817b-7cf0-4495-8333-c5ee23796e4c\") " Jan 22 14:05:36 crc kubenswrapper[4743]: I0122 14:05:36.862265 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qgzh\" (UniqueName: \"kubernetes.io/projected/9884817b-7cf0-4495-8333-c5ee23796e4c-kube-api-access-7qgzh\") pod \"9884817b-7cf0-4495-8333-c5ee23796e4c\" (UID: \"9884817b-7cf0-4495-8333-c5ee23796e4c\") " Jan 22 14:05:36 crc kubenswrapper[4743]: I0122 14:05:36.862341 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9884817b-7cf0-4495-8333-c5ee23796e4c-config-data\") pod \"9884817b-7cf0-4495-8333-c5ee23796e4c\" (UID: \"9884817b-7cf0-4495-8333-c5ee23796e4c\") " Jan 22 14:05:36 crc kubenswrapper[4743]: I0122 14:05:36.868069 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9884817b-7cf0-4495-8333-c5ee23796e4c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9884817b-7cf0-4495-8333-c5ee23796e4c" (UID: "9884817b-7cf0-4495-8333-c5ee23796e4c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:05:36 crc kubenswrapper[4743]: I0122 14:05:36.869827 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9884817b-7cf0-4495-8333-c5ee23796e4c-kube-api-access-7qgzh" (OuterVolumeSpecName: "kube-api-access-7qgzh") pod "9884817b-7cf0-4495-8333-c5ee23796e4c" (UID: "9884817b-7cf0-4495-8333-c5ee23796e4c"). InnerVolumeSpecName "kube-api-access-7qgzh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:05:36 crc kubenswrapper[4743]: I0122 14:05:36.871344 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9884817b-7cf0-4495-8333-c5ee23796e4c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9884817b-7cf0-4495-8333-c5ee23796e4c" (UID: "9884817b-7cf0-4495-8333-c5ee23796e4c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:05:36 crc kubenswrapper[4743]: I0122 14:05:36.876114 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-999bfcdc8-ldzdp"] Jan 22 14:05:36 crc kubenswrapper[4743]: I0122 14:05:36.886853 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9884817b-7cf0-4495-8333-c5ee23796e4c-scripts" (OuterVolumeSpecName: "scripts") pod "9884817b-7cf0-4495-8333-c5ee23796e4c" (UID: "9884817b-7cf0-4495-8333-c5ee23796e4c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:05:36 crc kubenswrapper[4743]: I0122 14:05:36.908534 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-999bfcdc8-ldzdp"] Jan 22 14:05:36 crc kubenswrapper[4743]: I0122 14:05:36.964814 4743 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9884817b-7cf0-4495-8333-c5ee23796e4c-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:36 crc kubenswrapper[4743]: I0122 14:05:36.964857 4743 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9884817b-7cf0-4495-8333-c5ee23796e4c-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:36 crc kubenswrapper[4743]: I0122 14:05:36.964872 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9884817b-7cf0-4495-8333-c5ee23796e4c-scripts\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:36 crc kubenswrapper[4743]: I0122 14:05:36.964883 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qgzh\" (UniqueName: \"kubernetes.io/projected/9884817b-7cf0-4495-8333-c5ee23796e4c-kube-api-access-7qgzh\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:37 crc kubenswrapper[4743]: I0122 14:05:37.081373 4743 scope.go:117] "RemoveContainer" containerID="ac18ac033baaefe887d069af8ebe8ff96c5cd0940424da66bfb23d7a56c5a2f7" Jan 22 14:05:37 crc kubenswrapper[4743]: I0122 14:05:37.118915 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9884817b-7cf0-4495-8333-c5ee23796e4c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9884817b-7cf0-4495-8333-c5ee23796e4c" (UID: "9884817b-7cf0-4495-8333-c5ee23796e4c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:05:37 crc kubenswrapper[4743]: I0122 14:05:37.169036 4743 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9884817b-7cf0-4495-8333-c5ee23796e4c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:37 crc kubenswrapper[4743]: I0122 14:05:37.192548 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9884817b-7cf0-4495-8333-c5ee23796e4c-config-data" (OuterVolumeSpecName: "config-data") pod "9884817b-7cf0-4495-8333-c5ee23796e4c" (UID: "9884817b-7cf0-4495-8333-c5ee23796e4c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:05:37 crc kubenswrapper[4743]: I0122 14:05:37.207980 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9884817b-7cf0-4495-8333-c5ee23796e4c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9884817b-7cf0-4495-8333-c5ee23796e4c" (UID: "9884817b-7cf0-4495-8333-c5ee23796e4c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:05:37 crc kubenswrapper[4743]: I0122 14:05:37.271067 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9884817b-7cf0-4495-8333-c5ee23796e4c-config-data\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:37 crc kubenswrapper[4743]: I0122 14:05:37.271111 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9884817b-7cf0-4495-8333-c5ee23796e4c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:37 crc kubenswrapper[4743]: I0122 14:05:37.599881 4743 generic.go:334] "Generic (PLEG): container finished" podID="07bcd65e-dcf5-4778-b34d-fba3728e8616" containerID="65a73678a5ad78e7585173de169f2d7a6d65b25e788a3d5d7c9fcee111fcd45d" exitCode=0 Jan 22 14:05:37 crc kubenswrapper[4743]: I0122 14:05:37.599942 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-7a10-account-create-update-7hg8n" event={"ID":"07bcd65e-dcf5-4778-b34d-fba3728e8616","Type":"ContainerDied","Data":"65a73678a5ad78e7585173de169f2d7a6d65b25e788a3d5d7c9fcee111fcd45d"} Jan 22 14:05:37 crc kubenswrapper[4743]: I0122 14:05:37.599968 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-7a10-account-create-update-7hg8n" event={"ID":"07bcd65e-dcf5-4778-b34d-fba3728e8616","Type":"ContainerStarted","Data":"6c15b32848a8063cbc3f97bf4a38f51a74e09189d69ccff71bfa86ad3aec6c24"} Jan 22 14:05:37 crc kubenswrapper[4743]: I0122 14:05:37.604045 4743 generic.go:334] "Generic (PLEG): container finished" podID="eefba4cb-766b-45a4-b832-83c9ef83a30b" containerID="03296458679a7c6b8b657b085da182ca7c385ecb2b24f18d062f776e2e9d1913" exitCode=143 Jan 22 14:05:37 crc kubenswrapper[4743]: I0122 14:05:37.604112 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"eefba4cb-766b-45a4-b832-83c9ef83a30b","Type":"ContainerDied","Data":"03296458679a7c6b8b657b085da182ca7c385ecb2b24f18d062f776e2e9d1913"} Jan 22 14:05:37 crc kubenswrapper[4743]: I0122 14:05:37.605819 4743 generic.go:334] "Generic (PLEG): container finished" podID="1adbc0d6-0108-4060-9299-7d71187ac9e9" containerID="01f8d4d2a50491c06be70fe555b84c5e85bf74d4aed887ef308511035a4ffa62" exitCode=0 Jan 22 14:05:37 crc kubenswrapper[4743]: I0122 14:05:37.605898 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-tbj22" event={"ID":"1adbc0d6-0108-4060-9299-7d71187ac9e9","Type":"ContainerDied","Data":"01f8d4d2a50491c06be70fe555b84c5e85bf74d4aed887ef308511035a4ffa62"} Jan 22 14:05:37 crc kubenswrapper[4743]: I0122 14:05:37.607735 4743 generic.go:334] "Generic (PLEG): container finished" podID="25cb1f70-dfe0-422d-95f5-b4e7ea4a8004" containerID="c286824626c99fb36a15a3a6dff8a06ee8a98c682a4082ac3628d6edeedf93d5" exitCode=0 Jan 22 14:05:37 crc kubenswrapper[4743]: I0122 14:05:37.607776 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-ec5d-account-create-update-p4vn8" 
event={"ID":"25cb1f70-dfe0-422d-95f5-b4e7ea4a8004","Type":"ContainerDied","Data":"c286824626c99fb36a15a3a6dff8a06ee8a98c682a4082ac3628d6edeedf93d5"} Jan 22 14:05:37 crc kubenswrapper[4743]: I0122 14:05:37.609010 4743 generic.go:334] "Generic (PLEG): container finished" podID="43e926de-d57c-4d5b-82a6-2a28f645f18d" containerID="3f0ef25b0f166c44d882e33caca448af05e5012c1c3d4874df3144745b40e14d" exitCode=0 Jan 22 14:05:37 crc kubenswrapper[4743]: I0122 14:05:37.609049 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-586d-account-create-update-nl5vc" event={"ID":"43e926de-d57c-4d5b-82a6-2a28f645f18d","Type":"ContainerDied","Data":"3f0ef25b0f166c44d882e33caca448af05e5012c1c3d4874df3144745b40e14d"} Jan 22 14:05:37 crc kubenswrapper[4743]: I0122 14:05:37.610478 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 22 14:05:37 crc kubenswrapper[4743]: I0122 14:05:37.708404 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 22 14:05:37 crc kubenswrapper[4743]: I0122 14:05:37.716889 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 22 14:05:37 crc kubenswrapper[4743]: I0122 14:05:37.722450 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 22 14:05:37 crc kubenswrapper[4743]: E0122 14:05:37.723518 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9884817b-7cf0-4495-8333-c5ee23796e4c" containerName="proxy-httpd" Jan 22 14:05:37 crc kubenswrapper[4743]: I0122 14:05:37.723535 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="9884817b-7cf0-4495-8333-c5ee23796e4c" containerName="proxy-httpd" Jan 22 14:05:37 crc kubenswrapper[4743]: E0122 14:05:37.723559 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9884817b-7cf0-4495-8333-c5ee23796e4c" containerName="sg-core" Jan 22 14:05:37 crc kubenswrapper[4743]: I0122 14:05:37.723591 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="9884817b-7cf0-4495-8333-c5ee23796e4c" containerName="sg-core" Jan 22 14:05:37 crc kubenswrapper[4743]: E0122 14:05:37.723605 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dff52751-78f1-4c39-aa95-5d74a246151e" containerName="horizon" Jan 22 14:05:37 crc kubenswrapper[4743]: I0122 14:05:37.723611 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="dff52751-78f1-4c39-aa95-5d74a246151e" containerName="horizon" Jan 22 14:05:37 crc kubenswrapper[4743]: E0122 14:05:37.723623 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9884817b-7cf0-4495-8333-c5ee23796e4c" containerName="ceilometer-central-agent" Jan 22 14:05:37 crc kubenswrapper[4743]: I0122 14:05:37.723629 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="9884817b-7cf0-4495-8333-c5ee23796e4c" containerName="ceilometer-central-agent" Jan 22 14:05:37 crc kubenswrapper[4743]: E0122 14:05:37.723646 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dff52751-78f1-4c39-aa95-5d74a246151e" containerName="horizon-log" Jan 22 14:05:37 crc kubenswrapper[4743]: I0122 14:05:37.723681 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="dff52751-78f1-4c39-aa95-5d74a246151e" containerName="horizon-log" Jan 22 14:05:37 crc kubenswrapper[4743]: E0122 14:05:37.723694 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9884817b-7cf0-4495-8333-c5ee23796e4c" containerName="ceilometer-notification-agent" Jan 22 14:05:37 crc 
kubenswrapper[4743]: I0122 14:05:37.723702 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="9884817b-7cf0-4495-8333-c5ee23796e4c" containerName="ceilometer-notification-agent" Jan 22 14:05:37 crc kubenswrapper[4743]: I0122 14:05:37.724292 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="9884817b-7cf0-4495-8333-c5ee23796e4c" containerName="proxy-httpd" Jan 22 14:05:37 crc kubenswrapper[4743]: I0122 14:05:37.724319 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="9884817b-7cf0-4495-8333-c5ee23796e4c" containerName="ceilometer-notification-agent" Jan 22 14:05:37 crc kubenswrapper[4743]: I0122 14:05:37.724328 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="9884817b-7cf0-4495-8333-c5ee23796e4c" containerName="ceilometer-central-agent" Jan 22 14:05:37 crc kubenswrapper[4743]: I0122 14:05:37.724339 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="dff52751-78f1-4c39-aa95-5d74a246151e" containerName="horizon-log" Jan 22 14:05:37 crc kubenswrapper[4743]: I0122 14:05:37.724350 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="9884817b-7cf0-4495-8333-c5ee23796e4c" containerName="sg-core" Jan 22 14:05:37 crc kubenswrapper[4743]: I0122 14:05:37.724361 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="dff52751-78f1-4c39-aa95-5d74a246151e" containerName="horizon" Jan 22 14:05:37 crc kubenswrapper[4743]: I0122 14:05:37.726340 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 22 14:05:37 crc kubenswrapper[4743]: I0122 14:05:37.731495 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 22 14:05:37 crc kubenswrapper[4743]: I0122 14:05:37.731704 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 22 14:05:37 crc kubenswrapper[4743]: I0122 14:05:37.745744 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="126e7829-d6f7-4443-b4f6-02669ff5fbc7" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.155:9292/healthcheck\": read tcp 10.217.0.2:48536->10.217.0.155:9292: read: connection reset by peer" Jan 22 14:05:37 crc kubenswrapper[4743]: I0122 14:05:37.745744 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="126e7829-d6f7-4443-b4f6-02669ff5fbc7" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.155:9292/healthcheck\": read tcp 10.217.0.2:48520->10.217.0.155:9292: read: connection reset by peer" Jan 22 14:05:37 crc kubenswrapper[4743]: I0122 14:05:37.769489 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9884817b-7cf0-4495-8333-c5ee23796e4c" path="/var/lib/kubelet/pods/9884817b-7cf0-4495-8333-c5ee23796e4c/volumes" Jan 22 14:05:37 crc kubenswrapper[4743]: I0122 14:05:37.773441 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dff52751-78f1-4c39-aa95-5d74a246151e" path="/var/lib/kubelet/pods/dff52751-78f1-4c39-aa95-5d74a246151e/volumes" Jan 22 14:05:37 crc kubenswrapper[4743]: I0122 14:05:37.774080 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 22 14:05:37 crc kubenswrapper[4743]: I0122 14:05:37.778393 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/6838f1b4-d5cf-45a8-918b-a54911cbe4c6-scripts\") pod \"ceilometer-0\" (UID: \"6838f1b4-d5cf-45a8-918b-a54911cbe4c6\") " pod="openstack/ceilometer-0" Jan 22 14:05:37 crc kubenswrapper[4743]: I0122 14:05:37.778654 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6838f1b4-d5cf-45a8-918b-a54911cbe4c6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6838f1b4-d5cf-45a8-918b-a54911cbe4c6\") " pod="openstack/ceilometer-0" Jan 22 14:05:37 crc kubenswrapper[4743]: I0122 14:05:37.778737 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6838f1b4-d5cf-45a8-918b-a54911cbe4c6-log-httpd\") pod \"ceilometer-0\" (UID: \"6838f1b4-d5cf-45a8-918b-a54911cbe4c6\") " pod="openstack/ceilometer-0" Jan 22 14:05:37 crc kubenswrapper[4743]: I0122 14:05:37.778831 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6838f1b4-d5cf-45a8-918b-a54911cbe4c6-config-data\") pod \"ceilometer-0\" (UID: \"6838f1b4-d5cf-45a8-918b-a54911cbe4c6\") " pod="openstack/ceilometer-0" Jan 22 14:05:37 crc kubenswrapper[4743]: I0122 14:05:37.778876 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5972\" (UniqueName: \"kubernetes.io/projected/6838f1b4-d5cf-45a8-918b-a54911cbe4c6-kube-api-access-k5972\") pod \"ceilometer-0\" (UID: \"6838f1b4-d5cf-45a8-918b-a54911cbe4c6\") " pod="openstack/ceilometer-0" Jan 22 14:05:37 crc kubenswrapper[4743]: I0122 14:05:37.778902 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6838f1b4-d5cf-45a8-918b-a54911cbe4c6-run-httpd\") pod \"ceilometer-0\" (UID: \"6838f1b4-d5cf-45a8-918b-a54911cbe4c6\") " pod="openstack/ceilometer-0" Jan 22 14:05:37 crc kubenswrapper[4743]: I0122 14:05:37.778955 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6838f1b4-d5cf-45a8-918b-a54911cbe4c6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6838f1b4-d5cf-45a8-918b-a54911cbe4c6\") " pod="openstack/ceilometer-0" Jan 22 14:05:37 crc kubenswrapper[4743]: I0122 14:05:37.880319 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6838f1b4-d5cf-45a8-918b-a54911cbe4c6-scripts\") pod \"ceilometer-0\" (UID: \"6838f1b4-d5cf-45a8-918b-a54911cbe4c6\") " pod="openstack/ceilometer-0" Jan 22 14:05:37 crc kubenswrapper[4743]: I0122 14:05:37.880639 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6838f1b4-d5cf-45a8-918b-a54911cbe4c6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6838f1b4-d5cf-45a8-918b-a54911cbe4c6\") " pod="openstack/ceilometer-0" Jan 22 14:05:37 crc kubenswrapper[4743]: I0122 14:05:37.880734 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6838f1b4-d5cf-45a8-918b-a54911cbe4c6-log-httpd\") pod \"ceilometer-0\" (UID: \"6838f1b4-d5cf-45a8-918b-a54911cbe4c6\") " pod="openstack/ceilometer-0" Jan 22 14:05:37 crc kubenswrapper[4743]: I0122 14:05:37.881316 4743 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6838f1b4-d5cf-45a8-918b-a54911cbe4c6-log-httpd\") pod \"ceilometer-0\" (UID: \"6838f1b4-d5cf-45a8-918b-a54911cbe4c6\") " pod="openstack/ceilometer-0" Jan 22 14:05:37 crc kubenswrapper[4743]: I0122 14:05:37.881878 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6838f1b4-d5cf-45a8-918b-a54911cbe4c6-config-data\") pod \"ceilometer-0\" (UID: \"6838f1b4-d5cf-45a8-918b-a54911cbe4c6\") " pod="openstack/ceilometer-0" Jan 22 14:05:37 crc kubenswrapper[4743]: I0122 14:05:37.884842 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6838f1b4-d5cf-45a8-918b-a54911cbe4c6-scripts\") pod \"ceilometer-0\" (UID: \"6838f1b4-d5cf-45a8-918b-a54911cbe4c6\") " pod="openstack/ceilometer-0" Jan 22 14:05:37 crc kubenswrapper[4743]: I0122 14:05:37.887170 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6838f1b4-d5cf-45a8-918b-a54911cbe4c6-config-data\") pod \"ceilometer-0\" (UID: \"6838f1b4-d5cf-45a8-918b-a54911cbe4c6\") " pod="openstack/ceilometer-0" Jan 22 14:05:37 crc kubenswrapper[4743]: I0122 14:05:37.889414 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6838f1b4-d5cf-45a8-918b-a54911cbe4c6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6838f1b4-d5cf-45a8-918b-a54911cbe4c6\") " pod="openstack/ceilometer-0" Jan 22 14:05:37 crc kubenswrapper[4743]: I0122 14:05:37.890717 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5972\" (UniqueName: \"kubernetes.io/projected/6838f1b4-d5cf-45a8-918b-a54911cbe4c6-kube-api-access-k5972\") pod \"ceilometer-0\" (UID: \"6838f1b4-d5cf-45a8-918b-a54911cbe4c6\") " pod="openstack/ceilometer-0" Jan 22 14:05:37 crc kubenswrapper[4743]: I0122 14:05:37.890858 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6838f1b4-d5cf-45a8-918b-a54911cbe4c6-run-httpd\") pod \"ceilometer-0\" (UID: \"6838f1b4-d5cf-45a8-918b-a54911cbe4c6\") " pod="openstack/ceilometer-0" Jan 22 14:05:37 crc kubenswrapper[4743]: I0122 14:05:37.890952 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6838f1b4-d5cf-45a8-918b-a54911cbe4c6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6838f1b4-d5cf-45a8-918b-a54911cbe4c6\") " pod="openstack/ceilometer-0" Jan 22 14:05:37 crc kubenswrapper[4743]: I0122 14:05:37.898850 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6838f1b4-d5cf-45a8-918b-a54911cbe4c6-run-httpd\") pod \"ceilometer-0\" (UID: \"6838f1b4-d5cf-45a8-918b-a54911cbe4c6\") " pod="openstack/ceilometer-0" Jan 22 14:05:37 crc kubenswrapper[4743]: I0122 14:05:37.907315 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5972\" (UniqueName: \"kubernetes.io/projected/6838f1b4-d5cf-45a8-918b-a54911cbe4c6-kube-api-access-k5972\") pod \"ceilometer-0\" (UID: \"6838f1b4-d5cf-45a8-918b-a54911cbe4c6\") " pod="openstack/ceilometer-0" Jan 22 14:05:37 crc kubenswrapper[4743]: I0122 14:05:37.913440 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6838f1b4-d5cf-45a8-918b-a54911cbe4c6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6838f1b4-d5cf-45a8-918b-a54911cbe4c6\") " pod="openstack/ceilometer-0" Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.052181 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.218920 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-bkr2b" Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.235229 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-hpj2q" Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.301489 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f55bf894-5195-430c-acc8-06875cadcdff-operator-scripts\") pod \"f55bf894-5195-430c-acc8-06875cadcdff\" (UID: \"f55bf894-5195-430c-acc8-06875cadcdff\") " Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.301667 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dx56q\" (UniqueName: \"kubernetes.io/projected/da15c36a-41ce-424b-a5bf-e6fec2d1b4c7-kube-api-access-dx56q\") pod \"da15c36a-41ce-424b-a5bf-e6fec2d1b4c7\" (UID: \"da15c36a-41ce-424b-a5bf-e6fec2d1b4c7\") " Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.302016 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qr76f\" (UniqueName: \"kubernetes.io/projected/f55bf894-5195-430c-acc8-06875cadcdff-kube-api-access-qr76f\") pod \"f55bf894-5195-430c-acc8-06875cadcdff\" (UID: \"f55bf894-5195-430c-acc8-06875cadcdff\") " Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.302236 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da15c36a-41ce-424b-a5bf-e6fec2d1b4c7-operator-scripts\") pod \"da15c36a-41ce-424b-a5bf-e6fec2d1b4c7\" (UID: \"da15c36a-41ce-424b-a5bf-e6fec2d1b4c7\") " Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.302806 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f55bf894-5195-430c-acc8-06875cadcdff-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f55bf894-5195-430c-acc8-06875cadcdff" (UID: "f55bf894-5195-430c-acc8-06875cadcdff"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.302879 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da15c36a-41ce-424b-a5bf-e6fec2d1b4c7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "da15c36a-41ce-424b-a5bf-e6fec2d1b4c7" (UID: "da15c36a-41ce-424b-a5bf-e6fec2d1b4c7"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.304173 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f55bf894-5195-430c-acc8-06875cadcdff-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.304206 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da15c36a-41ce-424b-a5bf-e6fec2d1b4c7-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.315313 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f55bf894-5195-430c-acc8-06875cadcdff-kube-api-access-qr76f" (OuterVolumeSpecName: "kube-api-access-qr76f") pod "f55bf894-5195-430c-acc8-06875cadcdff" (UID: "f55bf894-5195-430c-acc8-06875cadcdff"). InnerVolumeSpecName "kube-api-access-qr76f". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.322309 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.324878 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da15c36a-41ce-424b-a5bf-e6fec2d1b4c7-kube-api-access-dx56q" (OuterVolumeSpecName: "kube-api-access-dx56q") pod "da15c36a-41ce-424b-a5bf-e6fec2d1b4c7" (UID: "da15c36a-41ce-424b-a5bf-e6fec2d1b4c7"). InnerVolumeSpecName "kube-api-access-dx56q". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.405095 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9vt6\" (UniqueName: \"kubernetes.io/projected/126e7829-d6f7-4443-b4f6-02669ff5fbc7-kube-api-access-s9vt6\") pod \"126e7829-d6f7-4443-b4f6-02669ff5fbc7\" (UID: \"126e7829-d6f7-4443-b4f6-02669ff5fbc7\") " Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.410027 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/126e7829-d6f7-4443-b4f6-02669ff5fbc7-combined-ca-bundle\") pod \"126e7829-d6f7-4443-b4f6-02669ff5fbc7\" (UID: \"126e7829-d6f7-4443-b4f6-02669ff5fbc7\") " Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.410325 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/126e7829-d6f7-4443-b4f6-02669ff5fbc7-scripts\") pod \"126e7829-d6f7-4443-b4f6-02669ff5fbc7\" (UID: \"126e7829-d6f7-4443-b4f6-02669ff5fbc7\") " Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.411238 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dx56q\" (UniqueName: \"kubernetes.io/projected/da15c36a-41ce-424b-a5bf-e6fec2d1b4c7-kube-api-access-dx56q\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.411346 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qr76f\" (UniqueName: \"kubernetes.io/projected/f55bf894-5195-430c-acc8-06875cadcdff-kube-api-access-qr76f\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.441169 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/126e7829-d6f7-4443-b4f6-02669ff5fbc7-kube-api-access-s9vt6" (OuterVolumeSpecName: "kube-api-access-s9vt6") pod "126e7829-d6f7-4443-b4f6-02669ff5fbc7" (UID: "126e7829-d6f7-4443-b4f6-02669ff5fbc7"). InnerVolumeSpecName "kube-api-access-s9vt6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.461272 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/126e7829-d6f7-4443-b4f6-02669ff5fbc7-scripts" (OuterVolumeSpecName: "scripts") pod "126e7829-d6f7-4443-b4f6-02669ff5fbc7" (UID: "126e7829-d6f7-4443-b4f6-02669ff5fbc7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.515308 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"126e7829-d6f7-4443-b4f6-02669ff5fbc7\" (UID: \"126e7829-d6f7-4443-b4f6-02669ff5fbc7\") " Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.515371 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/126e7829-d6f7-4443-b4f6-02669ff5fbc7-public-tls-certs\") pod \"126e7829-d6f7-4443-b4f6-02669ff5fbc7\" (UID: \"126e7829-d6f7-4443-b4f6-02669ff5fbc7\") " Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.515550 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/126e7829-d6f7-4443-b4f6-02669ff5fbc7-httpd-run\") pod \"126e7829-d6f7-4443-b4f6-02669ff5fbc7\" (UID: \"126e7829-d6f7-4443-b4f6-02669ff5fbc7\") " Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.515587 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/126e7829-d6f7-4443-b4f6-02669ff5fbc7-config-data\") pod \"126e7829-d6f7-4443-b4f6-02669ff5fbc7\" (UID: \"126e7829-d6f7-4443-b4f6-02669ff5fbc7\") " Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.515615 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/126e7829-d6f7-4443-b4f6-02669ff5fbc7-logs\") pod \"126e7829-d6f7-4443-b4f6-02669ff5fbc7\" (UID: \"126e7829-d6f7-4443-b4f6-02669ff5fbc7\") " Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.515954 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9vt6\" (UniqueName: \"kubernetes.io/projected/126e7829-d6f7-4443-b4f6-02669ff5fbc7-kube-api-access-s9vt6\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.515970 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/126e7829-d6f7-4443-b4f6-02669ff5fbc7-scripts\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.516325 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/126e7829-d6f7-4443-b4f6-02669ff5fbc7-logs" (OuterVolumeSpecName: "logs") pod "126e7829-d6f7-4443-b4f6-02669ff5fbc7" (UID: "126e7829-d6f7-4443-b4f6-02669ff5fbc7"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.522936 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/126e7829-d6f7-4443-b4f6-02669ff5fbc7-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "126e7829-d6f7-4443-b4f6-02669ff5fbc7" (UID: "126e7829-d6f7-4443-b4f6-02669ff5fbc7"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.577996 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "126e7829-d6f7-4443-b4f6-02669ff5fbc7" (UID: "126e7829-d6f7-4443-b4f6-02669ff5fbc7"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.600299 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/126e7829-d6f7-4443-b4f6-02669ff5fbc7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "126e7829-d6f7-4443-b4f6-02669ff5fbc7" (UID: "126e7829-d6f7-4443-b4f6-02669ff5fbc7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.620321 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/126e7829-d6f7-4443-b4f6-02669ff5fbc7-logs\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.620378 4743 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.620405 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/126e7829-d6f7-4443-b4f6-02669ff5fbc7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.620415 4743 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/126e7829-d6f7-4443-b4f6-02669ff5fbc7-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.678985 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.682774 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-bkr2b" event={"ID":"da15c36a-41ce-424b-a5bf-e6fec2d1b4c7","Type":"ContainerDied","Data":"1f69389e929a96c5371c6ab5626a4081f177df864e6649af4863f878c8ce5ca3"} Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.682828 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f69389e929a96c5371c6ab5626a4081f177df864e6649af4863f878c8ce5ca3" Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.682912 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-bkr2b" Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.720382 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/126e7829-d6f7-4443-b4f6-02669ff5fbc7-config-data" (OuterVolumeSpecName: "config-data") pod "126e7829-d6f7-4443-b4f6-02669ff5fbc7" (UID: "126e7829-d6f7-4443-b4f6-02669ff5fbc7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.722418 4743 generic.go:334] "Generic (PLEG): container finished" podID="de2c5f93-9ee3-4723-8123-bd48d5385423" containerID="1ae6a9f97c3b7fc665caf057bbe7901477be00bed6bc5e379b53f6b4f6925f62" exitCode=0 Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.722554 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7bd7ccdcfb-sx7wp" event={"ID":"de2c5f93-9ee3-4723-8123-bd48d5385423","Type":"ContainerDied","Data":"1ae6a9f97c3b7fc665caf057bbe7901477be00bed6bc5e379b53f6b4f6925f62"} Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.722534 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/126e7829-d6f7-4443-b4f6-02669ff5fbc7-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "126e7829-d6f7-4443-b4f6-02669ff5fbc7" (UID: "126e7829-d6f7-4443-b4f6-02669ff5fbc7"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.723155 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/126e7829-d6f7-4443-b4f6-02669ff5fbc7-config-data\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.738698 4743 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.744300 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-hpj2q" event={"ID":"f55bf894-5195-430c-acc8-06875cadcdff","Type":"ContainerDied","Data":"9c0d303e62b7e0d1741e939667972d6b1c50de24ecd3b707801c541b63837571"} Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.744353 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c0d303e62b7e0d1741e939667972d6b1c50de24ecd3b707801c541b63837571" Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.744392 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-hpj2q" Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.748214 4743 generic.go:334] "Generic (PLEG): container finished" podID="126e7829-d6f7-4443-b4f6-02669ff5fbc7" containerID="29c1e5e36d9d852d81dd4508b41dfb0eff77adb4114dae21485004a359a2f0f7" exitCode=0 Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.748596 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.751992 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"126e7829-d6f7-4443-b4f6-02669ff5fbc7","Type":"ContainerDied","Data":"29c1e5e36d9d852d81dd4508b41dfb0eff77adb4114dae21485004a359a2f0f7"} Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.752057 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"126e7829-d6f7-4443-b4f6-02669ff5fbc7","Type":"ContainerDied","Data":"aa612fc06777abc239bb03ac9abf35e72999a6728284582ce589f9faacf6122a"} Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.752081 4743 scope.go:117] "RemoveContainer" containerID="29c1e5e36d9d852d81dd4508b41dfb0eff77adb4114dae21485004a359a2f0f7" Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.795418 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.809334 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.831422 4743 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.831445 4743 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/126e7829-d6f7-4443-b4f6-02669ff5fbc7-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.855972 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Jan 22 14:05:38 crc kubenswrapper[4743]: E0122 14:05:38.856625 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="126e7829-d6f7-4443-b4f6-02669ff5fbc7" containerName="glance-httpd" Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.856647 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="126e7829-d6f7-4443-b4f6-02669ff5fbc7" containerName="glance-httpd" Jan 22 14:05:38 crc kubenswrapper[4743]: E0122 14:05:38.856686 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da15c36a-41ce-424b-a5bf-e6fec2d1b4c7" containerName="mariadb-database-create" Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.856694 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="da15c36a-41ce-424b-a5bf-e6fec2d1b4c7" containerName="mariadb-database-create" Jan 22 14:05:38 crc kubenswrapper[4743]: E0122 14:05:38.856706 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f55bf894-5195-430c-acc8-06875cadcdff" containerName="mariadb-database-create" Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.856713 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f55bf894-5195-430c-acc8-06875cadcdff" containerName="mariadb-database-create" Jan 22 14:05:38 crc kubenswrapper[4743]: E0122 14:05:38.856727 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="126e7829-d6f7-4443-b4f6-02669ff5fbc7" containerName="glance-log" Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.856733 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="126e7829-d6f7-4443-b4f6-02669ff5fbc7" containerName="glance-log" Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.856952 4743 
memory_manager.go:354] "RemoveStaleState removing state" podUID="126e7829-d6f7-4443-b4f6-02669ff5fbc7" containerName="glance-httpd" Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.856966 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="da15c36a-41ce-424b-a5bf-e6fec2d1b4c7" containerName="mariadb-database-create" Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.856983 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="126e7829-d6f7-4443-b4f6-02669ff5fbc7" containerName="glance-log" Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.857008 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f55bf894-5195-430c-acc8-06875cadcdff" containerName="mariadb-database-create" Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.866639 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.866747 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.869998 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.870834 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.878882 4743 scope.go:117] "RemoveContainer" containerID="b21504d9ca462caba9d729f866e841df9037b75bb1f78b2711e8c4e122634d6b" Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.948089 4743 scope.go:117] "RemoveContainer" containerID="29c1e5e36d9d852d81dd4508b41dfb0eff77adb4114dae21485004a359a2f0f7" Jan 22 14:05:38 crc kubenswrapper[4743]: E0122 14:05:38.949264 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29c1e5e36d9d852d81dd4508b41dfb0eff77adb4114dae21485004a359a2f0f7\": container with ID starting with 29c1e5e36d9d852d81dd4508b41dfb0eff77adb4114dae21485004a359a2f0f7 not found: ID does not exist" containerID="29c1e5e36d9d852d81dd4508b41dfb0eff77adb4114dae21485004a359a2f0f7" Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.949361 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29c1e5e36d9d852d81dd4508b41dfb0eff77adb4114dae21485004a359a2f0f7"} err="failed to get container status \"29c1e5e36d9d852d81dd4508b41dfb0eff77adb4114dae21485004a359a2f0f7\": rpc error: code = NotFound desc = could not find container \"29c1e5e36d9d852d81dd4508b41dfb0eff77adb4114dae21485004a359a2f0f7\": container with ID starting with 29c1e5e36d9d852d81dd4508b41dfb0eff77adb4114dae21485004a359a2f0f7 not found: ID does not exist" Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.949385 4743 scope.go:117] "RemoveContainer" containerID="b21504d9ca462caba9d729f866e841df9037b75bb1f78b2711e8c4e122634d6b" Jan 22 14:05:38 crc kubenswrapper[4743]: E0122 14:05:38.950255 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b21504d9ca462caba9d729f866e841df9037b75bb1f78b2711e8c4e122634d6b\": container with ID starting with b21504d9ca462caba9d729f866e841df9037b75bb1f78b2711e8c4e122634d6b not found: ID does not exist" containerID="b21504d9ca462caba9d729f866e841df9037b75bb1f78b2711e8c4e122634d6b" Jan 22 14:05:38 crc 
kubenswrapper[4743]: I0122 14:05:38.950305 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b21504d9ca462caba9d729f866e841df9037b75bb1f78b2711e8c4e122634d6b"} err="failed to get container status \"b21504d9ca462caba9d729f866e841df9037b75bb1f78b2711e8c4e122634d6b\": rpc error: code = NotFound desc = could not find container \"b21504d9ca462caba9d729f866e841df9037b75bb1f78b2711e8c4e122634d6b\": container with ID starting with b21504d9ca462caba9d729f866e841df9037b75bb1f78b2711e8c4e122634d6b not found: ID does not exist" Jan 22 14:05:38 crc kubenswrapper[4743]: I0122 14:05:38.996426 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7bd7ccdcfb-sx7wp" Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.036973 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5247bc1b-998e-4275-9f4a-d3c30ff488b9-config-data\") pod \"glance-default-external-api-0\" (UID: \"5247bc1b-998e-4275-9f4a-d3c30ff488b9\") " pod="openstack/glance-default-external-api-0" Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.037064 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5247bc1b-998e-4275-9f4a-d3c30ff488b9-logs\") pod \"glance-default-external-api-0\" (UID: \"5247bc1b-998e-4275-9f4a-d3c30ff488b9\") " pod="openstack/glance-default-external-api-0" Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.037111 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scsdp\" (UniqueName: \"kubernetes.io/projected/5247bc1b-998e-4275-9f4a-d3c30ff488b9-kube-api-access-scsdp\") pod \"glance-default-external-api-0\" (UID: \"5247bc1b-998e-4275-9f4a-d3c30ff488b9\") " pod="openstack/glance-default-external-api-0" Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.037145 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5247bc1b-998e-4275-9f4a-d3c30ff488b9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5247bc1b-998e-4275-9f4a-d3c30ff488b9\") " pod="openstack/glance-default-external-api-0" Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.037201 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5247bc1b-998e-4275-9f4a-d3c30ff488b9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"5247bc1b-998e-4275-9f4a-d3c30ff488b9\") " pod="openstack/glance-default-external-api-0" Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.037300 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5247bc1b-998e-4275-9f4a-d3c30ff488b9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5247bc1b-998e-4275-9f4a-d3c30ff488b9\") " pod="openstack/glance-default-external-api-0" Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.037378 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5247bc1b-998e-4275-9f4a-d3c30ff488b9-scripts\") pod \"glance-default-external-api-0\" (UID: \"5247bc1b-998e-4275-9f4a-d3c30ff488b9\") " 
pod="openstack/glance-default-external-api-0" Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.037406 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"5247bc1b-998e-4275-9f4a-d3c30ff488b9\") " pod="openstack/glance-default-external-api-0" Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.138344 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/de2c5f93-9ee3-4723-8123-bd48d5385423-httpd-config\") pod \"de2c5f93-9ee3-4723-8123-bd48d5385423\" (UID: \"de2c5f93-9ee3-4723-8123-bd48d5385423\") " Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.138491 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de2c5f93-9ee3-4723-8123-bd48d5385423-combined-ca-bundle\") pod \"de2c5f93-9ee3-4723-8123-bd48d5385423\" (UID: \"de2c5f93-9ee3-4723-8123-bd48d5385423\") " Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.138536 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/de2c5f93-9ee3-4723-8123-bd48d5385423-config\") pod \"de2c5f93-9ee3-4723-8123-bd48d5385423\" (UID: \"de2c5f93-9ee3-4723-8123-bd48d5385423\") " Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.138561 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/de2c5f93-9ee3-4723-8123-bd48d5385423-ovndb-tls-certs\") pod \"de2c5f93-9ee3-4723-8123-bd48d5385423\" (UID: \"de2c5f93-9ee3-4723-8123-bd48d5385423\") " Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.138608 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkqgw\" (UniqueName: \"kubernetes.io/projected/de2c5f93-9ee3-4723-8123-bd48d5385423-kube-api-access-rkqgw\") pod \"de2c5f93-9ee3-4723-8123-bd48d5385423\" (UID: \"de2c5f93-9ee3-4723-8123-bd48d5385423\") " Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.138839 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5247bc1b-998e-4275-9f4a-d3c30ff488b9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5247bc1b-998e-4275-9f4a-d3c30ff488b9\") " pod="openstack/glance-default-external-api-0" Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.138908 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5247bc1b-998e-4275-9f4a-d3c30ff488b9-scripts\") pod \"glance-default-external-api-0\" (UID: \"5247bc1b-998e-4275-9f4a-d3c30ff488b9\") " pod="openstack/glance-default-external-api-0" Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.138928 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"5247bc1b-998e-4275-9f4a-d3c30ff488b9\") " pod="openstack/glance-default-external-api-0" Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.138953 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/5247bc1b-998e-4275-9f4a-d3c30ff488b9-config-data\") pod \"glance-default-external-api-0\" (UID: \"5247bc1b-998e-4275-9f4a-d3c30ff488b9\") " pod="openstack/glance-default-external-api-0" Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.138968 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5247bc1b-998e-4275-9f4a-d3c30ff488b9-logs\") pod \"glance-default-external-api-0\" (UID: \"5247bc1b-998e-4275-9f4a-d3c30ff488b9\") " pod="openstack/glance-default-external-api-0" Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.138990 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scsdp\" (UniqueName: \"kubernetes.io/projected/5247bc1b-998e-4275-9f4a-d3c30ff488b9-kube-api-access-scsdp\") pod \"glance-default-external-api-0\" (UID: \"5247bc1b-998e-4275-9f4a-d3c30ff488b9\") " pod="openstack/glance-default-external-api-0" Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.139007 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5247bc1b-998e-4275-9f4a-d3c30ff488b9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5247bc1b-998e-4275-9f4a-d3c30ff488b9\") " pod="openstack/glance-default-external-api-0" Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.139030 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5247bc1b-998e-4275-9f4a-d3c30ff488b9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"5247bc1b-998e-4275-9f4a-d3c30ff488b9\") " pod="openstack/glance-default-external-api-0" Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.139715 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5247bc1b-998e-4275-9f4a-d3c30ff488b9-logs\") pod \"glance-default-external-api-0\" (UID: \"5247bc1b-998e-4275-9f4a-d3c30ff488b9\") " pod="openstack/glance-default-external-api-0" Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.144519 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"5247bc1b-998e-4275-9f4a-d3c30ff488b9\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-external-api-0" Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.147394 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5247bc1b-998e-4275-9f4a-d3c30ff488b9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5247bc1b-998e-4275-9f4a-d3c30ff488b9\") " pod="openstack/glance-default-external-api-0" Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.170849 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5247bc1b-998e-4275-9f4a-d3c30ff488b9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"5247bc1b-998e-4275-9f4a-d3c30ff488b9\") " pod="openstack/glance-default-external-api-0" Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.175626 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5247bc1b-998e-4275-9f4a-d3c30ff488b9-combined-ca-bundle\") 
pod \"glance-default-external-api-0\" (UID: \"5247bc1b-998e-4275-9f4a-d3c30ff488b9\") " pod="openstack/glance-default-external-api-0" Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.180279 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de2c5f93-9ee3-4723-8123-bd48d5385423-kube-api-access-rkqgw" (OuterVolumeSpecName: "kube-api-access-rkqgw") pod "de2c5f93-9ee3-4723-8123-bd48d5385423" (UID: "de2c5f93-9ee3-4723-8123-bd48d5385423"). InnerVolumeSpecName "kube-api-access-rkqgw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.183037 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5247bc1b-998e-4275-9f4a-d3c30ff488b9-scripts\") pod \"glance-default-external-api-0\" (UID: \"5247bc1b-998e-4275-9f4a-d3c30ff488b9\") " pod="openstack/glance-default-external-api-0" Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.183210 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5247bc1b-998e-4275-9f4a-d3c30ff488b9-config-data\") pod \"glance-default-external-api-0\" (UID: \"5247bc1b-998e-4275-9f4a-d3c30ff488b9\") " pod="openstack/glance-default-external-api-0" Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.186932 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scsdp\" (UniqueName: \"kubernetes.io/projected/5247bc1b-998e-4275-9f4a-d3c30ff488b9-kube-api-access-scsdp\") pod \"glance-default-external-api-0\" (UID: \"5247bc1b-998e-4275-9f4a-d3c30ff488b9\") " pod="openstack/glance-default-external-api-0" Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.188302 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de2c5f93-9ee3-4723-8123-bd48d5385423-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "de2c5f93-9ee3-4723-8123-bd48d5385423" (UID: "de2c5f93-9ee3-4723-8123-bd48d5385423"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.223324 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"5247bc1b-998e-4275-9f4a-d3c30ff488b9\") " pod="openstack/glance-default-external-api-0" Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.238183 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de2c5f93-9ee3-4723-8123-bd48d5385423-config" (OuterVolumeSpecName: "config") pod "de2c5f93-9ee3-4723-8123-bd48d5385423" (UID: "de2c5f93-9ee3-4723-8123-bd48d5385423"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.243760 4743 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/de2c5f93-9ee3-4723-8123-bd48d5385423-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.243957 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/de2c5f93-9ee3-4723-8123-bd48d5385423-config\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.244035 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkqgw\" (UniqueName: \"kubernetes.io/projected/de2c5f93-9ee3-4723-8123-bd48d5385423-kube-api-access-rkqgw\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.250673 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de2c5f93-9ee3-4723-8123-bd48d5385423-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "de2c5f93-9ee3-4723-8123-bd48d5385423" (UID: "de2c5f93-9ee3-4723-8123-bd48d5385423"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.266185 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-tbj22" Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.324465 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de2c5f93-9ee3-4723-8123-bd48d5385423-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "de2c5f93-9ee3-4723-8123-bd48d5385423" (UID: "de2c5f93-9ee3-4723-8123-bd48d5385423"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.349567 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1adbc0d6-0108-4060-9299-7d71187ac9e9-operator-scripts\") pod \"1adbc0d6-0108-4060-9299-7d71187ac9e9\" (UID: \"1adbc0d6-0108-4060-9299-7d71187ac9e9\") " Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.350048 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75b5z\" (UniqueName: \"kubernetes.io/projected/1adbc0d6-0108-4060-9299-7d71187ac9e9-kube-api-access-75b5z\") pod \"1adbc0d6-0108-4060-9299-7d71187ac9e9\" (UID: \"1adbc0d6-0108-4060-9299-7d71187ac9e9\") " Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.350847 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de2c5f93-9ee3-4723-8123-bd48d5385423-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.350989 4743 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/de2c5f93-9ee3-4723-8123-bd48d5385423-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.352531 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1adbc0d6-0108-4060-9299-7d71187ac9e9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1adbc0d6-0108-4060-9299-7d71187ac9e9" (UID: "1adbc0d6-0108-4060-9299-7d71187ac9e9"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.354278 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-7a10-account-create-update-7hg8n" Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.354636 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1adbc0d6-0108-4060-9299-7d71187ac9e9-kube-api-access-75b5z" (OuterVolumeSpecName: "kube-api-access-75b5z") pod "1adbc0d6-0108-4060-9299-7d71187ac9e9" (UID: "1adbc0d6-0108-4060-9299-7d71187ac9e9"). InnerVolumeSpecName "kube-api-access-75b5z". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.398893 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-586d-account-create-update-nl5vc" Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.410515 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-ec5d-account-create-update-p4vn8" Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.462282 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xsg5t\" (UniqueName: \"kubernetes.io/projected/07bcd65e-dcf5-4778-b34d-fba3728e8616-kube-api-access-xsg5t\") pod \"07bcd65e-dcf5-4778-b34d-fba3728e8616\" (UID: \"07bcd65e-dcf5-4778-b34d-fba3728e8616\") " Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.462995 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/07bcd65e-dcf5-4778-b34d-fba3728e8616-operator-scripts\") pod \"07bcd65e-dcf5-4778-b34d-fba3728e8616\" (UID: \"07bcd65e-dcf5-4778-b34d-fba3728e8616\") " Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.463818 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75b5z\" (UniqueName: \"kubernetes.io/projected/1adbc0d6-0108-4060-9299-7d71187ac9e9-kube-api-access-75b5z\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.463843 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1adbc0d6-0108-4060-9299-7d71187ac9e9-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.465284 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07bcd65e-dcf5-4778-b34d-fba3728e8616-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "07bcd65e-dcf5-4778-b34d-fba3728e8616" (UID: "07bcd65e-dcf5-4778-b34d-fba3728e8616"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.482240 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07bcd65e-dcf5-4778-b34d-fba3728e8616-kube-api-access-xsg5t" (OuterVolumeSpecName: "kube-api-access-xsg5t") pod "07bcd65e-dcf5-4778-b34d-fba3728e8616" (UID: "07bcd65e-dcf5-4778-b34d-fba3728e8616"). InnerVolumeSpecName "kube-api-access-xsg5t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.498111 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.512774 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.564992 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25cb1f70-dfe0-422d-95f5-b4e7ea4a8004-operator-scripts\") pod \"25cb1f70-dfe0-422d-95f5-b4e7ea4a8004\" (UID: \"25cb1f70-dfe0-422d-95f5-b4e7ea4a8004\") " Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.565110 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43e926de-d57c-4d5b-82a6-2a28f645f18d-operator-scripts\") pod \"43e926de-d57c-4d5b-82a6-2a28f645f18d\" (UID: \"43e926de-d57c-4d5b-82a6-2a28f645f18d\") " Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.565871 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25cb1f70-dfe0-422d-95f5-b4e7ea4a8004-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "25cb1f70-dfe0-422d-95f5-b4e7ea4a8004" (UID: "25cb1f70-dfe0-422d-95f5-b4e7ea4a8004"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.566010 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43e926de-d57c-4d5b-82a6-2a28f645f18d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "43e926de-d57c-4d5b-82a6-2a28f645f18d" (UID: "43e926de-d57c-4d5b-82a6-2a28f645f18d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.566110 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rl9z\" (UniqueName: \"kubernetes.io/projected/43e926de-d57c-4d5b-82a6-2a28f645f18d-kube-api-access-4rl9z\") pod \"43e926de-d57c-4d5b-82a6-2a28f645f18d\" (UID: \"43e926de-d57c-4d5b-82a6-2a28f645f18d\") " Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.566219 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9lqh\" (UniqueName: \"kubernetes.io/projected/25cb1f70-dfe0-422d-95f5-b4e7ea4a8004-kube-api-access-q9lqh\") pod \"25cb1f70-dfe0-422d-95f5-b4e7ea4a8004\" (UID: \"25cb1f70-dfe0-422d-95f5-b4e7ea4a8004\") " Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.566711 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xsg5t\" (UniqueName: \"kubernetes.io/projected/07bcd65e-dcf5-4778-b34d-fba3728e8616-kube-api-access-xsg5t\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.566727 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/07bcd65e-dcf5-4778-b34d-fba3728e8616-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.566738 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25cb1f70-dfe0-422d-95f5-b4e7ea4a8004-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.566749 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/43e926de-d57c-4d5b-82a6-2a28f645f18d-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.571841 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43e926de-d57c-4d5b-82a6-2a28f645f18d-kube-api-access-4rl9z" (OuterVolumeSpecName: "kube-api-access-4rl9z") pod "43e926de-d57c-4d5b-82a6-2a28f645f18d" (UID: "43e926de-d57c-4d5b-82a6-2a28f645f18d"). InnerVolumeSpecName "kube-api-access-4rl9z". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.571897 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25cb1f70-dfe0-422d-95f5-b4e7ea4a8004-kube-api-access-q9lqh" (OuterVolumeSpecName: "kube-api-access-q9lqh") pod "25cb1f70-dfe0-422d-95f5-b4e7ea4a8004" (UID: "25cb1f70-dfe0-422d-95f5-b4e7ea4a8004"). InnerVolumeSpecName "kube-api-access-q9lqh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.668142 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rl9z\" (UniqueName: \"kubernetes.io/projected/43e926de-d57c-4d5b-82a6-2a28f645f18d-kube-api-access-4rl9z\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.668529 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9lqh\" (UniqueName: \"kubernetes.io/projected/25cb1f70-dfe0-422d-95f5-b4e7ea4a8004-kube-api-access-q9lqh\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.765446 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="126e7829-d6f7-4443-b4f6-02669ff5fbc7" path="/var/lib/kubelet/pods/126e7829-d6f7-4443-b4f6-02669ff5fbc7/volumes" Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.803414 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-ec5d-account-create-update-p4vn8" event={"ID":"25cb1f70-dfe0-422d-95f5-b4e7ea4a8004","Type":"ContainerDied","Data":"0c893ee0a0003e4ee822dfcebc0af5b534c1f98e96f11259026505182617dc4a"} Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.803447 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c893ee0a0003e4ee822dfcebc0af5b534c1f98e96f11259026505182617dc4a" Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.803497 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-ec5d-account-create-update-p4vn8" Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.807521 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-586d-account-create-update-nl5vc" event={"ID":"43e926de-d57c-4d5b-82a6-2a28f645f18d","Type":"ContainerDied","Data":"1775d43f2df17c12debd2b3a4f097de720575e178b7c0b0e70ca3490e39d6a9c"} Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.807547 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1775d43f2df17c12debd2b3a4f097de720575e178b7c0b0e70ca3490e39d6a9c" Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.807590 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-586d-account-create-update-nl5vc" Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.819980 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-7a10-account-create-update-7hg8n" event={"ID":"07bcd65e-dcf5-4778-b34d-fba3728e8616","Type":"ContainerDied","Data":"6c15b32848a8063cbc3f97bf4a38f51a74e09189d69ccff71bfa86ad3aec6c24"} Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.820017 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c15b32848a8063cbc3f97bf4a38f51a74e09189d69ccff71bfa86ad3aec6c24" Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.820075 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-7a10-account-create-update-7hg8n" Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.827959 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6838f1b4-d5cf-45a8-918b-a54911cbe4c6","Type":"ContainerStarted","Data":"6cd511dcfbc7fd8dfd3bf143456fc33eef471995bfd0118b9fbcefec9218a70b"} Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.833307 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7bd7ccdcfb-sx7wp" Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.834126 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7bd7ccdcfb-sx7wp" event={"ID":"de2c5f93-9ee3-4723-8123-bd48d5385423","Type":"ContainerDied","Data":"06ab1b8cdf15dd2c1398403a1308fc09babef387e4655f4d02634938d466194a"} Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.834171 4743 scope.go:117] "RemoveContainer" containerID="ef9ffc5cb4c24c27363689c60af953e2196889a603ec04ab66b8826de1d12f2b" Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.844969 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-tbj22" event={"ID":"1adbc0d6-0108-4060-9299-7d71187ac9e9","Type":"ContainerDied","Data":"4f4591e555f7f6a3c29d9c280f31b43d719868a263282eac92c0b014205a3b70"} Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.845011 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f4591e555f7f6a3c29d9c280f31b43d719868a263282eac92c0b014205a3b70" Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.845061 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-tbj22" Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.953258 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7bd7ccdcfb-sx7wp"] Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.959929 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7bd7ccdcfb-sx7wp"] Jan 22 14:05:39 crc kubenswrapper[4743]: I0122 14:05:39.963223 4743 scope.go:117] "RemoveContainer" containerID="1ae6a9f97c3b7fc665caf057bbe7901477be00bed6bc5e379b53f6b4f6925f62" Jan 22 14:05:40 crc kubenswrapper[4743]: I0122 14:05:40.152488 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 22 14:05:40 crc kubenswrapper[4743]: W0122 14:05:40.156875 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5247bc1b_998e_4275_9f4a_d3c30ff488b9.slice/crio-e247ac28893c2b9135dd357035789979e4bbeac74377a11d88310300e832592a WatchSource:0}: Error finding container e247ac28893c2b9135dd357035789979e4bbeac74377a11d88310300e832592a: Status 404 returned error can't find the container with id e247ac28893c2b9135dd357035789979e4bbeac74377a11d88310300e832592a Jan 22 14:05:40 crc kubenswrapper[4743]: I0122 14:05:40.581172 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 22 14:05:40 crc kubenswrapper[4743]: I0122 14:05:40.695904 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eefba4cb-766b-45a4-b832-83c9ef83a30b-httpd-run\") pod \"eefba4cb-766b-45a4-b832-83c9ef83a30b\" (UID: \"eefba4cb-766b-45a4-b832-83c9ef83a30b\") " Jan 22 14:05:40 crc kubenswrapper[4743]: I0122 14:05:40.695978 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eefba4cb-766b-45a4-b832-83c9ef83a30b-combined-ca-bundle\") pod \"eefba4cb-766b-45a4-b832-83c9ef83a30b\" (UID: \"eefba4cb-766b-45a4-b832-83c9ef83a30b\") " Jan 22 14:05:40 crc kubenswrapper[4743]: I0122 14:05:40.696043 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eefba4cb-766b-45a4-b832-83c9ef83a30b-scripts\") pod \"eefba4cb-766b-45a4-b832-83c9ef83a30b\" (UID: \"eefba4cb-766b-45a4-b832-83c9ef83a30b\") " Jan 22 14:05:40 crc kubenswrapper[4743]: I0122 14:05:40.696136 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eefba4cb-766b-45a4-b832-83c9ef83a30b-internal-tls-certs\") pod \"eefba4cb-766b-45a4-b832-83c9ef83a30b\" (UID: \"eefba4cb-766b-45a4-b832-83c9ef83a30b\") " Jan 22 14:05:40 crc kubenswrapper[4743]: I0122 14:05:40.696190 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eefba4cb-766b-45a4-b832-83c9ef83a30b-config-data\") pod \"eefba4cb-766b-45a4-b832-83c9ef83a30b\" (UID: \"eefba4cb-766b-45a4-b832-83c9ef83a30b\") " Jan 22 14:05:40 crc kubenswrapper[4743]: I0122 14:05:40.696230 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"eefba4cb-766b-45a4-b832-83c9ef83a30b\" (UID: \"eefba4cb-766b-45a4-b832-83c9ef83a30b\") " Jan 22 14:05:40 crc 
kubenswrapper[4743]: I0122 14:05:40.696251 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eefba4cb-766b-45a4-b832-83c9ef83a30b-logs\") pod \"eefba4cb-766b-45a4-b832-83c9ef83a30b\" (UID: \"eefba4cb-766b-45a4-b832-83c9ef83a30b\") " Jan 22 14:05:40 crc kubenswrapper[4743]: I0122 14:05:40.696291 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49g9s\" (UniqueName: \"kubernetes.io/projected/eefba4cb-766b-45a4-b832-83c9ef83a30b-kube-api-access-49g9s\") pod \"eefba4cb-766b-45a4-b832-83c9ef83a30b\" (UID: \"eefba4cb-766b-45a4-b832-83c9ef83a30b\") " Jan 22 14:05:40 crc kubenswrapper[4743]: I0122 14:05:40.696499 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eefba4cb-766b-45a4-b832-83c9ef83a30b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "eefba4cb-766b-45a4-b832-83c9ef83a30b" (UID: "eefba4cb-766b-45a4-b832-83c9ef83a30b"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:05:40 crc kubenswrapper[4743]: I0122 14:05:40.697234 4743 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eefba4cb-766b-45a4-b832-83c9ef83a30b-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:40 crc kubenswrapper[4743]: I0122 14:05:40.702041 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "eefba4cb-766b-45a4-b832-83c9ef83a30b" (UID: "eefba4cb-766b-45a4-b832-83c9ef83a30b"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 22 14:05:40 crc kubenswrapper[4743]: I0122 14:05:40.708949 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eefba4cb-766b-45a4-b832-83c9ef83a30b-scripts" (OuterVolumeSpecName: "scripts") pod "eefba4cb-766b-45a4-b832-83c9ef83a30b" (UID: "eefba4cb-766b-45a4-b832-83c9ef83a30b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:05:40 crc kubenswrapper[4743]: I0122 14:05:40.709100 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eefba4cb-766b-45a4-b832-83c9ef83a30b-kube-api-access-49g9s" (OuterVolumeSpecName: "kube-api-access-49g9s") pod "eefba4cb-766b-45a4-b832-83c9ef83a30b" (UID: "eefba4cb-766b-45a4-b832-83c9ef83a30b"). InnerVolumeSpecName "kube-api-access-49g9s". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:05:40 crc kubenswrapper[4743]: I0122 14:05:40.719342 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eefba4cb-766b-45a4-b832-83c9ef83a30b-logs" (OuterVolumeSpecName: "logs") pod "eefba4cb-766b-45a4-b832-83c9ef83a30b" (UID: "eefba4cb-766b-45a4-b832-83c9ef83a30b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:05:40 crc kubenswrapper[4743]: I0122 14:05:40.778250 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eefba4cb-766b-45a4-b832-83c9ef83a30b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "eefba4cb-766b-45a4-b832-83c9ef83a30b" (UID: "eefba4cb-766b-45a4-b832-83c9ef83a30b"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:05:40 crc kubenswrapper[4743]: I0122 14:05:40.784021 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eefba4cb-766b-45a4-b832-83c9ef83a30b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eefba4cb-766b-45a4-b832-83c9ef83a30b" (UID: "eefba4cb-766b-45a4-b832-83c9ef83a30b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:05:40 crc kubenswrapper[4743]: I0122 14:05:40.799257 4743 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eefba4cb-766b-45a4-b832-83c9ef83a30b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:40 crc kubenswrapper[4743]: I0122 14:05:40.799304 4743 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Jan 22 14:05:40 crc kubenswrapper[4743]: I0122 14:05:40.799314 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eefba4cb-766b-45a4-b832-83c9ef83a30b-logs\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:40 crc kubenswrapper[4743]: I0122 14:05:40.799324 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49g9s\" (UniqueName: \"kubernetes.io/projected/eefba4cb-766b-45a4-b832-83c9ef83a30b-kube-api-access-49g9s\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:40 crc kubenswrapper[4743]: I0122 14:05:40.799334 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eefba4cb-766b-45a4-b832-83c9ef83a30b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:40 crc kubenswrapper[4743]: I0122 14:05:40.799342 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eefba4cb-766b-45a4-b832-83c9ef83a30b-scripts\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:40 crc kubenswrapper[4743]: I0122 14:05:40.823815 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eefba4cb-766b-45a4-b832-83c9ef83a30b-config-data" (OuterVolumeSpecName: "config-data") pod "eefba4cb-766b-45a4-b832-83c9ef83a30b" (UID: "eefba4cb-766b-45a4-b832-83c9ef83a30b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:05:40 crc kubenswrapper[4743]: I0122 14:05:40.824412 4743 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Jan 22 14:05:40 crc kubenswrapper[4743]: I0122 14:05:40.856267 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5247bc1b-998e-4275-9f4a-d3c30ff488b9","Type":"ContainerStarted","Data":"e247ac28893c2b9135dd357035789979e4bbeac74377a11d88310300e832592a"} Jan 22 14:05:40 crc kubenswrapper[4743]: I0122 14:05:40.866105 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6838f1b4-d5cf-45a8-918b-a54911cbe4c6","Type":"ContainerStarted","Data":"b618f20ebbcd305064322e20b0e70b0150d02041263ed6f0b634771605c316f9"} Jan 22 14:05:40 crc kubenswrapper[4743]: I0122 14:05:40.866155 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6838f1b4-d5cf-45a8-918b-a54911cbe4c6","Type":"ContainerStarted","Data":"54c784e1c7c4455b82fc0913b353d81f958210df7650ee3d2761b2a101b6da45"} Jan 22 14:05:40 crc kubenswrapper[4743]: I0122 14:05:40.870821 4743 generic.go:334] "Generic (PLEG): container finished" podID="eefba4cb-766b-45a4-b832-83c9ef83a30b" containerID="13483d1ae113b97ff4fc3135d303101705bda4cbbb52409c6e7360413f249d02" exitCode=0 Jan 22 14:05:40 crc kubenswrapper[4743]: I0122 14:05:40.870867 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"eefba4cb-766b-45a4-b832-83c9ef83a30b","Type":"ContainerDied","Data":"13483d1ae113b97ff4fc3135d303101705bda4cbbb52409c6e7360413f249d02"} Jan 22 14:05:40 crc kubenswrapper[4743]: I0122 14:05:40.870897 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"eefba4cb-766b-45a4-b832-83c9ef83a30b","Type":"ContainerDied","Data":"9fe88db94dc3b2ab9c18169846d9b1cbf71d9b4f9bf6fd1cf8ec87a1afbf44c0"} Jan 22 14:05:40 crc kubenswrapper[4743]: I0122 14:05:40.870918 4743 scope.go:117] "RemoveContainer" containerID="13483d1ae113b97ff4fc3135d303101705bda4cbbb52409c6e7360413f249d02" Jan 22 14:05:40 crc kubenswrapper[4743]: I0122 14:05:40.870952 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 22 14:05:40 crc kubenswrapper[4743]: I0122 14:05:40.901212 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eefba4cb-766b-45a4-b832-83c9ef83a30b-config-data\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:40 crc kubenswrapper[4743]: I0122 14:05:40.901243 4743 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:40 crc kubenswrapper[4743]: I0122 14:05:40.912524 4743 scope.go:117] "RemoveContainer" containerID="03296458679a7c6b8b657b085da182ca7c385ecb2b24f18d062f776e2e9d1913" Jan 22 14:05:40 crc kubenswrapper[4743]: I0122 14:05:40.921642 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 22 14:05:40 crc kubenswrapper[4743]: I0122 14:05:40.930513 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 22 14:05:40 crc kubenswrapper[4743]: I0122 14:05:40.946104 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 22 14:05:40 crc kubenswrapper[4743]: E0122 14:05:40.946520 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43e926de-d57c-4d5b-82a6-2a28f645f18d" containerName="mariadb-account-create-update" Jan 22 14:05:40 crc kubenswrapper[4743]: I0122 14:05:40.946545 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="43e926de-d57c-4d5b-82a6-2a28f645f18d" containerName="mariadb-account-create-update" Jan 22 14:05:40 crc kubenswrapper[4743]: E0122 14:05:40.946556 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de2c5f93-9ee3-4723-8123-bd48d5385423" containerName="neutron-api" Jan 22 14:05:40 crc kubenswrapper[4743]: I0122 14:05:40.946564 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="de2c5f93-9ee3-4723-8123-bd48d5385423" containerName="neutron-api" Jan 22 14:05:40 crc kubenswrapper[4743]: E0122 14:05:40.946578 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eefba4cb-766b-45a4-b832-83c9ef83a30b" containerName="glance-httpd" Jan 22 14:05:40 crc kubenswrapper[4743]: I0122 14:05:40.946586 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="eefba4cb-766b-45a4-b832-83c9ef83a30b" containerName="glance-httpd" Jan 22 14:05:40 crc kubenswrapper[4743]: E0122 14:05:40.946595 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1adbc0d6-0108-4060-9299-7d71187ac9e9" containerName="mariadb-database-create" Jan 22 14:05:40 crc kubenswrapper[4743]: I0122 14:05:40.946602 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1adbc0d6-0108-4060-9299-7d71187ac9e9" containerName="mariadb-database-create" Jan 22 14:05:40 crc kubenswrapper[4743]: E0122 14:05:40.946616 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25cb1f70-dfe0-422d-95f5-b4e7ea4a8004" containerName="mariadb-account-create-update" Jan 22 14:05:40 crc kubenswrapper[4743]: I0122 14:05:40.946624 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="25cb1f70-dfe0-422d-95f5-b4e7ea4a8004" containerName="mariadb-account-create-update" Jan 22 14:05:40 crc kubenswrapper[4743]: E0122 14:05:40.946658 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eefba4cb-766b-45a4-b832-83c9ef83a30b" containerName="glance-log" Jan 22 14:05:40 crc kubenswrapper[4743]: I0122 14:05:40.946666 
4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="eefba4cb-766b-45a4-b832-83c9ef83a30b" containerName="glance-log" Jan 22 14:05:40 crc kubenswrapper[4743]: E0122 14:05:40.946684 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07bcd65e-dcf5-4778-b34d-fba3728e8616" containerName="mariadb-account-create-update" Jan 22 14:05:40 crc kubenswrapper[4743]: I0122 14:05:40.946691 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="07bcd65e-dcf5-4778-b34d-fba3728e8616" containerName="mariadb-account-create-update" Jan 22 14:05:40 crc kubenswrapper[4743]: E0122 14:05:40.946707 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de2c5f93-9ee3-4723-8123-bd48d5385423" containerName="neutron-httpd" Jan 22 14:05:40 crc kubenswrapper[4743]: I0122 14:05:40.946716 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="de2c5f93-9ee3-4723-8123-bd48d5385423" containerName="neutron-httpd" Jan 22 14:05:40 crc kubenswrapper[4743]: I0122 14:05:40.946928 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="25cb1f70-dfe0-422d-95f5-b4e7ea4a8004" containerName="mariadb-account-create-update" Jan 22 14:05:40 crc kubenswrapper[4743]: I0122 14:05:40.946946 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="de2c5f93-9ee3-4723-8123-bd48d5385423" containerName="neutron-api" Jan 22 14:05:40 crc kubenswrapper[4743]: I0122 14:05:40.946962 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="eefba4cb-766b-45a4-b832-83c9ef83a30b" containerName="glance-log" Jan 22 14:05:40 crc kubenswrapper[4743]: I0122 14:05:40.946974 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="43e926de-d57c-4d5b-82a6-2a28f645f18d" containerName="mariadb-account-create-update" Jan 22 14:05:40 crc kubenswrapper[4743]: I0122 14:05:40.946984 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="de2c5f93-9ee3-4723-8123-bd48d5385423" containerName="neutron-httpd" Jan 22 14:05:40 crc kubenswrapper[4743]: I0122 14:05:40.946993 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="eefba4cb-766b-45a4-b832-83c9ef83a30b" containerName="glance-httpd" Jan 22 14:05:40 crc kubenswrapper[4743]: I0122 14:05:40.947004 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="07bcd65e-dcf5-4778-b34d-fba3728e8616" containerName="mariadb-account-create-update" Jan 22 14:05:40 crc kubenswrapper[4743]: I0122 14:05:40.947019 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="1adbc0d6-0108-4060-9299-7d71187ac9e9" containerName="mariadb-database-create" Jan 22 14:05:40 crc kubenswrapper[4743]: I0122 14:05:40.948152 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 22 14:05:40 crc kubenswrapper[4743]: I0122 14:05:40.951226 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Jan 22 14:05:40 crc kubenswrapper[4743]: I0122 14:05:40.951389 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Jan 22 14:05:40 crc kubenswrapper[4743]: I0122 14:05:40.993688 4743 scope.go:117] "RemoveContainer" containerID="13483d1ae113b97ff4fc3135d303101705bda4cbbb52409c6e7360413f249d02" Jan 22 14:05:40 crc kubenswrapper[4743]: E0122 14:05:40.996464 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13483d1ae113b97ff4fc3135d303101705bda4cbbb52409c6e7360413f249d02\": container with ID starting with 13483d1ae113b97ff4fc3135d303101705bda4cbbb52409c6e7360413f249d02 not found: ID does not exist" containerID="13483d1ae113b97ff4fc3135d303101705bda4cbbb52409c6e7360413f249d02" Jan 22 14:05:40 crc kubenswrapper[4743]: I0122 14:05:40.996521 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13483d1ae113b97ff4fc3135d303101705bda4cbbb52409c6e7360413f249d02"} err="failed to get container status \"13483d1ae113b97ff4fc3135d303101705bda4cbbb52409c6e7360413f249d02\": rpc error: code = NotFound desc = could not find container \"13483d1ae113b97ff4fc3135d303101705bda4cbbb52409c6e7360413f249d02\": container with ID starting with 13483d1ae113b97ff4fc3135d303101705bda4cbbb52409c6e7360413f249d02 not found: ID does not exist" Jan 22 14:05:40 crc kubenswrapper[4743]: I0122 14:05:40.996555 4743 scope.go:117] "RemoveContainer" containerID="03296458679a7c6b8b657b085da182ca7c385ecb2b24f18d062f776e2e9d1913" Jan 22 14:05:40 crc kubenswrapper[4743]: E0122 14:05:40.997156 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03296458679a7c6b8b657b085da182ca7c385ecb2b24f18d062f776e2e9d1913\": container with ID starting with 03296458679a7c6b8b657b085da182ca7c385ecb2b24f18d062f776e2e9d1913 not found: ID does not exist" containerID="03296458679a7c6b8b657b085da182ca7c385ecb2b24f18d062f776e2e9d1913" Jan 22 14:05:40 crc kubenswrapper[4743]: I0122 14:05:40.997185 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03296458679a7c6b8b657b085da182ca7c385ecb2b24f18d062f776e2e9d1913"} err="failed to get container status \"03296458679a7c6b8b657b085da182ca7c385ecb2b24f18d062f776e2e9d1913\": rpc error: code = NotFound desc = could not find container \"03296458679a7c6b8b657b085da182ca7c385ecb2b24f18d062f776e2e9d1913\": container with ID starting with 03296458679a7c6b8b657b085da182ca7c385ecb2b24f18d062f776e2e9d1913 not found: ID does not exist" Jan 22 14:05:41 crc kubenswrapper[4743]: I0122 14:05:41.003373 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 22 14:05:41 crc kubenswrapper[4743]: I0122 14:05:41.105892 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hvj8\" (UniqueName: \"kubernetes.io/projected/6ed5a9e1-b17c-46c0-9d94-b3ae86e73acb-kube-api-access-5hvj8\") pod \"glance-default-internal-api-0\" (UID: \"6ed5a9e1-b17c-46c0-9d94-b3ae86e73acb\") " pod="openstack/glance-default-internal-api-0" Jan 22 14:05:41 crc kubenswrapper[4743]: I0122 14:05:41.105951 
4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ed5a9e1-b17c-46c0-9d94-b3ae86e73acb-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6ed5a9e1-b17c-46c0-9d94-b3ae86e73acb\") " pod="openstack/glance-default-internal-api-0" Jan 22 14:05:41 crc kubenswrapper[4743]: I0122 14:05:41.106007 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ed5a9e1-b17c-46c0-9d94-b3ae86e73acb-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6ed5a9e1-b17c-46c0-9d94-b3ae86e73acb\") " pod="openstack/glance-default-internal-api-0" Jan 22 14:05:41 crc kubenswrapper[4743]: I0122 14:05:41.106067 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ed5a9e1-b17c-46c0-9d94-b3ae86e73acb-logs\") pod \"glance-default-internal-api-0\" (UID: \"6ed5a9e1-b17c-46c0-9d94-b3ae86e73acb\") " pod="openstack/glance-default-internal-api-0" Jan 22 14:05:41 crc kubenswrapper[4743]: I0122 14:05:41.106109 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ed5a9e1-b17c-46c0-9d94-b3ae86e73acb-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6ed5a9e1-b17c-46c0-9d94-b3ae86e73acb\") " pod="openstack/glance-default-internal-api-0" Jan 22 14:05:41 crc kubenswrapper[4743]: I0122 14:05:41.106147 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ed5a9e1-b17c-46c0-9d94-b3ae86e73acb-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6ed5a9e1-b17c-46c0-9d94-b3ae86e73acb\") " pod="openstack/glance-default-internal-api-0" Jan 22 14:05:41 crc kubenswrapper[4743]: I0122 14:05:41.106180 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"6ed5a9e1-b17c-46c0-9d94-b3ae86e73acb\") " pod="openstack/glance-default-internal-api-0" Jan 22 14:05:41 crc kubenswrapper[4743]: I0122 14:05:41.106203 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6ed5a9e1-b17c-46c0-9d94-b3ae86e73acb-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6ed5a9e1-b17c-46c0-9d94-b3ae86e73acb\") " pod="openstack/glance-default-internal-api-0" Jan 22 14:05:41 crc kubenswrapper[4743]: I0122 14:05:41.208194 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hvj8\" (UniqueName: \"kubernetes.io/projected/6ed5a9e1-b17c-46c0-9d94-b3ae86e73acb-kube-api-access-5hvj8\") pod \"glance-default-internal-api-0\" (UID: \"6ed5a9e1-b17c-46c0-9d94-b3ae86e73acb\") " pod="openstack/glance-default-internal-api-0" Jan 22 14:05:41 crc kubenswrapper[4743]: I0122 14:05:41.208261 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ed5a9e1-b17c-46c0-9d94-b3ae86e73acb-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6ed5a9e1-b17c-46c0-9d94-b3ae86e73acb\") " pod="openstack/glance-default-internal-api-0" Jan 22 14:05:41 
crc kubenswrapper[4743]: I0122 14:05:41.208321 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ed5a9e1-b17c-46c0-9d94-b3ae86e73acb-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6ed5a9e1-b17c-46c0-9d94-b3ae86e73acb\") " pod="openstack/glance-default-internal-api-0" Jan 22 14:05:41 crc kubenswrapper[4743]: I0122 14:05:41.208382 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ed5a9e1-b17c-46c0-9d94-b3ae86e73acb-logs\") pod \"glance-default-internal-api-0\" (UID: \"6ed5a9e1-b17c-46c0-9d94-b3ae86e73acb\") " pod="openstack/glance-default-internal-api-0" Jan 22 14:05:41 crc kubenswrapper[4743]: I0122 14:05:41.208429 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ed5a9e1-b17c-46c0-9d94-b3ae86e73acb-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6ed5a9e1-b17c-46c0-9d94-b3ae86e73acb\") " pod="openstack/glance-default-internal-api-0" Jan 22 14:05:41 crc kubenswrapper[4743]: I0122 14:05:41.208468 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ed5a9e1-b17c-46c0-9d94-b3ae86e73acb-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6ed5a9e1-b17c-46c0-9d94-b3ae86e73acb\") " pod="openstack/glance-default-internal-api-0" Jan 22 14:05:41 crc kubenswrapper[4743]: I0122 14:05:41.208498 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"6ed5a9e1-b17c-46c0-9d94-b3ae86e73acb\") " pod="openstack/glance-default-internal-api-0" Jan 22 14:05:41 crc kubenswrapper[4743]: I0122 14:05:41.208521 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6ed5a9e1-b17c-46c0-9d94-b3ae86e73acb-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6ed5a9e1-b17c-46c0-9d94-b3ae86e73acb\") " pod="openstack/glance-default-internal-api-0" Jan 22 14:05:41 crc kubenswrapper[4743]: I0122 14:05:41.209300 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"6ed5a9e1-b17c-46c0-9d94-b3ae86e73acb\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0" Jan 22 14:05:41 crc kubenswrapper[4743]: I0122 14:05:41.209878 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6ed5a9e1-b17c-46c0-9d94-b3ae86e73acb-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6ed5a9e1-b17c-46c0-9d94-b3ae86e73acb\") " pod="openstack/glance-default-internal-api-0" Jan 22 14:05:41 crc kubenswrapper[4743]: I0122 14:05:41.209887 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ed5a9e1-b17c-46c0-9d94-b3ae86e73acb-logs\") pod \"glance-default-internal-api-0\" (UID: \"6ed5a9e1-b17c-46c0-9d94-b3ae86e73acb\") " pod="openstack/glance-default-internal-api-0" Jan 22 14:05:41 crc kubenswrapper[4743]: I0122 14:05:41.212897 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/6ed5a9e1-b17c-46c0-9d94-b3ae86e73acb-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6ed5a9e1-b17c-46c0-9d94-b3ae86e73acb\") " pod="openstack/glance-default-internal-api-0" Jan 22 14:05:41 crc kubenswrapper[4743]: I0122 14:05:41.214783 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ed5a9e1-b17c-46c0-9d94-b3ae86e73acb-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6ed5a9e1-b17c-46c0-9d94-b3ae86e73acb\") " pod="openstack/glance-default-internal-api-0" Jan 22 14:05:41 crc kubenswrapper[4743]: I0122 14:05:41.215513 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ed5a9e1-b17c-46c0-9d94-b3ae86e73acb-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6ed5a9e1-b17c-46c0-9d94-b3ae86e73acb\") " pod="openstack/glance-default-internal-api-0" Jan 22 14:05:41 crc kubenswrapper[4743]: I0122 14:05:41.219506 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ed5a9e1-b17c-46c0-9d94-b3ae86e73acb-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6ed5a9e1-b17c-46c0-9d94-b3ae86e73acb\") " pod="openstack/glance-default-internal-api-0" Jan 22 14:05:41 crc kubenswrapper[4743]: I0122 14:05:41.231253 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hvj8\" (UniqueName: \"kubernetes.io/projected/6ed5a9e1-b17c-46c0-9d94-b3ae86e73acb-kube-api-access-5hvj8\") pod \"glance-default-internal-api-0\" (UID: \"6ed5a9e1-b17c-46c0-9d94-b3ae86e73acb\") " pod="openstack/glance-default-internal-api-0" Jan 22 14:05:41 crc kubenswrapper[4743]: I0122 14:05:41.252745 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"6ed5a9e1-b17c-46c0-9d94-b3ae86e73acb\") " pod="openstack/glance-default-internal-api-0" Jan 22 14:05:41 crc kubenswrapper[4743]: I0122 14:05:41.289217 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 22 14:05:41 crc kubenswrapper[4743]: I0122 14:05:41.762622 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de2c5f93-9ee3-4723-8123-bd48d5385423" path="/var/lib/kubelet/pods/de2c5f93-9ee3-4723-8123-bd48d5385423/volumes" Jan 22 14:05:41 crc kubenswrapper[4743]: I0122 14:05:41.763591 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eefba4cb-766b-45a4-b832-83c9ef83a30b" path="/var/lib/kubelet/pods/eefba4cb-766b-45a4-b832-83c9ef83a30b/volumes" Jan 22 14:05:41 crc kubenswrapper[4743]: I0122 14:05:41.819912 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 22 14:05:41 crc kubenswrapper[4743]: I0122 14:05:41.954937 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5247bc1b-998e-4275-9f4a-d3c30ff488b9","Type":"ContainerStarted","Data":"cc738139c1584ef118baa4bdbe5209ed45845fa629a2d5097ace6c290f247315"} Jan 22 14:05:41 crc kubenswrapper[4743]: I0122 14:05:41.955019 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5247bc1b-998e-4275-9f4a-d3c30ff488b9","Type":"ContainerStarted","Data":"32de8ac53b854ec23555d9e31df7f1d5b1b4276c0341cdcf53fef788267bf274"} Jan 22 14:05:41 crc kubenswrapper[4743]: I0122 14:05:41.959928 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6ed5a9e1-b17c-46c0-9d94-b3ae86e73acb","Type":"ContainerStarted","Data":"2feceb20cd78b20bb7b99303576a939605ec39cab2e16b31c7c19309920a1834"} Jan 22 14:05:41 crc kubenswrapper[4743]: I0122 14:05:41.991582 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.991564757 podStartE2EDuration="3.991564757s" podCreationTimestamp="2026-01-22 14:05:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:05:41.982417921 +0000 UTC m=+1178.537461084" watchObservedRunningTime="2026-01-22 14:05:41.991564757 +0000 UTC m=+1178.546607920" Jan 22 14:05:42 crc kubenswrapper[4743]: I0122 14:05:42.971424 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6838f1b4-d5cf-45a8-918b-a54911cbe4c6","Type":"ContainerStarted","Data":"a2e9ae6d0c224c20c2811b4fd87fb87f2710069ffebb91ab038109f8a36da85b"} Jan 22 14:05:42 crc kubenswrapper[4743]: I0122 14:05:42.977902 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6ed5a9e1-b17c-46c0-9d94-b3ae86e73acb","Type":"ContainerStarted","Data":"f142c15491eff62cd2f7b85339645f91172cb713b54fbbf65ef9079470dce024"} Jan 22 14:05:43 crc kubenswrapper[4743]: I0122 14:05:43.989558 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6ed5a9e1-b17c-46c0-9d94-b3ae86e73acb","Type":"ContainerStarted","Data":"558d3f6a974b003ebe56334d68ed65ea2b5ef1703bb97e3b06870187ee011aef"} Jan 22 14:05:44 crc kubenswrapper[4743]: I0122 14:05:44.020396 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.020372986 podStartE2EDuration="4.020372986s" podCreationTimestamp="2026-01-22 14:05:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:05:44.015549321 +0000 UTC m=+1180.570592494" watchObservedRunningTime="2026-01-22 14:05:44.020372986 +0000 UTC m=+1180.575416159" Jan 22 14:05:45 crc kubenswrapper[4743]: I0122 14:05:44.999881 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6838f1b4-d5cf-45a8-918b-a54911cbe4c6","Type":"ContainerStarted","Data":"db957e51e5458768aa95b89b102c4de4196179aea8fdbaff1cd32af07531155c"} Jan 22 14:05:45 crc kubenswrapper[4743]: I0122 14:05:45.000199 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6838f1b4-d5cf-45a8-918b-a54911cbe4c6" containerName="proxy-httpd" containerID="cri-o://db957e51e5458768aa95b89b102c4de4196179aea8fdbaff1cd32af07531155c" gracePeriod=30 Jan 22 14:05:45 crc kubenswrapper[4743]: I0122 14:05:45.000232 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6838f1b4-d5cf-45a8-918b-a54911cbe4c6" containerName="ceilometer-notification-agent" containerID="cri-o://b618f20ebbcd305064322e20b0e70b0150d02041263ed6f0b634771605c316f9" gracePeriod=30 Jan 22 14:05:45 crc kubenswrapper[4743]: I0122 14:05:45.000078 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6838f1b4-d5cf-45a8-918b-a54911cbe4c6" containerName="ceilometer-central-agent" containerID="cri-o://54c784e1c7c4455b82fc0913b353d81f958210df7650ee3d2761b2a101b6da45" gracePeriod=30 Jan 22 14:05:45 crc kubenswrapper[4743]: I0122 14:05:45.000167 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6838f1b4-d5cf-45a8-918b-a54911cbe4c6" containerName="sg-core" containerID="cri-o://a2e9ae6d0c224c20c2811b4fd87fb87f2710069ffebb91ab038109f8a36da85b" gracePeriod=30 Jan 22 14:05:45 crc kubenswrapper[4743]: I0122 14:05:45.027742 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.996217891 podStartE2EDuration="8.027723309s" podCreationTimestamp="2026-01-22 14:05:37 +0000 UTC" firstStartedPulling="2026-01-22 14:05:38.71165758 +0000 UTC m=+1175.266700743" lastFinishedPulling="2026-01-22 14:05:43.743162998 +0000 UTC m=+1180.298206161" observedRunningTime="2026-01-22 14:05:45.023611233 +0000 UTC m=+1181.578654406" watchObservedRunningTime="2026-01-22 14:05:45.027723309 +0000 UTC m=+1181.582766472" Jan 22 14:05:45 crc kubenswrapper[4743]: I0122 14:05:45.386836 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-hn577"] Jan 22 14:05:45 crc kubenswrapper[4743]: I0122 14:05:45.392481 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-hn577" Jan 22 14:05:45 crc kubenswrapper[4743]: I0122 14:05:45.394818 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Jan 22 14:05:45 crc kubenswrapper[4743]: I0122 14:05:45.397028 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-nbvpm" Jan 22 14:05:45 crc kubenswrapper[4743]: I0122 14:05:45.397193 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Jan 22 14:05:45 crc kubenswrapper[4743]: I0122 14:05:45.412181 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-hn577"] Jan 22 14:05:45 crc kubenswrapper[4743]: I0122 14:05:45.542017 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce464499-6235-4e9e-b2ef-02dcc568f613-config-data\") pod \"nova-cell0-conductor-db-sync-hn577\" (UID: \"ce464499-6235-4e9e-b2ef-02dcc568f613\") " pod="openstack/nova-cell0-conductor-db-sync-hn577" Jan 22 14:05:45 crc kubenswrapper[4743]: I0122 14:05:45.542110 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbtmz\" (UniqueName: \"kubernetes.io/projected/ce464499-6235-4e9e-b2ef-02dcc568f613-kube-api-access-tbtmz\") pod \"nova-cell0-conductor-db-sync-hn577\" (UID: \"ce464499-6235-4e9e-b2ef-02dcc568f613\") " pod="openstack/nova-cell0-conductor-db-sync-hn577" Jan 22 14:05:45 crc kubenswrapper[4743]: I0122 14:05:45.542132 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce464499-6235-4e9e-b2ef-02dcc568f613-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-hn577\" (UID: \"ce464499-6235-4e9e-b2ef-02dcc568f613\") " pod="openstack/nova-cell0-conductor-db-sync-hn577" Jan 22 14:05:45 crc kubenswrapper[4743]: I0122 14:05:45.542286 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce464499-6235-4e9e-b2ef-02dcc568f613-scripts\") pod \"nova-cell0-conductor-db-sync-hn577\" (UID: \"ce464499-6235-4e9e-b2ef-02dcc568f613\") " pod="openstack/nova-cell0-conductor-db-sync-hn577" Jan 22 14:05:45 crc kubenswrapper[4743]: I0122 14:05:45.644130 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbtmz\" (UniqueName: \"kubernetes.io/projected/ce464499-6235-4e9e-b2ef-02dcc568f613-kube-api-access-tbtmz\") pod \"nova-cell0-conductor-db-sync-hn577\" (UID: \"ce464499-6235-4e9e-b2ef-02dcc568f613\") " pod="openstack/nova-cell0-conductor-db-sync-hn577" Jan 22 14:05:45 crc kubenswrapper[4743]: I0122 14:05:45.644239 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce464499-6235-4e9e-b2ef-02dcc568f613-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-hn577\" (UID: \"ce464499-6235-4e9e-b2ef-02dcc568f613\") " pod="openstack/nova-cell0-conductor-db-sync-hn577" Jan 22 14:05:45 crc kubenswrapper[4743]: I0122 14:05:45.644297 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce464499-6235-4e9e-b2ef-02dcc568f613-scripts\") pod \"nova-cell0-conductor-db-sync-hn577\" (UID: 
\"ce464499-6235-4e9e-b2ef-02dcc568f613\") " pod="openstack/nova-cell0-conductor-db-sync-hn577" Jan 22 14:05:45 crc kubenswrapper[4743]: I0122 14:05:45.644427 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce464499-6235-4e9e-b2ef-02dcc568f613-config-data\") pod \"nova-cell0-conductor-db-sync-hn577\" (UID: \"ce464499-6235-4e9e-b2ef-02dcc568f613\") " pod="openstack/nova-cell0-conductor-db-sync-hn577" Jan 22 14:05:45 crc kubenswrapper[4743]: I0122 14:05:45.650596 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce464499-6235-4e9e-b2ef-02dcc568f613-config-data\") pod \"nova-cell0-conductor-db-sync-hn577\" (UID: \"ce464499-6235-4e9e-b2ef-02dcc568f613\") " pod="openstack/nova-cell0-conductor-db-sync-hn577" Jan 22 14:05:45 crc kubenswrapper[4743]: I0122 14:05:45.655327 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce464499-6235-4e9e-b2ef-02dcc568f613-scripts\") pod \"nova-cell0-conductor-db-sync-hn577\" (UID: \"ce464499-6235-4e9e-b2ef-02dcc568f613\") " pod="openstack/nova-cell0-conductor-db-sync-hn577" Jan 22 14:05:45 crc kubenswrapper[4743]: I0122 14:05:45.656554 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce464499-6235-4e9e-b2ef-02dcc568f613-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-hn577\" (UID: \"ce464499-6235-4e9e-b2ef-02dcc568f613\") " pod="openstack/nova-cell0-conductor-db-sync-hn577" Jan 22 14:05:45 crc kubenswrapper[4743]: I0122 14:05:45.661966 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbtmz\" (UniqueName: \"kubernetes.io/projected/ce464499-6235-4e9e-b2ef-02dcc568f613-kube-api-access-tbtmz\") pod \"nova-cell0-conductor-db-sync-hn577\" (UID: \"ce464499-6235-4e9e-b2ef-02dcc568f613\") " pod="openstack/nova-cell0-conductor-db-sync-hn577" Jan 22 14:05:45 crc kubenswrapper[4743]: I0122 14:05:45.767251 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-hn577" Jan 22 14:05:46 crc kubenswrapper[4743]: I0122 14:05:46.014748 4743 generic.go:334] "Generic (PLEG): container finished" podID="6838f1b4-d5cf-45a8-918b-a54911cbe4c6" containerID="db957e51e5458768aa95b89b102c4de4196179aea8fdbaff1cd32af07531155c" exitCode=0 Jan 22 14:05:46 crc kubenswrapper[4743]: I0122 14:05:46.014800 4743 generic.go:334] "Generic (PLEG): container finished" podID="6838f1b4-d5cf-45a8-918b-a54911cbe4c6" containerID="a2e9ae6d0c224c20c2811b4fd87fb87f2710069ffebb91ab038109f8a36da85b" exitCode=2 Jan 22 14:05:46 crc kubenswrapper[4743]: I0122 14:05:46.014811 4743 generic.go:334] "Generic (PLEG): container finished" podID="6838f1b4-d5cf-45a8-918b-a54911cbe4c6" containerID="b618f20ebbcd305064322e20b0e70b0150d02041263ed6f0b634771605c316f9" exitCode=0 Jan 22 14:05:46 crc kubenswrapper[4743]: I0122 14:05:46.014834 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6838f1b4-d5cf-45a8-918b-a54911cbe4c6","Type":"ContainerDied","Data":"db957e51e5458768aa95b89b102c4de4196179aea8fdbaff1cd32af07531155c"} Jan 22 14:05:46 crc kubenswrapper[4743]: I0122 14:05:46.014865 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6838f1b4-d5cf-45a8-918b-a54911cbe4c6","Type":"ContainerDied","Data":"a2e9ae6d0c224c20c2811b4fd87fb87f2710069ffebb91ab038109f8a36da85b"} Jan 22 14:05:46 crc kubenswrapper[4743]: I0122 14:05:46.014877 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6838f1b4-d5cf-45a8-918b-a54911cbe4c6","Type":"ContainerDied","Data":"b618f20ebbcd305064322e20b0e70b0150d02041263ed6f0b634771605c316f9"} Jan 22 14:05:46 crc kubenswrapper[4743]: I0122 14:05:46.224930 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-hn577"] Jan 22 14:05:46 crc kubenswrapper[4743]: W0122 14:05:46.228848 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce464499_6235_4e9e_b2ef_02dcc568f613.slice/crio-baa3adfb93565005767f2721d758f7753ad54af24c3a66d3082cf2db42a56260 WatchSource:0}: Error finding container baa3adfb93565005767f2721d758f7753ad54af24c3a66d3082cf2db42a56260: Status 404 returned error can't find the container with id baa3adfb93565005767f2721d758f7753ad54af24c3a66d3082cf2db42a56260 Jan 22 14:05:47 crc kubenswrapper[4743]: I0122 14:05:47.035481 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-hn577" event={"ID":"ce464499-6235-4e9e-b2ef-02dcc568f613","Type":"ContainerStarted","Data":"baa3adfb93565005767f2721d758f7753ad54af24c3a66d3082cf2db42a56260"} Jan 22 14:05:48 crc kubenswrapper[4743]: I0122 14:05:48.048484 4743 generic.go:334] "Generic (PLEG): container finished" podID="6838f1b4-d5cf-45a8-918b-a54911cbe4c6" containerID="54c784e1c7c4455b82fc0913b353d81f958210df7650ee3d2761b2a101b6da45" exitCode=0 Jan 22 14:05:48 crc kubenswrapper[4743]: I0122 14:05:48.048575 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6838f1b4-d5cf-45a8-918b-a54911cbe4c6","Type":"ContainerDied","Data":"54c784e1c7c4455b82fc0913b353d81f958210df7650ee3d2761b2a101b6da45"} Jan 22 14:05:48 crc kubenswrapper[4743]: I0122 14:05:48.185554 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 22 14:05:48 crc kubenswrapper[4743]: I0122 14:05:48.289504 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6838f1b4-d5cf-45a8-918b-a54911cbe4c6-sg-core-conf-yaml\") pod \"6838f1b4-d5cf-45a8-918b-a54911cbe4c6\" (UID: \"6838f1b4-d5cf-45a8-918b-a54911cbe4c6\") " Jan 22 14:05:48 crc kubenswrapper[4743]: I0122 14:05:48.289596 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6838f1b4-d5cf-45a8-918b-a54911cbe4c6-scripts\") pod \"6838f1b4-d5cf-45a8-918b-a54911cbe4c6\" (UID: \"6838f1b4-d5cf-45a8-918b-a54911cbe4c6\") " Jan 22 14:05:48 crc kubenswrapper[4743]: I0122 14:05:48.289636 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6838f1b4-d5cf-45a8-918b-a54911cbe4c6-config-data\") pod \"6838f1b4-d5cf-45a8-918b-a54911cbe4c6\" (UID: \"6838f1b4-d5cf-45a8-918b-a54911cbe4c6\") " Jan 22 14:05:48 crc kubenswrapper[4743]: I0122 14:05:48.289693 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6838f1b4-d5cf-45a8-918b-a54911cbe4c6-combined-ca-bundle\") pod \"6838f1b4-d5cf-45a8-918b-a54911cbe4c6\" (UID: \"6838f1b4-d5cf-45a8-918b-a54911cbe4c6\") " Jan 22 14:05:48 crc kubenswrapper[4743]: I0122 14:05:48.289732 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6838f1b4-d5cf-45a8-918b-a54911cbe4c6-log-httpd\") pod \"6838f1b4-d5cf-45a8-918b-a54911cbe4c6\" (UID: \"6838f1b4-d5cf-45a8-918b-a54911cbe4c6\") " Jan 22 14:05:48 crc kubenswrapper[4743]: I0122 14:05:48.289766 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5972\" (UniqueName: \"kubernetes.io/projected/6838f1b4-d5cf-45a8-918b-a54911cbe4c6-kube-api-access-k5972\") pod \"6838f1b4-d5cf-45a8-918b-a54911cbe4c6\" (UID: \"6838f1b4-d5cf-45a8-918b-a54911cbe4c6\") " Jan 22 14:05:48 crc kubenswrapper[4743]: I0122 14:05:48.289818 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6838f1b4-d5cf-45a8-918b-a54911cbe4c6-run-httpd\") pod \"6838f1b4-d5cf-45a8-918b-a54911cbe4c6\" (UID: \"6838f1b4-d5cf-45a8-918b-a54911cbe4c6\") " Jan 22 14:05:48 crc kubenswrapper[4743]: I0122 14:05:48.290356 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6838f1b4-d5cf-45a8-918b-a54911cbe4c6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6838f1b4-d5cf-45a8-918b-a54911cbe4c6" (UID: "6838f1b4-d5cf-45a8-918b-a54911cbe4c6"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:05:48 crc kubenswrapper[4743]: I0122 14:05:48.290662 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6838f1b4-d5cf-45a8-918b-a54911cbe4c6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6838f1b4-d5cf-45a8-918b-a54911cbe4c6" (UID: "6838f1b4-d5cf-45a8-918b-a54911cbe4c6"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:05:48 crc kubenswrapper[4743]: I0122 14:05:48.295382 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6838f1b4-d5cf-45a8-918b-a54911cbe4c6-scripts" (OuterVolumeSpecName: "scripts") pod "6838f1b4-d5cf-45a8-918b-a54911cbe4c6" (UID: "6838f1b4-d5cf-45a8-918b-a54911cbe4c6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:05:48 crc kubenswrapper[4743]: I0122 14:05:48.301950 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6838f1b4-d5cf-45a8-918b-a54911cbe4c6-kube-api-access-k5972" (OuterVolumeSpecName: "kube-api-access-k5972") pod "6838f1b4-d5cf-45a8-918b-a54911cbe4c6" (UID: "6838f1b4-d5cf-45a8-918b-a54911cbe4c6"). InnerVolumeSpecName "kube-api-access-k5972". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:05:48 crc kubenswrapper[4743]: I0122 14:05:48.329836 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6838f1b4-d5cf-45a8-918b-a54911cbe4c6-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6838f1b4-d5cf-45a8-918b-a54911cbe4c6" (UID: "6838f1b4-d5cf-45a8-918b-a54911cbe4c6"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:05:48 crc kubenswrapper[4743]: I0122 14:05:48.392592 4743 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6838f1b4-d5cf-45a8-918b-a54911cbe4c6-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:48 crc kubenswrapper[4743]: I0122 14:05:48.392624 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6838f1b4-d5cf-45a8-918b-a54911cbe4c6-scripts\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:48 crc kubenswrapper[4743]: I0122 14:05:48.392633 4743 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6838f1b4-d5cf-45a8-918b-a54911cbe4c6-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:48 crc kubenswrapper[4743]: I0122 14:05:48.392644 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5972\" (UniqueName: \"kubernetes.io/projected/6838f1b4-d5cf-45a8-918b-a54911cbe4c6-kube-api-access-k5972\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:48 crc kubenswrapper[4743]: I0122 14:05:48.392655 4743 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6838f1b4-d5cf-45a8-918b-a54911cbe4c6-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:48 crc kubenswrapper[4743]: I0122 14:05:48.406996 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6838f1b4-d5cf-45a8-918b-a54911cbe4c6-config-data" (OuterVolumeSpecName: "config-data") pod "6838f1b4-d5cf-45a8-918b-a54911cbe4c6" (UID: "6838f1b4-d5cf-45a8-918b-a54911cbe4c6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:05:48 crc kubenswrapper[4743]: I0122 14:05:48.416483 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6838f1b4-d5cf-45a8-918b-a54911cbe4c6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6838f1b4-d5cf-45a8-918b-a54911cbe4c6" (UID: "6838f1b4-d5cf-45a8-918b-a54911cbe4c6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:05:48 crc kubenswrapper[4743]: I0122 14:05:48.494718 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6838f1b4-d5cf-45a8-918b-a54911cbe4c6-config-data\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:48 crc kubenswrapper[4743]: I0122 14:05:48.494753 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6838f1b4-d5cf-45a8-918b-a54911cbe4c6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 14:05:49 crc kubenswrapper[4743]: I0122 14:05:49.062272 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6838f1b4-d5cf-45a8-918b-a54911cbe4c6","Type":"ContainerDied","Data":"6cd511dcfbc7fd8dfd3bf143456fc33eef471995bfd0118b9fbcefec9218a70b"} Jan 22 14:05:49 crc kubenswrapper[4743]: I0122 14:05:49.062349 4743 scope.go:117] "RemoveContainer" containerID="db957e51e5458768aa95b89b102c4de4196179aea8fdbaff1cd32af07531155c" Jan 22 14:05:49 crc kubenswrapper[4743]: I0122 14:05:49.062410 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 22 14:05:49 crc kubenswrapper[4743]: I0122 14:05:49.100684 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 22 14:05:49 crc kubenswrapper[4743]: I0122 14:05:49.115173 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 22 14:05:49 crc kubenswrapper[4743]: I0122 14:05:49.124292 4743 scope.go:117] "RemoveContainer" containerID="a2e9ae6d0c224c20c2811b4fd87fb87f2710069ffebb91ab038109f8a36da85b" Jan 22 14:05:49 crc kubenswrapper[4743]: I0122 14:05:49.125918 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 22 14:05:49 crc kubenswrapper[4743]: E0122 14:05:49.126259 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6838f1b4-d5cf-45a8-918b-a54911cbe4c6" containerName="ceilometer-central-agent" Jan 22 14:05:49 crc kubenswrapper[4743]: I0122 14:05:49.126277 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="6838f1b4-d5cf-45a8-918b-a54911cbe4c6" containerName="ceilometer-central-agent" Jan 22 14:05:49 crc kubenswrapper[4743]: E0122 14:05:49.126300 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6838f1b4-d5cf-45a8-918b-a54911cbe4c6" containerName="proxy-httpd" Jan 22 14:05:49 crc kubenswrapper[4743]: I0122 14:05:49.126308 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="6838f1b4-d5cf-45a8-918b-a54911cbe4c6" containerName="proxy-httpd" Jan 22 14:05:49 crc kubenswrapper[4743]: E0122 14:05:49.126324 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6838f1b4-d5cf-45a8-918b-a54911cbe4c6" containerName="sg-core" Jan 22 14:05:49 crc kubenswrapper[4743]: I0122 14:05:49.126330 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="6838f1b4-d5cf-45a8-918b-a54911cbe4c6" containerName="sg-core" Jan 22 14:05:49 crc kubenswrapper[4743]: E0122 14:05:49.126344 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6838f1b4-d5cf-45a8-918b-a54911cbe4c6" containerName="ceilometer-notification-agent" Jan 22 14:05:49 crc kubenswrapper[4743]: I0122 14:05:49.126350 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="6838f1b4-d5cf-45a8-918b-a54911cbe4c6" containerName="ceilometer-notification-agent" Jan 22 14:05:49 crc kubenswrapper[4743]: I0122 14:05:49.126501 4743 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="6838f1b4-d5cf-45a8-918b-a54911cbe4c6" containerName="sg-core" Jan 22 14:05:49 crc kubenswrapper[4743]: I0122 14:05:49.126523 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="6838f1b4-d5cf-45a8-918b-a54911cbe4c6" containerName="ceilometer-central-agent" Jan 22 14:05:49 crc kubenswrapper[4743]: I0122 14:05:49.126536 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="6838f1b4-d5cf-45a8-918b-a54911cbe4c6" containerName="proxy-httpd" Jan 22 14:05:49 crc kubenswrapper[4743]: I0122 14:05:49.126550 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="6838f1b4-d5cf-45a8-918b-a54911cbe4c6" containerName="ceilometer-notification-agent" Jan 22 14:05:49 crc kubenswrapper[4743]: I0122 14:05:49.128371 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 22 14:05:49 crc kubenswrapper[4743]: I0122 14:05:49.131479 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 22 14:05:49 crc kubenswrapper[4743]: I0122 14:05:49.131685 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 22 14:05:49 crc kubenswrapper[4743]: I0122 14:05:49.138223 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 22 14:05:49 crc kubenswrapper[4743]: I0122 14:05:49.193313 4743 scope.go:117] "RemoveContainer" containerID="b618f20ebbcd305064322e20b0e70b0150d02041263ed6f0b634771605c316f9" Jan 22 14:05:49 crc kubenswrapper[4743]: I0122 14:05:49.213933 4743 scope.go:117] "RemoveContainer" containerID="54c784e1c7c4455b82fc0913b353d81f958210df7650ee3d2761b2a101b6da45" Jan 22 14:05:49 crc kubenswrapper[4743]: I0122 14:05:49.217495 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4wmq\" (UniqueName: \"kubernetes.io/projected/dd53bb4c-3485-4ae2-b5d9-a197d017363d-kube-api-access-d4wmq\") pod \"ceilometer-0\" (UID: \"dd53bb4c-3485-4ae2-b5d9-a197d017363d\") " pod="openstack/ceilometer-0" Jan 22 14:05:49 crc kubenswrapper[4743]: I0122 14:05:49.217570 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd53bb4c-3485-4ae2-b5d9-a197d017363d-run-httpd\") pod \"ceilometer-0\" (UID: \"dd53bb4c-3485-4ae2-b5d9-a197d017363d\") " pod="openstack/ceilometer-0" Jan 22 14:05:49 crc kubenswrapper[4743]: I0122 14:05:49.217651 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd53bb4c-3485-4ae2-b5d9-a197d017363d-scripts\") pod \"ceilometer-0\" (UID: \"dd53bb4c-3485-4ae2-b5d9-a197d017363d\") " pod="openstack/ceilometer-0" Jan 22 14:05:49 crc kubenswrapper[4743]: I0122 14:05:49.217722 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd53bb4c-3485-4ae2-b5d9-a197d017363d-log-httpd\") pod \"ceilometer-0\" (UID: \"dd53bb4c-3485-4ae2-b5d9-a197d017363d\") " pod="openstack/ceilometer-0" Jan 22 14:05:49 crc kubenswrapper[4743]: I0122 14:05:49.217769 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dd53bb4c-3485-4ae2-b5d9-a197d017363d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"dd53bb4c-3485-4ae2-b5d9-a197d017363d\") " pod="openstack/ceilometer-0" Jan 22 14:05:49 crc kubenswrapper[4743]: I0122 14:05:49.217864 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd53bb4c-3485-4ae2-b5d9-a197d017363d-config-data\") pod \"ceilometer-0\" (UID: \"dd53bb4c-3485-4ae2-b5d9-a197d017363d\") " pod="openstack/ceilometer-0" Jan 22 14:05:49 crc kubenswrapper[4743]: I0122 14:05:49.218064 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd53bb4c-3485-4ae2-b5d9-a197d017363d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dd53bb4c-3485-4ae2-b5d9-a197d017363d\") " pod="openstack/ceilometer-0" Jan 22 14:05:49 crc kubenswrapper[4743]: I0122 14:05:49.318556 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd53bb4c-3485-4ae2-b5d9-a197d017363d-run-httpd\") pod \"ceilometer-0\" (UID: \"dd53bb4c-3485-4ae2-b5d9-a197d017363d\") " pod="openstack/ceilometer-0" Jan 22 14:05:49 crc kubenswrapper[4743]: I0122 14:05:49.318641 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd53bb4c-3485-4ae2-b5d9-a197d017363d-scripts\") pod \"ceilometer-0\" (UID: \"dd53bb4c-3485-4ae2-b5d9-a197d017363d\") " pod="openstack/ceilometer-0" Jan 22 14:05:49 crc kubenswrapper[4743]: I0122 14:05:49.318687 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd53bb4c-3485-4ae2-b5d9-a197d017363d-log-httpd\") pod \"ceilometer-0\" (UID: \"dd53bb4c-3485-4ae2-b5d9-a197d017363d\") " pod="openstack/ceilometer-0" Jan 22 14:05:49 crc kubenswrapper[4743]: I0122 14:05:49.318713 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dd53bb4c-3485-4ae2-b5d9-a197d017363d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dd53bb4c-3485-4ae2-b5d9-a197d017363d\") " pod="openstack/ceilometer-0" Jan 22 14:05:49 crc kubenswrapper[4743]: I0122 14:05:49.318732 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd53bb4c-3485-4ae2-b5d9-a197d017363d-config-data\") pod \"ceilometer-0\" (UID: \"dd53bb4c-3485-4ae2-b5d9-a197d017363d\") " pod="openstack/ceilometer-0" Jan 22 14:05:49 crc kubenswrapper[4743]: I0122 14:05:49.318748 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd53bb4c-3485-4ae2-b5d9-a197d017363d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dd53bb4c-3485-4ae2-b5d9-a197d017363d\") " pod="openstack/ceilometer-0" Jan 22 14:05:49 crc kubenswrapper[4743]: I0122 14:05:49.318783 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4wmq\" (UniqueName: \"kubernetes.io/projected/dd53bb4c-3485-4ae2-b5d9-a197d017363d-kube-api-access-d4wmq\") pod \"ceilometer-0\" (UID: \"dd53bb4c-3485-4ae2-b5d9-a197d017363d\") " pod="openstack/ceilometer-0" Jan 22 14:05:49 crc kubenswrapper[4743]: I0122 14:05:49.319567 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd53bb4c-3485-4ae2-b5d9-a197d017363d-run-httpd\") pod 
\"ceilometer-0\" (UID: \"dd53bb4c-3485-4ae2-b5d9-a197d017363d\") " pod="openstack/ceilometer-0" Jan 22 14:05:49 crc kubenswrapper[4743]: I0122 14:05:49.320826 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd53bb4c-3485-4ae2-b5d9-a197d017363d-log-httpd\") pod \"ceilometer-0\" (UID: \"dd53bb4c-3485-4ae2-b5d9-a197d017363d\") " pod="openstack/ceilometer-0" Jan 22 14:05:49 crc kubenswrapper[4743]: I0122 14:05:49.324339 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dd53bb4c-3485-4ae2-b5d9-a197d017363d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dd53bb4c-3485-4ae2-b5d9-a197d017363d\") " pod="openstack/ceilometer-0" Jan 22 14:05:49 crc kubenswrapper[4743]: I0122 14:05:49.324757 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd53bb4c-3485-4ae2-b5d9-a197d017363d-scripts\") pod \"ceilometer-0\" (UID: \"dd53bb4c-3485-4ae2-b5d9-a197d017363d\") " pod="openstack/ceilometer-0" Jan 22 14:05:49 crc kubenswrapper[4743]: I0122 14:05:49.325058 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd53bb4c-3485-4ae2-b5d9-a197d017363d-config-data\") pod \"ceilometer-0\" (UID: \"dd53bb4c-3485-4ae2-b5d9-a197d017363d\") " pod="openstack/ceilometer-0" Jan 22 14:05:49 crc kubenswrapper[4743]: I0122 14:05:49.339184 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd53bb4c-3485-4ae2-b5d9-a197d017363d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dd53bb4c-3485-4ae2-b5d9-a197d017363d\") " pod="openstack/ceilometer-0" Jan 22 14:05:49 crc kubenswrapper[4743]: I0122 14:05:49.350048 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4wmq\" (UniqueName: \"kubernetes.io/projected/dd53bb4c-3485-4ae2-b5d9-a197d017363d-kube-api-access-d4wmq\") pod \"ceilometer-0\" (UID: \"dd53bb4c-3485-4ae2-b5d9-a197d017363d\") " pod="openstack/ceilometer-0" Jan 22 14:05:49 crc kubenswrapper[4743]: I0122 14:05:49.498721 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 22 14:05:49 crc kubenswrapper[4743]: I0122 14:05:49.498776 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 22 14:05:49 crc kubenswrapper[4743]: I0122 14:05:49.509147 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 22 14:05:49 crc kubenswrapper[4743]: I0122 14:05:49.553331 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 22 14:05:49 crc kubenswrapper[4743]: I0122 14:05:49.555419 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 22 14:05:49 crc kubenswrapper[4743]: I0122 14:05:49.762186 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6838f1b4-d5cf-45a8-918b-a54911cbe4c6" path="/var/lib/kubelet/pods/6838f1b4-d5cf-45a8-918b-a54911cbe4c6/volumes" Jan 22 14:05:49 crc kubenswrapper[4743]: I0122 14:05:49.996816 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 22 14:05:50 crc kubenswrapper[4743]: I0122 14:05:50.071964 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 22 14:05:50 crc kubenswrapper[4743]: I0122 14:05:50.072002 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 22 14:05:51 crc kubenswrapper[4743]: I0122 14:05:51.289753 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 22 14:05:51 crc kubenswrapper[4743]: I0122 14:05:51.290251 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 22 14:05:51 crc kubenswrapper[4743]: I0122 14:05:51.354511 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 22 14:05:51 crc kubenswrapper[4743]: I0122 14:05:51.358835 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 22 14:05:52 crc kubenswrapper[4743]: I0122 14:05:52.086757 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 22 14:05:52 crc kubenswrapper[4743]: I0122 14:05:52.087073 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 22 14:05:52 crc kubenswrapper[4743]: I0122 14:05:52.275926 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 22 14:05:52 crc kubenswrapper[4743]: I0122 14:05:52.276029 4743 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 22 14:05:52 crc kubenswrapper[4743]: I0122 14:05:52.419136 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 22 14:05:54 crc kubenswrapper[4743]: I0122 14:05:54.129466 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 22 14:05:54 crc kubenswrapper[4743]: I0122 14:05:54.129874 4743 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 22 14:05:54 crc kubenswrapper[4743]: I0122 14:05:54.300285 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 22 14:05:56 crc kubenswrapper[4743]: I0122 14:05:56.124871 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-hn577" 
event={"ID":"ce464499-6235-4e9e-b2ef-02dcc568f613","Type":"ContainerStarted","Data":"7a7f8a69879f301768b68f718413117c84dd5bc97ec0ad8caa39ec95a947b452"} Jan 22 14:05:56 crc kubenswrapper[4743]: I0122 14:05:56.128665 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd53bb4c-3485-4ae2-b5d9-a197d017363d","Type":"ContainerStarted","Data":"0c2acad13305fec5bdac54a5b540d9dccb4e909a0de2f1f07026510d7b8cceb7"} Jan 22 14:05:56 crc kubenswrapper[4743]: I0122 14:05:56.142313 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-hn577" podStartSLOduration=1.617460688 podStartE2EDuration="11.142293488s" podCreationTimestamp="2026-01-22 14:05:45 +0000 UTC" firstStartedPulling="2026-01-22 14:05:46.231854935 +0000 UTC m=+1182.786898108" lastFinishedPulling="2026-01-22 14:05:55.756687745 +0000 UTC m=+1192.311730908" observedRunningTime="2026-01-22 14:05:56.140271786 +0000 UTC m=+1192.695314949" watchObservedRunningTime="2026-01-22 14:05:56.142293488 +0000 UTC m=+1192.697336661" Jan 22 14:05:57 crc kubenswrapper[4743]: I0122 14:05:57.144742 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd53bb4c-3485-4ae2-b5d9-a197d017363d","Type":"ContainerStarted","Data":"b7a7800725502a485d9848ecef700d423482d076ccad3c647c6729d4ae0005ea"} Jan 22 14:05:58 crc kubenswrapper[4743]: I0122 14:05:58.157184 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd53bb4c-3485-4ae2-b5d9-a197d017363d","Type":"ContainerStarted","Data":"7b2018bc13222a70f8299ee95b17b934b4d25293a1fe49aacce669cb67a2aa50"} Jan 22 14:05:59 crc kubenswrapper[4743]: I0122 14:05:59.170501 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd53bb4c-3485-4ae2-b5d9-a197d017363d","Type":"ContainerStarted","Data":"f7c278c10ee4320f0ba0221fdb483926028290538935f3d241a86fc2e6515139"} Jan 22 14:06:01 crc kubenswrapper[4743]: I0122 14:06:01.191457 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd53bb4c-3485-4ae2-b5d9-a197d017363d","Type":"ContainerStarted","Data":"3d843ecd5660c29dc182178c0b132484fd4962392606dec42fa8114edb203c98"} Jan 22 14:06:01 crc kubenswrapper[4743]: I0122 14:06:01.196057 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 22 14:06:01 crc kubenswrapper[4743]: I0122 14:06:01.221231 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=7.885044817 podStartE2EDuration="12.2205115s" podCreationTimestamp="2026-01-22 14:05:49 +0000 UTC" firstStartedPulling="2026-01-22 14:05:55.685578602 +0000 UTC m=+1192.240621765" lastFinishedPulling="2026-01-22 14:06:00.021045285 +0000 UTC m=+1196.576088448" observedRunningTime="2026-01-22 14:06:01.211852037 +0000 UTC m=+1197.766895200" watchObservedRunningTime="2026-01-22 14:06:01.2205115 +0000 UTC m=+1197.775554663" Jan 22 14:06:06 crc kubenswrapper[4743]: I0122 14:06:06.504760 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 22 14:06:06 crc kubenswrapper[4743]: I0122 14:06:06.505557 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dd53bb4c-3485-4ae2-b5d9-a197d017363d" containerName="ceilometer-central-agent" containerID="cri-o://b7a7800725502a485d9848ecef700d423482d076ccad3c647c6729d4ae0005ea" gracePeriod=30 
Jan 22 14:06:06 crc kubenswrapper[4743]: I0122 14:06:06.506070 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dd53bb4c-3485-4ae2-b5d9-a197d017363d" containerName="proxy-httpd" containerID="cri-o://3d843ecd5660c29dc182178c0b132484fd4962392606dec42fa8114edb203c98" gracePeriod=30 Jan 22 14:06:06 crc kubenswrapper[4743]: I0122 14:06:06.506154 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dd53bb4c-3485-4ae2-b5d9-a197d017363d" containerName="ceilometer-notification-agent" containerID="cri-o://7b2018bc13222a70f8299ee95b17b934b4d25293a1fe49aacce669cb67a2aa50" gracePeriod=30 Jan 22 14:06:06 crc kubenswrapper[4743]: I0122 14:06:06.506274 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dd53bb4c-3485-4ae2-b5d9-a197d017363d" containerName="sg-core" containerID="cri-o://f7c278c10ee4320f0ba0221fdb483926028290538935f3d241a86fc2e6515139" gracePeriod=30 Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.244055 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.252250 4743 generic.go:334] "Generic (PLEG): container finished" podID="dd53bb4c-3485-4ae2-b5d9-a197d017363d" containerID="3d843ecd5660c29dc182178c0b132484fd4962392606dec42fa8114edb203c98" exitCode=0 Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.252280 4743 generic.go:334] "Generic (PLEG): container finished" podID="dd53bb4c-3485-4ae2-b5d9-a197d017363d" containerID="f7c278c10ee4320f0ba0221fdb483926028290538935f3d241a86fc2e6515139" exitCode=2 Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.252288 4743 generic.go:334] "Generic (PLEG): container finished" podID="dd53bb4c-3485-4ae2-b5d9-a197d017363d" containerID="7b2018bc13222a70f8299ee95b17b934b4d25293a1fe49aacce669cb67a2aa50" exitCode=0 Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.252305 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd53bb4c-3485-4ae2-b5d9-a197d017363d","Type":"ContainerDied","Data":"3d843ecd5660c29dc182178c0b132484fd4962392606dec42fa8114edb203c98"} Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.252326 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.252298 4743 generic.go:334] "Generic (PLEG): container finished" podID="dd53bb4c-3485-4ae2-b5d9-a197d017363d" containerID="b7a7800725502a485d9848ecef700d423482d076ccad3c647c6729d4ae0005ea" exitCode=0 Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.252352 4743 scope.go:117] "RemoveContainer" containerID="3d843ecd5660c29dc182178c0b132484fd4962392606dec42fa8114edb203c98" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.252342 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd53bb4c-3485-4ae2-b5d9-a197d017363d","Type":"ContainerDied","Data":"f7c278c10ee4320f0ba0221fdb483926028290538935f3d241a86fc2e6515139"} Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.252429 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd53bb4c-3485-4ae2-b5d9-a197d017363d","Type":"ContainerDied","Data":"7b2018bc13222a70f8299ee95b17b934b4d25293a1fe49aacce669cb67a2aa50"} Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.252439 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd53bb4c-3485-4ae2-b5d9-a197d017363d","Type":"ContainerDied","Data":"b7a7800725502a485d9848ecef700d423482d076ccad3c647c6729d4ae0005ea"} Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.252448 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd53bb4c-3485-4ae2-b5d9-a197d017363d","Type":"ContainerDied","Data":"0c2acad13305fec5bdac54a5b540d9dccb4e909a0de2f1f07026510d7b8cceb7"} Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.276809 4743 scope.go:117] "RemoveContainer" containerID="f7c278c10ee4320f0ba0221fdb483926028290538935f3d241a86fc2e6515139" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.302395 4743 scope.go:117] "RemoveContainer" containerID="7b2018bc13222a70f8299ee95b17b934b4d25293a1fe49aacce669cb67a2aa50" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.314020 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4wmq\" (UniqueName: \"kubernetes.io/projected/dd53bb4c-3485-4ae2-b5d9-a197d017363d-kube-api-access-d4wmq\") pod \"dd53bb4c-3485-4ae2-b5d9-a197d017363d\" (UID: \"dd53bb4c-3485-4ae2-b5d9-a197d017363d\") " Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.314064 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd53bb4c-3485-4ae2-b5d9-a197d017363d-log-httpd\") pod \"dd53bb4c-3485-4ae2-b5d9-a197d017363d\" (UID: \"dd53bb4c-3485-4ae2-b5d9-a197d017363d\") " Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.314104 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd53bb4c-3485-4ae2-b5d9-a197d017363d-scripts\") pod \"dd53bb4c-3485-4ae2-b5d9-a197d017363d\" (UID: \"dd53bb4c-3485-4ae2-b5d9-a197d017363d\") " Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.314124 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dd53bb4c-3485-4ae2-b5d9-a197d017363d-sg-core-conf-yaml\") pod \"dd53bb4c-3485-4ae2-b5d9-a197d017363d\" (UID: \"dd53bb4c-3485-4ae2-b5d9-a197d017363d\") " Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.314194 4743 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd53bb4c-3485-4ae2-b5d9-a197d017363d-combined-ca-bundle\") pod \"dd53bb4c-3485-4ae2-b5d9-a197d017363d\" (UID: \"dd53bb4c-3485-4ae2-b5d9-a197d017363d\") " Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.314212 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd53bb4c-3485-4ae2-b5d9-a197d017363d-config-data\") pod \"dd53bb4c-3485-4ae2-b5d9-a197d017363d\" (UID: \"dd53bb4c-3485-4ae2-b5d9-a197d017363d\") " Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.314303 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd53bb4c-3485-4ae2-b5d9-a197d017363d-run-httpd\") pod \"dd53bb4c-3485-4ae2-b5d9-a197d017363d\" (UID: \"dd53bb4c-3485-4ae2-b5d9-a197d017363d\") " Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.315019 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd53bb4c-3485-4ae2-b5d9-a197d017363d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "dd53bb4c-3485-4ae2-b5d9-a197d017363d" (UID: "dd53bb4c-3485-4ae2-b5d9-a197d017363d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.315140 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd53bb4c-3485-4ae2-b5d9-a197d017363d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "dd53bb4c-3485-4ae2-b5d9-a197d017363d" (UID: "dd53bb4c-3485-4ae2-b5d9-a197d017363d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.321598 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd53bb4c-3485-4ae2-b5d9-a197d017363d-scripts" (OuterVolumeSpecName: "scripts") pod "dd53bb4c-3485-4ae2-b5d9-a197d017363d" (UID: "dd53bb4c-3485-4ae2-b5d9-a197d017363d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.321960 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd53bb4c-3485-4ae2-b5d9-a197d017363d-kube-api-access-d4wmq" (OuterVolumeSpecName: "kube-api-access-d4wmq") pod "dd53bb4c-3485-4ae2-b5d9-a197d017363d" (UID: "dd53bb4c-3485-4ae2-b5d9-a197d017363d"). InnerVolumeSpecName "kube-api-access-d4wmq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.325151 4743 scope.go:117] "RemoveContainer" containerID="b7a7800725502a485d9848ecef700d423482d076ccad3c647c6729d4ae0005ea" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.339782 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd53bb4c-3485-4ae2-b5d9-a197d017363d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "dd53bb4c-3485-4ae2-b5d9-a197d017363d" (UID: "dd53bb4c-3485-4ae2-b5d9-a197d017363d"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.343832 4743 scope.go:117] "RemoveContainer" containerID="3d843ecd5660c29dc182178c0b132484fd4962392606dec42fa8114edb203c98" Jan 22 14:06:07 crc kubenswrapper[4743]: E0122 14:06:07.344273 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d843ecd5660c29dc182178c0b132484fd4962392606dec42fa8114edb203c98\": container with ID starting with 3d843ecd5660c29dc182178c0b132484fd4962392606dec42fa8114edb203c98 not found: ID does not exist" containerID="3d843ecd5660c29dc182178c0b132484fd4962392606dec42fa8114edb203c98" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.344455 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d843ecd5660c29dc182178c0b132484fd4962392606dec42fa8114edb203c98"} err="failed to get container status \"3d843ecd5660c29dc182178c0b132484fd4962392606dec42fa8114edb203c98\": rpc error: code = NotFound desc = could not find container \"3d843ecd5660c29dc182178c0b132484fd4962392606dec42fa8114edb203c98\": container with ID starting with 3d843ecd5660c29dc182178c0b132484fd4962392606dec42fa8114edb203c98 not found: ID does not exist" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.344555 4743 scope.go:117] "RemoveContainer" containerID="f7c278c10ee4320f0ba0221fdb483926028290538935f3d241a86fc2e6515139" Jan 22 14:06:07 crc kubenswrapper[4743]: E0122 14:06:07.344925 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7c278c10ee4320f0ba0221fdb483926028290538935f3d241a86fc2e6515139\": container with ID starting with f7c278c10ee4320f0ba0221fdb483926028290538935f3d241a86fc2e6515139 not found: ID does not exist" containerID="f7c278c10ee4320f0ba0221fdb483926028290538935f3d241a86fc2e6515139" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.345025 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7c278c10ee4320f0ba0221fdb483926028290538935f3d241a86fc2e6515139"} err="failed to get container status \"f7c278c10ee4320f0ba0221fdb483926028290538935f3d241a86fc2e6515139\": rpc error: code = NotFound desc = could not find container \"f7c278c10ee4320f0ba0221fdb483926028290538935f3d241a86fc2e6515139\": container with ID starting with f7c278c10ee4320f0ba0221fdb483926028290538935f3d241a86fc2e6515139 not found: ID does not exist" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.345113 4743 scope.go:117] "RemoveContainer" containerID="7b2018bc13222a70f8299ee95b17b934b4d25293a1fe49aacce669cb67a2aa50" Jan 22 14:06:07 crc kubenswrapper[4743]: E0122 14:06:07.346074 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b2018bc13222a70f8299ee95b17b934b4d25293a1fe49aacce669cb67a2aa50\": container with ID starting with 7b2018bc13222a70f8299ee95b17b934b4d25293a1fe49aacce669cb67a2aa50 not found: ID does not exist" containerID="7b2018bc13222a70f8299ee95b17b934b4d25293a1fe49aacce669cb67a2aa50" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.346108 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b2018bc13222a70f8299ee95b17b934b4d25293a1fe49aacce669cb67a2aa50"} err="failed to get container status \"7b2018bc13222a70f8299ee95b17b934b4d25293a1fe49aacce669cb67a2aa50\": rpc error: code = NotFound desc = could not 
find container \"7b2018bc13222a70f8299ee95b17b934b4d25293a1fe49aacce669cb67a2aa50\": container with ID starting with 7b2018bc13222a70f8299ee95b17b934b4d25293a1fe49aacce669cb67a2aa50 not found: ID does not exist" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.346127 4743 scope.go:117] "RemoveContainer" containerID="b7a7800725502a485d9848ecef700d423482d076ccad3c647c6729d4ae0005ea" Jan 22 14:06:07 crc kubenswrapper[4743]: E0122 14:06:07.346468 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7a7800725502a485d9848ecef700d423482d076ccad3c647c6729d4ae0005ea\": container with ID starting with b7a7800725502a485d9848ecef700d423482d076ccad3c647c6729d4ae0005ea not found: ID does not exist" containerID="b7a7800725502a485d9848ecef700d423482d076ccad3c647c6729d4ae0005ea" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.346563 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7a7800725502a485d9848ecef700d423482d076ccad3c647c6729d4ae0005ea"} err="failed to get container status \"b7a7800725502a485d9848ecef700d423482d076ccad3c647c6729d4ae0005ea\": rpc error: code = NotFound desc = could not find container \"b7a7800725502a485d9848ecef700d423482d076ccad3c647c6729d4ae0005ea\": container with ID starting with b7a7800725502a485d9848ecef700d423482d076ccad3c647c6729d4ae0005ea not found: ID does not exist" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.346652 4743 scope.go:117] "RemoveContainer" containerID="3d843ecd5660c29dc182178c0b132484fd4962392606dec42fa8114edb203c98" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.347109 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d843ecd5660c29dc182178c0b132484fd4962392606dec42fa8114edb203c98"} err="failed to get container status \"3d843ecd5660c29dc182178c0b132484fd4962392606dec42fa8114edb203c98\": rpc error: code = NotFound desc = could not find container \"3d843ecd5660c29dc182178c0b132484fd4962392606dec42fa8114edb203c98\": container with ID starting with 3d843ecd5660c29dc182178c0b132484fd4962392606dec42fa8114edb203c98 not found: ID does not exist" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.347130 4743 scope.go:117] "RemoveContainer" containerID="f7c278c10ee4320f0ba0221fdb483926028290538935f3d241a86fc2e6515139" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.347364 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7c278c10ee4320f0ba0221fdb483926028290538935f3d241a86fc2e6515139"} err="failed to get container status \"f7c278c10ee4320f0ba0221fdb483926028290538935f3d241a86fc2e6515139\": rpc error: code = NotFound desc = could not find container \"f7c278c10ee4320f0ba0221fdb483926028290538935f3d241a86fc2e6515139\": container with ID starting with f7c278c10ee4320f0ba0221fdb483926028290538935f3d241a86fc2e6515139 not found: ID does not exist" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.347388 4743 scope.go:117] "RemoveContainer" containerID="7b2018bc13222a70f8299ee95b17b934b4d25293a1fe49aacce669cb67a2aa50" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.347589 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b2018bc13222a70f8299ee95b17b934b4d25293a1fe49aacce669cb67a2aa50"} err="failed to get container status \"7b2018bc13222a70f8299ee95b17b934b4d25293a1fe49aacce669cb67a2aa50\": rpc error: code = NotFound desc = could not 
find container \"7b2018bc13222a70f8299ee95b17b934b4d25293a1fe49aacce669cb67a2aa50\": container with ID starting with 7b2018bc13222a70f8299ee95b17b934b4d25293a1fe49aacce669cb67a2aa50 not found: ID does not exist" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.347611 4743 scope.go:117] "RemoveContainer" containerID="b7a7800725502a485d9848ecef700d423482d076ccad3c647c6729d4ae0005ea" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.347952 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7a7800725502a485d9848ecef700d423482d076ccad3c647c6729d4ae0005ea"} err="failed to get container status \"b7a7800725502a485d9848ecef700d423482d076ccad3c647c6729d4ae0005ea\": rpc error: code = NotFound desc = could not find container \"b7a7800725502a485d9848ecef700d423482d076ccad3c647c6729d4ae0005ea\": container with ID starting with b7a7800725502a485d9848ecef700d423482d076ccad3c647c6729d4ae0005ea not found: ID does not exist" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.347974 4743 scope.go:117] "RemoveContainer" containerID="3d843ecd5660c29dc182178c0b132484fd4962392606dec42fa8114edb203c98" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.348195 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d843ecd5660c29dc182178c0b132484fd4962392606dec42fa8114edb203c98"} err="failed to get container status \"3d843ecd5660c29dc182178c0b132484fd4962392606dec42fa8114edb203c98\": rpc error: code = NotFound desc = could not find container \"3d843ecd5660c29dc182178c0b132484fd4962392606dec42fa8114edb203c98\": container with ID starting with 3d843ecd5660c29dc182178c0b132484fd4962392606dec42fa8114edb203c98 not found: ID does not exist" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.348218 4743 scope.go:117] "RemoveContainer" containerID="f7c278c10ee4320f0ba0221fdb483926028290538935f3d241a86fc2e6515139" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.348420 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7c278c10ee4320f0ba0221fdb483926028290538935f3d241a86fc2e6515139"} err="failed to get container status \"f7c278c10ee4320f0ba0221fdb483926028290538935f3d241a86fc2e6515139\": rpc error: code = NotFound desc = could not find container \"f7c278c10ee4320f0ba0221fdb483926028290538935f3d241a86fc2e6515139\": container with ID starting with f7c278c10ee4320f0ba0221fdb483926028290538935f3d241a86fc2e6515139 not found: ID does not exist" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.348484 4743 scope.go:117] "RemoveContainer" containerID="7b2018bc13222a70f8299ee95b17b934b4d25293a1fe49aacce669cb67a2aa50" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.348710 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b2018bc13222a70f8299ee95b17b934b4d25293a1fe49aacce669cb67a2aa50"} err="failed to get container status \"7b2018bc13222a70f8299ee95b17b934b4d25293a1fe49aacce669cb67a2aa50\": rpc error: code = NotFound desc = could not find container \"7b2018bc13222a70f8299ee95b17b934b4d25293a1fe49aacce669cb67a2aa50\": container with ID starting with 7b2018bc13222a70f8299ee95b17b934b4d25293a1fe49aacce669cb67a2aa50 not found: ID does not exist" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.348731 4743 scope.go:117] "RemoveContainer" containerID="b7a7800725502a485d9848ecef700d423482d076ccad3c647c6729d4ae0005ea" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.348985 4743 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7a7800725502a485d9848ecef700d423482d076ccad3c647c6729d4ae0005ea"} err="failed to get container status \"b7a7800725502a485d9848ecef700d423482d076ccad3c647c6729d4ae0005ea\": rpc error: code = NotFound desc = could not find container \"b7a7800725502a485d9848ecef700d423482d076ccad3c647c6729d4ae0005ea\": container with ID starting with b7a7800725502a485d9848ecef700d423482d076ccad3c647c6729d4ae0005ea not found: ID does not exist" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.349008 4743 scope.go:117] "RemoveContainer" containerID="3d843ecd5660c29dc182178c0b132484fd4962392606dec42fa8114edb203c98" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.349195 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d843ecd5660c29dc182178c0b132484fd4962392606dec42fa8114edb203c98"} err="failed to get container status \"3d843ecd5660c29dc182178c0b132484fd4962392606dec42fa8114edb203c98\": rpc error: code = NotFound desc = could not find container \"3d843ecd5660c29dc182178c0b132484fd4962392606dec42fa8114edb203c98\": container with ID starting with 3d843ecd5660c29dc182178c0b132484fd4962392606dec42fa8114edb203c98 not found: ID does not exist" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.349218 4743 scope.go:117] "RemoveContainer" containerID="f7c278c10ee4320f0ba0221fdb483926028290538935f3d241a86fc2e6515139" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.349386 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7c278c10ee4320f0ba0221fdb483926028290538935f3d241a86fc2e6515139"} err="failed to get container status \"f7c278c10ee4320f0ba0221fdb483926028290538935f3d241a86fc2e6515139\": rpc error: code = NotFound desc = could not find container \"f7c278c10ee4320f0ba0221fdb483926028290538935f3d241a86fc2e6515139\": container with ID starting with f7c278c10ee4320f0ba0221fdb483926028290538935f3d241a86fc2e6515139 not found: ID does not exist" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.349409 4743 scope.go:117] "RemoveContainer" containerID="7b2018bc13222a70f8299ee95b17b934b4d25293a1fe49aacce669cb67a2aa50" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.349631 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b2018bc13222a70f8299ee95b17b934b4d25293a1fe49aacce669cb67a2aa50"} err="failed to get container status \"7b2018bc13222a70f8299ee95b17b934b4d25293a1fe49aacce669cb67a2aa50\": rpc error: code = NotFound desc = could not find container \"7b2018bc13222a70f8299ee95b17b934b4d25293a1fe49aacce669cb67a2aa50\": container with ID starting with 7b2018bc13222a70f8299ee95b17b934b4d25293a1fe49aacce669cb67a2aa50 not found: ID does not exist" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.349652 4743 scope.go:117] "RemoveContainer" containerID="b7a7800725502a485d9848ecef700d423482d076ccad3c647c6729d4ae0005ea" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.349837 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7a7800725502a485d9848ecef700d423482d076ccad3c647c6729d4ae0005ea"} err="failed to get container status \"b7a7800725502a485d9848ecef700d423482d076ccad3c647c6729d4ae0005ea\": rpc error: code = NotFound desc = could not find container \"b7a7800725502a485d9848ecef700d423482d076ccad3c647c6729d4ae0005ea\": container with ID starting with 
b7a7800725502a485d9848ecef700d423482d076ccad3c647c6729d4ae0005ea not found: ID does not exist" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.383118 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd53bb4c-3485-4ae2-b5d9-a197d017363d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dd53bb4c-3485-4ae2-b5d9-a197d017363d" (UID: "dd53bb4c-3485-4ae2-b5d9-a197d017363d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.412328 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd53bb4c-3485-4ae2-b5d9-a197d017363d-config-data" (OuterVolumeSpecName: "config-data") pod "dd53bb4c-3485-4ae2-b5d9-a197d017363d" (UID: "dd53bb4c-3485-4ae2-b5d9-a197d017363d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.416548 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd53bb4c-3485-4ae2-b5d9-a197d017363d-scripts\") on node \"crc\" DevicePath \"\"" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.416584 4743 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dd53bb4c-3485-4ae2-b5d9-a197d017363d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.416601 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd53bb4c-3485-4ae2-b5d9-a197d017363d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.416616 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd53bb4c-3485-4ae2-b5d9-a197d017363d-config-data\") on node \"crc\" DevicePath \"\"" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.416627 4743 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd53bb4c-3485-4ae2-b5d9-a197d017363d-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.416639 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4wmq\" (UniqueName: \"kubernetes.io/projected/dd53bb4c-3485-4ae2-b5d9-a197d017363d-kube-api-access-d4wmq\") on node \"crc\" DevicePath \"\"" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.416652 4743 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd53bb4c-3485-4ae2-b5d9-a197d017363d-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.590896 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.601127 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.630714 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 22 14:06:07 crc kubenswrapper[4743]: E0122 14:06:07.631142 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd53bb4c-3485-4ae2-b5d9-a197d017363d" containerName="ceilometer-central-agent" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.631162 4743 
state_mem.go:107] "Deleted CPUSet assignment" podUID="dd53bb4c-3485-4ae2-b5d9-a197d017363d" containerName="ceilometer-central-agent" Jan 22 14:06:07 crc kubenswrapper[4743]: E0122 14:06:07.631172 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd53bb4c-3485-4ae2-b5d9-a197d017363d" containerName="sg-core" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.631178 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd53bb4c-3485-4ae2-b5d9-a197d017363d" containerName="sg-core" Jan 22 14:06:07 crc kubenswrapper[4743]: E0122 14:06:07.631190 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd53bb4c-3485-4ae2-b5d9-a197d017363d" containerName="proxy-httpd" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.631198 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd53bb4c-3485-4ae2-b5d9-a197d017363d" containerName="proxy-httpd" Jan 22 14:06:07 crc kubenswrapper[4743]: E0122 14:06:07.631230 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd53bb4c-3485-4ae2-b5d9-a197d017363d" containerName="ceilometer-notification-agent" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.631235 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd53bb4c-3485-4ae2-b5d9-a197d017363d" containerName="ceilometer-notification-agent" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.631391 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd53bb4c-3485-4ae2-b5d9-a197d017363d" containerName="ceilometer-central-agent" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.631405 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd53bb4c-3485-4ae2-b5d9-a197d017363d" containerName="ceilometer-notification-agent" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.631419 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd53bb4c-3485-4ae2-b5d9-a197d017363d" containerName="proxy-httpd" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.631435 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd53bb4c-3485-4ae2-b5d9-a197d017363d" containerName="sg-core" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.633041 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.634995 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.635640 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.640970 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.723008 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0af4ea6f-e824-4b5a-925c-699d0d342d5b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0af4ea6f-e824-4b5a-925c-699d0d342d5b\") " pod="openstack/ceilometer-0" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.723608 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gh9m\" (UniqueName: \"kubernetes.io/projected/0af4ea6f-e824-4b5a-925c-699d0d342d5b-kube-api-access-4gh9m\") pod \"ceilometer-0\" (UID: \"0af4ea6f-e824-4b5a-925c-699d0d342d5b\") " pod="openstack/ceilometer-0" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.723830 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0af4ea6f-e824-4b5a-925c-699d0d342d5b-config-data\") pod \"ceilometer-0\" (UID: \"0af4ea6f-e824-4b5a-925c-699d0d342d5b\") " pod="openstack/ceilometer-0" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.723945 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0af4ea6f-e824-4b5a-925c-699d0d342d5b-scripts\") pod \"ceilometer-0\" (UID: \"0af4ea6f-e824-4b5a-925c-699d0d342d5b\") " pod="openstack/ceilometer-0" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.724141 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0af4ea6f-e824-4b5a-925c-699d0d342d5b-run-httpd\") pod \"ceilometer-0\" (UID: \"0af4ea6f-e824-4b5a-925c-699d0d342d5b\") " pod="openstack/ceilometer-0" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.724312 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0af4ea6f-e824-4b5a-925c-699d0d342d5b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0af4ea6f-e824-4b5a-925c-699d0d342d5b\") " pod="openstack/ceilometer-0" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.724417 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0af4ea6f-e824-4b5a-925c-699d0d342d5b-log-httpd\") pod \"ceilometer-0\" (UID: \"0af4ea6f-e824-4b5a-925c-699d0d342d5b\") " pod="openstack/ceilometer-0" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.757341 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd53bb4c-3485-4ae2-b5d9-a197d017363d" path="/var/lib/kubelet/pods/dd53bb4c-3485-4ae2-b5d9-a197d017363d/volumes" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.826396 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/0af4ea6f-e824-4b5a-925c-699d0d342d5b-run-httpd\") pod \"ceilometer-0\" (UID: \"0af4ea6f-e824-4b5a-925c-699d0d342d5b\") " pod="openstack/ceilometer-0" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.826475 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0af4ea6f-e824-4b5a-925c-699d0d342d5b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0af4ea6f-e824-4b5a-925c-699d0d342d5b\") " pod="openstack/ceilometer-0" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.826505 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0af4ea6f-e824-4b5a-925c-699d0d342d5b-log-httpd\") pod \"ceilometer-0\" (UID: \"0af4ea6f-e824-4b5a-925c-699d0d342d5b\") " pod="openstack/ceilometer-0" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.826562 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0af4ea6f-e824-4b5a-925c-699d0d342d5b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0af4ea6f-e824-4b5a-925c-699d0d342d5b\") " pod="openstack/ceilometer-0" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.826601 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gh9m\" (UniqueName: \"kubernetes.io/projected/0af4ea6f-e824-4b5a-925c-699d0d342d5b-kube-api-access-4gh9m\") pod \"ceilometer-0\" (UID: \"0af4ea6f-e824-4b5a-925c-699d0d342d5b\") " pod="openstack/ceilometer-0" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.826630 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0af4ea6f-e824-4b5a-925c-699d0d342d5b-config-data\") pod \"ceilometer-0\" (UID: \"0af4ea6f-e824-4b5a-925c-699d0d342d5b\") " pod="openstack/ceilometer-0" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.826651 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0af4ea6f-e824-4b5a-925c-699d0d342d5b-scripts\") pod \"ceilometer-0\" (UID: \"0af4ea6f-e824-4b5a-925c-699d0d342d5b\") " pod="openstack/ceilometer-0" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.826997 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0af4ea6f-e824-4b5a-925c-699d0d342d5b-run-httpd\") pod \"ceilometer-0\" (UID: \"0af4ea6f-e824-4b5a-925c-699d0d342d5b\") " pod="openstack/ceilometer-0" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.827288 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0af4ea6f-e824-4b5a-925c-699d0d342d5b-log-httpd\") pod \"ceilometer-0\" (UID: \"0af4ea6f-e824-4b5a-925c-699d0d342d5b\") " pod="openstack/ceilometer-0" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.831019 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0af4ea6f-e824-4b5a-925c-699d0d342d5b-scripts\") pod \"ceilometer-0\" (UID: \"0af4ea6f-e824-4b5a-925c-699d0d342d5b\") " pod="openstack/ceilometer-0" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.831870 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/0af4ea6f-e824-4b5a-925c-699d0d342d5b-config-data\") pod \"ceilometer-0\" (UID: \"0af4ea6f-e824-4b5a-925c-699d0d342d5b\") " pod="openstack/ceilometer-0" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.832020 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0af4ea6f-e824-4b5a-925c-699d0d342d5b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0af4ea6f-e824-4b5a-925c-699d0d342d5b\") " pod="openstack/ceilometer-0" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.832637 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0af4ea6f-e824-4b5a-925c-699d0d342d5b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0af4ea6f-e824-4b5a-925c-699d0d342d5b\") " pod="openstack/ceilometer-0" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.844766 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gh9m\" (UniqueName: \"kubernetes.io/projected/0af4ea6f-e824-4b5a-925c-699d0d342d5b-kube-api-access-4gh9m\") pod \"ceilometer-0\" (UID: \"0af4ea6f-e824-4b5a-925c-699d0d342d5b\") " pod="openstack/ceilometer-0" Jan 22 14:06:07 crc kubenswrapper[4743]: I0122 14:06:07.997709 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 22 14:06:08 crc kubenswrapper[4743]: W0122 14:06:08.439643 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0af4ea6f_e824_4b5a_925c_699d0d342d5b.slice/crio-eb6dbb54f0f1365355e5259c86caa77e1749322f54e0c6fd41f4a1ce565fabdf WatchSource:0}: Error finding container eb6dbb54f0f1365355e5259c86caa77e1749322f54e0c6fd41f4a1ce565fabdf: Status 404 returned error can't find the container with id eb6dbb54f0f1365355e5259c86caa77e1749322f54e0c6fd41f4a1ce565fabdf Jan 22 14:06:08 crc kubenswrapper[4743]: I0122 14:06:08.439687 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 22 14:06:09 crc kubenswrapper[4743]: I0122 14:06:09.273481 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0af4ea6f-e824-4b5a-925c-699d0d342d5b","Type":"ContainerStarted","Data":"8fbb5299db2bcf722b02a3d31129df8c08d7c2c8ede399481905d93f9e3f69dc"} Jan 22 14:06:09 crc kubenswrapper[4743]: I0122 14:06:09.273836 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0af4ea6f-e824-4b5a-925c-699d0d342d5b","Type":"ContainerStarted","Data":"eb6dbb54f0f1365355e5259c86caa77e1749322f54e0c6fd41f4a1ce565fabdf"} Jan 22 14:06:10 crc kubenswrapper[4743]: I0122 14:06:10.286860 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0af4ea6f-e824-4b5a-925c-699d0d342d5b","Type":"ContainerStarted","Data":"2532b8c4498f157815ce844d739c6429f24622ee8533293b97bfb332b84a6f51"} Jan 22 14:06:10 crc kubenswrapper[4743]: I0122 14:06:10.289528 4743 generic.go:334] "Generic (PLEG): container finished" podID="ce464499-6235-4e9e-b2ef-02dcc568f613" containerID="7a7f8a69879f301768b68f718413117c84dd5bc97ec0ad8caa39ec95a947b452" exitCode=0 Jan 22 14:06:10 crc kubenswrapper[4743]: I0122 14:06:10.289555 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-hn577" 
event={"ID":"ce464499-6235-4e9e-b2ef-02dcc568f613","Type":"ContainerDied","Data":"7a7f8a69879f301768b68f718413117c84dd5bc97ec0ad8caa39ec95a947b452"} Jan 22 14:06:11 crc kubenswrapper[4743]: I0122 14:06:11.303991 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0af4ea6f-e824-4b5a-925c-699d0d342d5b","Type":"ContainerStarted","Data":"64e8eed1a4fa3ea055afc93b12f350be61e7bc8bc6755eab54e90b3fa299429f"} Jan 22 14:06:11 crc kubenswrapper[4743]: I0122 14:06:11.580960 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-hn577" Jan 22 14:06:11 crc kubenswrapper[4743]: I0122 14:06:11.700817 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce464499-6235-4e9e-b2ef-02dcc568f613-config-data\") pod \"ce464499-6235-4e9e-b2ef-02dcc568f613\" (UID: \"ce464499-6235-4e9e-b2ef-02dcc568f613\") " Jan 22 14:06:11 crc kubenswrapper[4743]: I0122 14:06:11.700866 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce464499-6235-4e9e-b2ef-02dcc568f613-combined-ca-bundle\") pod \"ce464499-6235-4e9e-b2ef-02dcc568f613\" (UID: \"ce464499-6235-4e9e-b2ef-02dcc568f613\") " Jan 22 14:06:11 crc kubenswrapper[4743]: I0122 14:06:11.700975 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbtmz\" (UniqueName: \"kubernetes.io/projected/ce464499-6235-4e9e-b2ef-02dcc568f613-kube-api-access-tbtmz\") pod \"ce464499-6235-4e9e-b2ef-02dcc568f613\" (UID: \"ce464499-6235-4e9e-b2ef-02dcc568f613\") " Jan 22 14:06:11 crc kubenswrapper[4743]: I0122 14:06:11.701158 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce464499-6235-4e9e-b2ef-02dcc568f613-scripts\") pod \"ce464499-6235-4e9e-b2ef-02dcc568f613\" (UID: \"ce464499-6235-4e9e-b2ef-02dcc568f613\") " Jan 22 14:06:11 crc kubenswrapper[4743]: I0122 14:06:11.705987 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce464499-6235-4e9e-b2ef-02dcc568f613-scripts" (OuterVolumeSpecName: "scripts") pod "ce464499-6235-4e9e-b2ef-02dcc568f613" (UID: "ce464499-6235-4e9e-b2ef-02dcc568f613"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:06:11 crc kubenswrapper[4743]: I0122 14:06:11.706914 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce464499-6235-4e9e-b2ef-02dcc568f613-kube-api-access-tbtmz" (OuterVolumeSpecName: "kube-api-access-tbtmz") pod "ce464499-6235-4e9e-b2ef-02dcc568f613" (UID: "ce464499-6235-4e9e-b2ef-02dcc568f613"). InnerVolumeSpecName "kube-api-access-tbtmz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:06:11 crc kubenswrapper[4743]: I0122 14:06:11.729369 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce464499-6235-4e9e-b2ef-02dcc568f613-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ce464499-6235-4e9e-b2ef-02dcc568f613" (UID: "ce464499-6235-4e9e-b2ef-02dcc568f613"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:06:11 crc kubenswrapper[4743]: I0122 14:06:11.734757 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce464499-6235-4e9e-b2ef-02dcc568f613-config-data" (OuterVolumeSpecName: "config-data") pod "ce464499-6235-4e9e-b2ef-02dcc568f613" (UID: "ce464499-6235-4e9e-b2ef-02dcc568f613"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:06:11 crc kubenswrapper[4743]: I0122 14:06:11.803348 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce464499-6235-4e9e-b2ef-02dcc568f613-scripts\") on node \"crc\" DevicePath \"\"" Jan 22 14:06:11 crc kubenswrapper[4743]: I0122 14:06:11.803373 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce464499-6235-4e9e-b2ef-02dcc568f613-config-data\") on node \"crc\" DevicePath \"\"" Jan 22 14:06:11 crc kubenswrapper[4743]: I0122 14:06:11.803382 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce464499-6235-4e9e-b2ef-02dcc568f613-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 14:06:11 crc kubenswrapper[4743]: I0122 14:06:11.803392 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tbtmz\" (UniqueName: \"kubernetes.io/projected/ce464499-6235-4e9e-b2ef-02dcc568f613-kube-api-access-tbtmz\") on node \"crc\" DevicePath \"\"" Jan 22 14:06:12 crc kubenswrapper[4743]: I0122 14:06:12.315618 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-hn577" event={"ID":"ce464499-6235-4e9e-b2ef-02dcc568f613","Type":"ContainerDied","Data":"baa3adfb93565005767f2721d758f7753ad54af24c3a66d3082cf2db42a56260"} Jan 22 14:06:12 crc kubenswrapper[4743]: I0122 14:06:12.315674 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="baa3adfb93565005767f2721d758f7753ad54af24c3a66d3082cf2db42a56260" Jan 22 14:06:12 crc kubenswrapper[4743]: I0122 14:06:12.315749 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-hn577" Jan 22 14:06:12 crc kubenswrapper[4743]: I0122 14:06:12.433663 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 22 14:06:12 crc kubenswrapper[4743]: E0122 14:06:12.434271 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce464499-6235-4e9e-b2ef-02dcc568f613" containerName="nova-cell0-conductor-db-sync" Jan 22 14:06:12 crc kubenswrapper[4743]: I0122 14:06:12.434366 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce464499-6235-4e9e-b2ef-02dcc568f613" containerName="nova-cell0-conductor-db-sync" Jan 22 14:06:12 crc kubenswrapper[4743]: I0122 14:06:12.434568 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce464499-6235-4e9e-b2ef-02dcc568f613" containerName="nova-cell0-conductor-db-sync" Jan 22 14:06:12 crc kubenswrapper[4743]: I0122 14:06:12.435164 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 22 14:06:12 crc kubenswrapper[4743]: I0122 14:06:12.438471 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-nbvpm" Jan 22 14:06:12 crc kubenswrapper[4743]: I0122 14:06:12.438731 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Jan 22 14:06:12 crc kubenswrapper[4743]: I0122 14:06:12.447370 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 22 14:06:12 crc kubenswrapper[4743]: I0122 14:06:12.517636 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l68vw\" (UniqueName: \"kubernetes.io/projected/188bdbf9-2ed2-427b-99c1-7c435a25a3c6-kube-api-access-l68vw\") pod \"nova-cell0-conductor-0\" (UID: \"188bdbf9-2ed2-427b-99c1-7c435a25a3c6\") " pod="openstack/nova-cell0-conductor-0" Jan 22 14:06:12 crc kubenswrapper[4743]: I0122 14:06:12.518187 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/188bdbf9-2ed2-427b-99c1-7c435a25a3c6-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"188bdbf9-2ed2-427b-99c1-7c435a25a3c6\") " pod="openstack/nova-cell0-conductor-0" Jan 22 14:06:12 crc kubenswrapper[4743]: I0122 14:06:12.518223 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/188bdbf9-2ed2-427b-99c1-7c435a25a3c6-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"188bdbf9-2ed2-427b-99c1-7c435a25a3c6\") " pod="openstack/nova-cell0-conductor-0" Jan 22 14:06:12 crc kubenswrapper[4743]: I0122 14:06:12.619447 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/188bdbf9-2ed2-427b-99c1-7c435a25a3c6-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"188bdbf9-2ed2-427b-99c1-7c435a25a3c6\") " pod="openstack/nova-cell0-conductor-0" Jan 22 14:06:12 crc kubenswrapper[4743]: I0122 14:06:12.619723 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/188bdbf9-2ed2-427b-99c1-7c435a25a3c6-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"188bdbf9-2ed2-427b-99c1-7c435a25a3c6\") " pod="openstack/nova-cell0-conductor-0" Jan 22 14:06:12 crc kubenswrapper[4743]: I0122 14:06:12.619907 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l68vw\" (UniqueName: \"kubernetes.io/projected/188bdbf9-2ed2-427b-99c1-7c435a25a3c6-kube-api-access-l68vw\") pod \"nova-cell0-conductor-0\" (UID: \"188bdbf9-2ed2-427b-99c1-7c435a25a3c6\") " pod="openstack/nova-cell0-conductor-0" Jan 22 14:06:12 crc kubenswrapper[4743]: I0122 14:06:12.624155 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/188bdbf9-2ed2-427b-99c1-7c435a25a3c6-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"188bdbf9-2ed2-427b-99c1-7c435a25a3c6\") " pod="openstack/nova-cell0-conductor-0" Jan 22 14:06:12 crc kubenswrapper[4743]: I0122 14:06:12.633755 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/188bdbf9-2ed2-427b-99c1-7c435a25a3c6-config-data\") pod \"nova-cell0-conductor-0\" 
(UID: \"188bdbf9-2ed2-427b-99c1-7c435a25a3c6\") " pod="openstack/nova-cell0-conductor-0" Jan 22 14:06:12 crc kubenswrapper[4743]: I0122 14:06:12.639736 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l68vw\" (UniqueName: \"kubernetes.io/projected/188bdbf9-2ed2-427b-99c1-7c435a25a3c6-kube-api-access-l68vw\") pod \"nova-cell0-conductor-0\" (UID: \"188bdbf9-2ed2-427b-99c1-7c435a25a3c6\") " pod="openstack/nova-cell0-conductor-0" Jan 22 14:06:12 crc kubenswrapper[4743]: I0122 14:06:12.773566 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 22 14:06:13 crc kubenswrapper[4743]: W0122 14:06:13.211993 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod188bdbf9_2ed2_427b_99c1_7c435a25a3c6.slice/crio-b53a3cac8bd241a193ad6678b6dd689734780b659facb2237f748d4e68737ee6 WatchSource:0}: Error finding container b53a3cac8bd241a193ad6678b6dd689734780b659facb2237f748d4e68737ee6: Status 404 returned error can't find the container with id b53a3cac8bd241a193ad6678b6dd689734780b659facb2237f748d4e68737ee6 Jan 22 14:06:13 crc kubenswrapper[4743]: I0122 14:06:13.214611 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 22 14:06:13 crc kubenswrapper[4743]: I0122 14:06:13.330479 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"188bdbf9-2ed2-427b-99c1-7c435a25a3c6","Type":"ContainerStarted","Data":"b53a3cac8bd241a193ad6678b6dd689734780b659facb2237f748d4e68737ee6"} Jan 22 14:06:15 crc kubenswrapper[4743]: I0122 14:06:15.347163 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0af4ea6f-e824-4b5a-925c-699d0d342d5b","Type":"ContainerStarted","Data":"f846f6de528d60f980ec7c1de86c17d87a4fbf947c2258b0014fe2b82465d57c"} Jan 22 14:06:15 crc kubenswrapper[4743]: I0122 14:06:15.349115 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"188bdbf9-2ed2-427b-99c1-7c435a25a3c6","Type":"ContainerStarted","Data":"69bc0417137466bc6be4c400434072d6b0fbf034f9c0d48bc3391bb71577e065"} Jan 22 14:06:15 crc kubenswrapper[4743]: I0122 14:06:15.349318 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Jan 22 14:06:15 crc kubenswrapper[4743]: I0122 14:06:15.366687 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.317222509 podStartE2EDuration="8.366659834s" podCreationTimestamp="2026-01-22 14:06:07 +0000 UTC" firstStartedPulling="2026-01-22 14:06:08.442085425 +0000 UTC m=+1204.997128588" lastFinishedPulling="2026-01-22 14:06:14.49152275 +0000 UTC m=+1211.046565913" observedRunningTime="2026-01-22 14:06:15.363828171 +0000 UTC m=+1211.918871354" watchObservedRunningTime="2026-01-22 14:06:15.366659834 +0000 UTC m=+1211.921703037" Jan 22 14:06:15 crc kubenswrapper[4743]: I0122 14:06:15.389194 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=3.389175084 podStartE2EDuration="3.389175084s" podCreationTimestamp="2026-01-22 14:06:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:06:15.38241784 +0000 UTC m=+1211.937461023" 
watchObservedRunningTime="2026-01-22 14:06:15.389175084 +0000 UTC m=+1211.944218267" Jan 22 14:06:16 crc kubenswrapper[4743]: I0122 14:06:16.359063 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 22 14:06:22 crc kubenswrapper[4743]: I0122 14:06:22.808576 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Jan 22 14:06:23 crc kubenswrapper[4743]: I0122 14:06:23.345074 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-9bfnc"] Jan 22 14:06:23 crc kubenswrapper[4743]: I0122 14:06:23.346685 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-9bfnc" Jan 22 14:06:23 crc kubenswrapper[4743]: I0122 14:06:23.350298 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Jan 22 14:06:23 crc kubenswrapper[4743]: I0122 14:06:23.350324 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Jan 22 14:06:23 crc kubenswrapper[4743]: I0122 14:06:23.365990 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-9bfnc"] Jan 22 14:06:23 crc kubenswrapper[4743]: I0122 14:06:23.429236 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68eb078f-0a0b-4463-98e7-fb2dc396ca6f-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-9bfnc\" (UID: \"68eb078f-0a0b-4463-98e7-fb2dc396ca6f\") " pod="openstack/nova-cell0-cell-mapping-9bfnc" Jan 22 14:06:23 crc kubenswrapper[4743]: I0122 14:06:23.429287 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vh2kh\" (UniqueName: \"kubernetes.io/projected/68eb078f-0a0b-4463-98e7-fb2dc396ca6f-kube-api-access-vh2kh\") pod \"nova-cell0-cell-mapping-9bfnc\" (UID: \"68eb078f-0a0b-4463-98e7-fb2dc396ca6f\") " pod="openstack/nova-cell0-cell-mapping-9bfnc" Jan 22 14:06:23 crc kubenswrapper[4743]: I0122 14:06:23.429420 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68eb078f-0a0b-4463-98e7-fb2dc396ca6f-config-data\") pod \"nova-cell0-cell-mapping-9bfnc\" (UID: \"68eb078f-0a0b-4463-98e7-fb2dc396ca6f\") " pod="openstack/nova-cell0-cell-mapping-9bfnc" Jan 22 14:06:23 crc kubenswrapper[4743]: I0122 14:06:23.429465 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68eb078f-0a0b-4463-98e7-fb2dc396ca6f-scripts\") pod \"nova-cell0-cell-mapping-9bfnc\" (UID: \"68eb078f-0a0b-4463-98e7-fb2dc396ca6f\") " pod="openstack/nova-cell0-cell-mapping-9bfnc" Jan 22 14:06:23 crc kubenswrapper[4743]: I0122 14:06:23.526060 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 22 14:06:23 crc kubenswrapper[4743]: I0122 14:06:23.527547 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 22 14:06:23 crc kubenswrapper[4743]: I0122 14:06:23.535294 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68eb078f-0a0b-4463-98e7-fb2dc396ca6f-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-9bfnc\" (UID: \"68eb078f-0a0b-4463-98e7-fb2dc396ca6f\") " pod="openstack/nova-cell0-cell-mapping-9bfnc" Jan 22 14:06:23 crc kubenswrapper[4743]: I0122 14:06:23.535340 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vh2kh\" (UniqueName: \"kubernetes.io/projected/68eb078f-0a0b-4463-98e7-fb2dc396ca6f-kube-api-access-vh2kh\") pod \"nova-cell0-cell-mapping-9bfnc\" (UID: \"68eb078f-0a0b-4463-98e7-fb2dc396ca6f\") " pod="openstack/nova-cell0-cell-mapping-9bfnc" Jan 22 14:06:23 crc kubenswrapper[4743]: I0122 14:06:23.535388 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68eb078f-0a0b-4463-98e7-fb2dc396ca6f-config-data\") pod \"nova-cell0-cell-mapping-9bfnc\" (UID: \"68eb078f-0a0b-4463-98e7-fb2dc396ca6f\") " pod="openstack/nova-cell0-cell-mapping-9bfnc" Jan 22 14:06:23 crc kubenswrapper[4743]: I0122 14:06:23.535407 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68eb078f-0a0b-4463-98e7-fb2dc396ca6f-scripts\") pod \"nova-cell0-cell-mapping-9bfnc\" (UID: \"68eb078f-0a0b-4463-98e7-fb2dc396ca6f\") " pod="openstack/nova-cell0-cell-mapping-9bfnc" Jan 22 14:06:23 crc kubenswrapper[4743]: I0122 14:06:23.543767 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68eb078f-0a0b-4463-98e7-fb2dc396ca6f-scripts\") pod \"nova-cell0-cell-mapping-9bfnc\" (UID: \"68eb078f-0a0b-4463-98e7-fb2dc396ca6f\") " pod="openstack/nova-cell0-cell-mapping-9bfnc" Jan 22 14:06:23 crc kubenswrapper[4743]: I0122 14:06:23.564671 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68eb078f-0a0b-4463-98e7-fb2dc396ca6f-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-9bfnc\" (UID: \"68eb078f-0a0b-4463-98e7-fb2dc396ca6f\") " pod="openstack/nova-cell0-cell-mapping-9bfnc" Jan 22 14:06:23 crc kubenswrapper[4743]: I0122 14:06:23.565375 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 22 14:06:23 crc kubenswrapper[4743]: I0122 14:06:23.565518 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68eb078f-0a0b-4463-98e7-fb2dc396ca6f-config-data\") pod \"nova-cell0-cell-mapping-9bfnc\" (UID: \"68eb078f-0a0b-4463-98e7-fb2dc396ca6f\") " pod="openstack/nova-cell0-cell-mapping-9bfnc" Jan 22 14:06:23 crc kubenswrapper[4743]: I0122 14:06:23.574951 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 22 14:06:23 crc kubenswrapper[4743]: I0122 14:06:23.577359 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vh2kh\" (UniqueName: \"kubernetes.io/projected/68eb078f-0a0b-4463-98e7-fb2dc396ca6f-kube-api-access-vh2kh\") pod \"nova-cell0-cell-mapping-9bfnc\" (UID: \"68eb078f-0a0b-4463-98e7-fb2dc396ca6f\") " pod="openstack/nova-cell0-cell-mapping-9bfnc" Jan 22 14:06:23 crc kubenswrapper[4743]: I0122 14:06:23.637909 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c29pq\" (UniqueName: \"kubernetes.io/projected/1e82d42e-0c6d-4ff4-a53e-171f14a28c90-kube-api-access-c29pq\") pod \"nova-api-0\" (UID: \"1e82d42e-0c6d-4ff4-a53e-171f14a28c90\") " pod="openstack/nova-api-0" Jan 22 14:06:23 crc kubenswrapper[4743]: I0122 14:06:23.637977 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e82d42e-0c6d-4ff4-a53e-171f14a28c90-logs\") pod \"nova-api-0\" (UID: \"1e82d42e-0c6d-4ff4-a53e-171f14a28c90\") " pod="openstack/nova-api-0" Jan 22 14:06:23 crc kubenswrapper[4743]: I0122 14:06:23.638008 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e82d42e-0c6d-4ff4-a53e-171f14a28c90-config-data\") pod \"nova-api-0\" (UID: \"1e82d42e-0c6d-4ff4-a53e-171f14a28c90\") " pod="openstack/nova-api-0" Jan 22 14:06:23 crc kubenswrapper[4743]: I0122 14:06:23.638098 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e82d42e-0c6d-4ff4-a53e-171f14a28c90-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1e82d42e-0c6d-4ff4-a53e-171f14a28c90\") " pod="openstack/nova-api-0" Jan 22 14:06:23 crc kubenswrapper[4743]: I0122 14:06:23.665287 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 22 14:06:23 crc kubenswrapper[4743]: I0122 14:06:23.672501 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 22 14:06:23 crc kubenswrapper[4743]: I0122 14:06:23.673178 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-9bfnc" Jan 22 14:06:23 crc kubenswrapper[4743]: I0122 14:06:23.720309 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 22 14:06:23 crc kubenswrapper[4743]: I0122 14:06:23.740681 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 22 14:06:23 crc kubenswrapper[4743]: I0122 14:06:23.741280 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e82d42e-0c6d-4ff4-a53e-171f14a28c90-logs\") pod \"nova-api-0\" (UID: \"1e82d42e-0c6d-4ff4-a53e-171f14a28c90\") " pod="openstack/nova-api-0" Jan 22 14:06:23 crc kubenswrapper[4743]: I0122 14:06:23.741338 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e82d42e-0c6d-4ff4-a53e-171f14a28c90-config-data\") pod \"nova-api-0\" (UID: \"1e82d42e-0c6d-4ff4-a53e-171f14a28c90\") " pod="openstack/nova-api-0" Jan 22 14:06:23 crc kubenswrapper[4743]: I0122 14:06:23.741394 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f16088c1-ebed-477f-a0d9-8499a083b248-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f16088c1-ebed-477f-a0d9-8499a083b248\") " pod="openstack/nova-metadata-0" Jan 22 14:06:23 crc kubenswrapper[4743]: I0122 14:06:23.741428 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f16088c1-ebed-477f-a0d9-8499a083b248-logs\") pod \"nova-metadata-0\" (UID: \"f16088c1-ebed-477f-a0d9-8499a083b248\") " pod="openstack/nova-metadata-0" Jan 22 14:06:23 crc kubenswrapper[4743]: I0122 14:06:23.741479 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e82d42e-0c6d-4ff4-a53e-171f14a28c90-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1e82d42e-0c6d-4ff4-a53e-171f14a28c90\") " pod="openstack/nova-api-0" Jan 22 14:06:23 crc kubenswrapper[4743]: I0122 14:06:23.741535 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkwhq\" (UniqueName: \"kubernetes.io/projected/f16088c1-ebed-477f-a0d9-8499a083b248-kube-api-access-xkwhq\") pod \"nova-metadata-0\" (UID: \"f16088c1-ebed-477f-a0d9-8499a083b248\") " pod="openstack/nova-metadata-0" Jan 22 14:06:23 crc kubenswrapper[4743]: I0122 14:06:23.741556 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f16088c1-ebed-477f-a0d9-8499a083b248-config-data\") pod \"nova-metadata-0\" (UID: \"f16088c1-ebed-477f-a0d9-8499a083b248\") " pod="openstack/nova-metadata-0" Jan 22 14:06:23 crc kubenswrapper[4743]: I0122 14:06:23.741584 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c29pq\" (UniqueName: \"kubernetes.io/projected/1e82d42e-0c6d-4ff4-a53e-171f14a28c90-kube-api-access-c29pq\") pod \"nova-api-0\" (UID: \"1e82d42e-0c6d-4ff4-a53e-171f14a28c90\") " pod="openstack/nova-api-0" Jan 22 14:06:23 crc kubenswrapper[4743]: I0122 14:06:23.742305 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e82d42e-0c6d-4ff4-a53e-171f14a28c90-logs\") pod 
\"nova-api-0\" (UID: \"1e82d42e-0c6d-4ff4-a53e-171f14a28c90\") " pod="openstack/nova-api-0" Jan 22 14:06:23 crc kubenswrapper[4743]: I0122 14:06:23.751905 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e82d42e-0c6d-4ff4-a53e-171f14a28c90-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1e82d42e-0c6d-4ff4-a53e-171f14a28c90\") " pod="openstack/nova-api-0" Jan 22 14:06:23 crc kubenswrapper[4743]: I0122 14:06:23.766551 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e82d42e-0c6d-4ff4-a53e-171f14a28c90-config-data\") pod \"nova-api-0\" (UID: \"1e82d42e-0c6d-4ff4-a53e-171f14a28c90\") " pod="openstack/nova-api-0" Jan 22 14:06:23 crc kubenswrapper[4743]: I0122 14:06:23.842899 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkwhq\" (UniqueName: \"kubernetes.io/projected/f16088c1-ebed-477f-a0d9-8499a083b248-kube-api-access-xkwhq\") pod \"nova-metadata-0\" (UID: \"f16088c1-ebed-477f-a0d9-8499a083b248\") " pod="openstack/nova-metadata-0" Jan 22 14:06:23 crc kubenswrapper[4743]: I0122 14:06:23.842943 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f16088c1-ebed-477f-a0d9-8499a083b248-config-data\") pod \"nova-metadata-0\" (UID: \"f16088c1-ebed-477f-a0d9-8499a083b248\") " pod="openstack/nova-metadata-0" Jan 22 14:06:23 crc kubenswrapper[4743]: I0122 14:06:23.843014 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f16088c1-ebed-477f-a0d9-8499a083b248-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f16088c1-ebed-477f-a0d9-8499a083b248\") " pod="openstack/nova-metadata-0" Jan 22 14:06:23 crc kubenswrapper[4743]: I0122 14:06:23.843039 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f16088c1-ebed-477f-a0d9-8499a083b248-logs\") pod \"nova-metadata-0\" (UID: \"f16088c1-ebed-477f-a0d9-8499a083b248\") " pod="openstack/nova-metadata-0" Jan 22 14:06:23 crc kubenswrapper[4743]: I0122 14:06:23.844332 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c29pq\" (UniqueName: \"kubernetes.io/projected/1e82d42e-0c6d-4ff4-a53e-171f14a28c90-kube-api-access-c29pq\") pod \"nova-api-0\" (UID: \"1e82d42e-0c6d-4ff4-a53e-171f14a28c90\") " pod="openstack/nova-api-0" Jan 22 14:06:23 crc kubenswrapper[4743]: I0122 14:06:23.846405 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f16088c1-ebed-477f-a0d9-8499a083b248-logs\") pod \"nova-metadata-0\" (UID: \"f16088c1-ebed-477f-a0d9-8499a083b248\") " pod="openstack/nova-metadata-0" Jan 22 14:06:23 crc kubenswrapper[4743]: I0122 14:06:23.848422 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f16088c1-ebed-477f-a0d9-8499a083b248-config-data\") pod \"nova-metadata-0\" (UID: \"f16088c1-ebed-477f-a0d9-8499a083b248\") " pod="openstack/nova-metadata-0" Jan 22 14:06:23 crc kubenswrapper[4743]: I0122 14:06:23.849474 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f16088c1-ebed-477f-a0d9-8499a083b248-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"f16088c1-ebed-477f-a0d9-8499a083b248\") " pod="openstack/nova-metadata-0" Jan 22 14:06:23 crc kubenswrapper[4743]: I0122 14:06:23.849552 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 22 14:06:23 crc kubenswrapper[4743]: I0122 14:06:23.851033 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 22 14:06:23 crc kubenswrapper[4743]: I0122 14:06:23.872437 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 22 14:06:23 crc kubenswrapper[4743]: I0122 14:06:23.897755 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 22 14:06:23 crc kubenswrapper[4743]: I0122 14:06:23.905806 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkwhq\" (UniqueName: \"kubernetes.io/projected/f16088c1-ebed-477f-a0d9-8499a083b248-kube-api-access-xkwhq\") pod \"nova-metadata-0\" (UID: \"f16088c1-ebed-477f-a0d9-8499a083b248\") " pod="openstack/nova-metadata-0" Jan 22 14:06:23 crc kubenswrapper[4743]: I0122 14:06:23.949840 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bd751a6-2e9a-4ea9-863c-2d629f910470-config-data\") pod \"nova-scheduler-0\" (UID: \"1bd751a6-2e9a-4ea9-863c-2d629f910470\") " pod="openstack/nova-scheduler-0" Jan 22 14:06:23 crc kubenswrapper[4743]: I0122 14:06:23.949897 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bd751a6-2e9a-4ea9-863c-2d629f910470-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1bd751a6-2e9a-4ea9-863c-2d629f910470\") " pod="openstack/nova-scheduler-0" Jan 22 14:06:23 crc kubenswrapper[4743]: I0122 14:06:23.949924 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kpqj\" (UniqueName: \"kubernetes.io/projected/1bd751a6-2e9a-4ea9-863c-2d629f910470-kube-api-access-5kpqj\") pod \"nova-scheduler-0\" (UID: \"1bd751a6-2e9a-4ea9-863c-2d629f910470\") " pod="openstack/nova-scheduler-0" Jan 22 14:06:23 crc kubenswrapper[4743]: I0122 14:06:23.954210 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 22 14:06:24 crc kubenswrapper[4743]: I0122 14:06:24.001275 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 22 14:06:24 crc kubenswrapper[4743]: I0122 14:06:24.006835 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 22 14:06:24 crc kubenswrapper[4743]: I0122 14:06:24.007937 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 22 14:06:24 crc kubenswrapper[4743]: I0122 14:06:24.012321 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Jan 22 14:06:24 crc kubenswrapper[4743]: I0122 14:06:24.060449 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84f089c9-1a96-40ce-879d-4220b824f089-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"84f089c9-1a96-40ce-879d-4220b824f089\") " pod="openstack/nova-cell1-novncproxy-0" Jan 22 14:06:24 crc kubenswrapper[4743]: I0122 14:06:24.060572 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bd751a6-2e9a-4ea9-863c-2d629f910470-config-data\") pod \"nova-scheduler-0\" (UID: \"1bd751a6-2e9a-4ea9-863c-2d629f910470\") " pod="openstack/nova-scheduler-0" Jan 22 14:06:24 crc kubenswrapper[4743]: I0122 14:06:24.060623 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bd751a6-2e9a-4ea9-863c-2d629f910470-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1bd751a6-2e9a-4ea9-863c-2d629f910470\") " pod="openstack/nova-scheduler-0" Jan 22 14:06:24 crc kubenswrapper[4743]: I0122 14:06:24.060670 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kpqj\" (UniqueName: \"kubernetes.io/projected/1bd751a6-2e9a-4ea9-863c-2d629f910470-kube-api-access-5kpqj\") pod \"nova-scheduler-0\" (UID: \"1bd751a6-2e9a-4ea9-863c-2d629f910470\") " pod="openstack/nova-scheduler-0" Jan 22 14:06:24 crc kubenswrapper[4743]: I0122 14:06:24.060718 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgddd\" (UniqueName: \"kubernetes.io/projected/84f089c9-1a96-40ce-879d-4220b824f089-kube-api-access-lgddd\") pod \"nova-cell1-novncproxy-0\" (UID: \"84f089c9-1a96-40ce-879d-4220b824f089\") " pod="openstack/nova-cell1-novncproxy-0" Jan 22 14:06:24 crc kubenswrapper[4743]: I0122 14:06:24.060829 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84f089c9-1a96-40ce-879d-4220b824f089-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"84f089c9-1a96-40ce-879d-4220b824f089\") " pod="openstack/nova-cell1-novncproxy-0" Jan 22 14:06:24 crc kubenswrapper[4743]: I0122 14:06:24.074808 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 22 14:06:24 crc kubenswrapper[4743]: I0122 14:06:24.075316 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bd751a6-2e9a-4ea9-863c-2d629f910470-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1bd751a6-2e9a-4ea9-863c-2d629f910470\") " pod="openstack/nova-scheduler-0" Jan 22 14:06:24 crc kubenswrapper[4743]: I0122 14:06:24.087528 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bd751a6-2e9a-4ea9-863c-2d629f910470-config-data\") pod \"nova-scheduler-0\" (UID: \"1bd751a6-2e9a-4ea9-863c-2d629f910470\") " pod="openstack/nova-scheduler-0" Jan 22 14:06:24 crc kubenswrapper[4743]: I0122 14:06:24.087593 4743 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-757b4f8459-5b8wb"] Jan 22 14:06:24 crc kubenswrapper[4743]: I0122 14:06:24.089146 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-5b8wb" Jan 22 14:06:24 crc kubenswrapper[4743]: I0122 14:06:24.099500 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kpqj\" (UniqueName: \"kubernetes.io/projected/1bd751a6-2e9a-4ea9-863c-2d629f910470-kube-api-access-5kpqj\") pod \"nova-scheduler-0\" (UID: \"1bd751a6-2e9a-4ea9-863c-2d629f910470\") " pod="openstack/nova-scheduler-0" Jan 22 14:06:24 crc kubenswrapper[4743]: I0122 14:06:24.121351 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-5b8wb"] Jan 22 14:06:24 crc kubenswrapper[4743]: I0122 14:06:24.161941 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/768af7f0-e632-457f-bcb9-9069ae72ba02-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-5b8wb\" (UID: \"768af7f0-e632-457f-bcb9-9069ae72ba02\") " pod="openstack/dnsmasq-dns-757b4f8459-5b8wb" Jan 22 14:06:24 crc kubenswrapper[4743]: I0122 14:06:24.161991 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpjww\" (UniqueName: \"kubernetes.io/projected/768af7f0-e632-457f-bcb9-9069ae72ba02-kube-api-access-fpjww\") pod \"dnsmasq-dns-757b4f8459-5b8wb\" (UID: \"768af7f0-e632-457f-bcb9-9069ae72ba02\") " pod="openstack/dnsmasq-dns-757b4f8459-5b8wb" Jan 22 14:06:24 crc kubenswrapper[4743]: I0122 14:06:24.162020 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgddd\" (UniqueName: \"kubernetes.io/projected/84f089c9-1a96-40ce-879d-4220b824f089-kube-api-access-lgddd\") pod \"nova-cell1-novncproxy-0\" (UID: \"84f089c9-1a96-40ce-879d-4220b824f089\") " pod="openstack/nova-cell1-novncproxy-0" Jan 22 14:06:24 crc kubenswrapper[4743]: I0122 14:06:24.162069 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/768af7f0-e632-457f-bcb9-9069ae72ba02-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-5b8wb\" (UID: \"768af7f0-e632-457f-bcb9-9069ae72ba02\") " pod="openstack/dnsmasq-dns-757b4f8459-5b8wb" Jan 22 14:06:24 crc kubenswrapper[4743]: I0122 14:06:24.162089 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/768af7f0-e632-457f-bcb9-9069ae72ba02-dns-svc\") pod \"dnsmasq-dns-757b4f8459-5b8wb\" (UID: \"768af7f0-e632-457f-bcb9-9069ae72ba02\") " pod="openstack/dnsmasq-dns-757b4f8459-5b8wb" Jan 22 14:06:24 crc kubenswrapper[4743]: I0122 14:06:24.162113 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84f089c9-1a96-40ce-879d-4220b824f089-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"84f089c9-1a96-40ce-879d-4220b824f089\") " pod="openstack/nova-cell1-novncproxy-0" Jan 22 14:06:24 crc kubenswrapper[4743]: I0122 14:06:24.162128 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/768af7f0-e632-457f-bcb9-9069ae72ba02-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-5b8wb\" (UID: \"768af7f0-e632-457f-bcb9-9069ae72ba02\") " 
pod="openstack/dnsmasq-dns-757b4f8459-5b8wb" Jan 22 14:06:24 crc kubenswrapper[4743]: I0122 14:06:24.162198 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/768af7f0-e632-457f-bcb9-9069ae72ba02-config\") pod \"dnsmasq-dns-757b4f8459-5b8wb\" (UID: \"768af7f0-e632-457f-bcb9-9069ae72ba02\") " pod="openstack/dnsmasq-dns-757b4f8459-5b8wb" Jan 22 14:06:24 crc kubenswrapper[4743]: I0122 14:06:24.162226 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84f089c9-1a96-40ce-879d-4220b824f089-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"84f089c9-1a96-40ce-879d-4220b824f089\") " pod="openstack/nova-cell1-novncproxy-0" Jan 22 14:06:24 crc kubenswrapper[4743]: I0122 14:06:24.165395 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84f089c9-1a96-40ce-879d-4220b824f089-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"84f089c9-1a96-40ce-879d-4220b824f089\") " pod="openstack/nova-cell1-novncproxy-0" Jan 22 14:06:24 crc kubenswrapper[4743]: I0122 14:06:24.167624 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84f089c9-1a96-40ce-879d-4220b824f089-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"84f089c9-1a96-40ce-879d-4220b824f089\") " pod="openstack/nova-cell1-novncproxy-0" Jan 22 14:06:24 crc kubenswrapper[4743]: I0122 14:06:24.182816 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgddd\" (UniqueName: \"kubernetes.io/projected/84f089c9-1a96-40ce-879d-4220b824f089-kube-api-access-lgddd\") pod \"nova-cell1-novncproxy-0\" (UID: \"84f089c9-1a96-40ce-879d-4220b824f089\") " pod="openstack/nova-cell1-novncproxy-0" Jan 22 14:06:24 crc kubenswrapper[4743]: I0122 14:06:24.199220 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 22 14:06:24 crc kubenswrapper[4743]: I0122 14:06:24.265758 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/768af7f0-e632-457f-bcb9-9069ae72ba02-config\") pod \"dnsmasq-dns-757b4f8459-5b8wb\" (UID: \"768af7f0-e632-457f-bcb9-9069ae72ba02\") " pod="openstack/dnsmasq-dns-757b4f8459-5b8wb" Jan 22 14:06:24 crc kubenswrapper[4743]: I0122 14:06:24.265876 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/768af7f0-e632-457f-bcb9-9069ae72ba02-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-5b8wb\" (UID: \"768af7f0-e632-457f-bcb9-9069ae72ba02\") " pod="openstack/dnsmasq-dns-757b4f8459-5b8wb" Jan 22 14:06:24 crc kubenswrapper[4743]: I0122 14:06:24.265902 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpjww\" (UniqueName: \"kubernetes.io/projected/768af7f0-e632-457f-bcb9-9069ae72ba02-kube-api-access-fpjww\") pod \"dnsmasq-dns-757b4f8459-5b8wb\" (UID: \"768af7f0-e632-457f-bcb9-9069ae72ba02\") " pod="openstack/dnsmasq-dns-757b4f8459-5b8wb" Jan 22 14:06:24 crc kubenswrapper[4743]: I0122 14:06:24.265940 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/768af7f0-e632-457f-bcb9-9069ae72ba02-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-5b8wb\" (UID: \"768af7f0-e632-457f-bcb9-9069ae72ba02\") " pod="openstack/dnsmasq-dns-757b4f8459-5b8wb" Jan 22 14:06:24 crc kubenswrapper[4743]: I0122 14:06:24.265962 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/768af7f0-e632-457f-bcb9-9069ae72ba02-dns-svc\") pod \"dnsmasq-dns-757b4f8459-5b8wb\" (UID: \"768af7f0-e632-457f-bcb9-9069ae72ba02\") " pod="openstack/dnsmasq-dns-757b4f8459-5b8wb" Jan 22 14:06:24 crc kubenswrapper[4743]: I0122 14:06:24.265986 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/768af7f0-e632-457f-bcb9-9069ae72ba02-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-5b8wb\" (UID: \"768af7f0-e632-457f-bcb9-9069ae72ba02\") " pod="openstack/dnsmasq-dns-757b4f8459-5b8wb" Jan 22 14:06:24 crc kubenswrapper[4743]: I0122 14:06:24.266946 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/768af7f0-e632-457f-bcb9-9069ae72ba02-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-5b8wb\" (UID: \"768af7f0-e632-457f-bcb9-9069ae72ba02\") " pod="openstack/dnsmasq-dns-757b4f8459-5b8wb" Jan 22 14:06:24 crc kubenswrapper[4743]: I0122 14:06:24.267603 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/768af7f0-e632-457f-bcb9-9069ae72ba02-config\") pod \"dnsmasq-dns-757b4f8459-5b8wb\" (UID: \"768af7f0-e632-457f-bcb9-9069ae72ba02\") " pod="openstack/dnsmasq-dns-757b4f8459-5b8wb" Jan 22 14:06:24 crc kubenswrapper[4743]: I0122 14:06:24.268171 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/768af7f0-e632-457f-bcb9-9069ae72ba02-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-5b8wb\" (UID: \"768af7f0-e632-457f-bcb9-9069ae72ba02\") " pod="openstack/dnsmasq-dns-757b4f8459-5b8wb" Jan 22 14:06:24 crc kubenswrapper[4743]: 
I0122 14:06:24.269629 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/768af7f0-e632-457f-bcb9-9069ae72ba02-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-5b8wb\" (UID: \"768af7f0-e632-457f-bcb9-9069ae72ba02\") " pod="openstack/dnsmasq-dns-757b4f8459-5b8wb" Jan 22 14:06:24 crc kubenswrapper[4743]: I0122 14:06:24.270219 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/768af7f0-e632-457f-bcb9-9069ae72ba02-dns-svc\") pod \"dnsmasq-dns-757b4f8459-5b8wb\" (UID: \"768af7f0-e632-457f-bcb9-9069ae72ba02\") " pod="openstack/dnsmasq-dns-757b4f8459-5b8wb" Jan 22 14:06:24 crc kubenswrapper[4743]: I0122 14:06:24.289874 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpjww\" (UniqueName: \"kubernetes.io/projected/768af7f0-e632-457f-bcb9-9069ae72ba02-kube-api-access-fpjww\") pod \"dnsmasq-dns-757b4f8459-5b8wb\" (UID: \"768af7f0-e632-457f-bcb9-9069ae72ba02\") " pod="openstack/dnsmasq-dns-757b4f8459-5b8wb" Jan 22 14:06:24 crc kubenswrapper[4743]: I0122 14:06:24.400577 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 22 14:06:24 crc kubenswrapper[4743]: I0122 14:06:24.426052 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-5b8wb" Jan 22 14:06:24 crc kubenswrapper[4743]: I0122 14:06:24.502762 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-9bfnc"] Jan 22 14:06:24 crc kubenswrapper[4743]: I0122 14:06:24.674545 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 22 14:06:24 crc kubenswrapper[4743]: I0122 14:06:24.695507 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 22 14:06:24 crc kubenswrapper[4743]: I0122 14:06:24.730167 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7nqwj"] Jan 22 14:06:24 crc kubenswrapper[4743]: I0122 14:06:24.731343 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7nqwj" Jan 22 14:06:24 crc kubenswrapper[4743]: I0122 14:06:24.734579 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Jan 22 14:06:24 crc kubenswrapper[4743]: I0122 14:06:24.734669 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Jan 22 14:06:24 crc kubenswrapper[4743]: I0122 14:06:24.767325 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7nqwj"] Jan 22 14:06:24 crc kubenswrapper[4743]: I0122 14:06:24.796365 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef9b8754-ae09-4ea6-ba23-88227365b34b-config-data\") pod \"nova-cell1-conductor-db-sync-7nqwj\" (UID: \"ef9b8754-ae09-4ea6-ba23-88227365b34b\") " pod="openstack/nova-cell1-conductor-db-sync-7nqwj" Jan 22 14:06:24 crc kubenswrapper[4743]: I0122 14:06:24.796422 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef9b8754-ae09-4ea6-ba23-88227365b34b-scripts\") pod \"nova-cell1-conductor-db-sync-7nqwj\" (UID: \"ef9b8754-ae09-4ea6-ba23-88227365b34b\") " pod="openstack/nova-cell1-conductor-db-sync-7nqwj" Jan 22 14:06:24 crc kubenswrapper[4743]: I0122 14:06:24.796508 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef9b8754-ae09-4ea6-ba23-88227365b34b-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-7nqwj\" (UID: \"ef9b8754-ae09-4ea6-ba23-88227365b34b\") " pod="openstack/nova-cell1-conductor-db-sync-7nqwj" Jan 22 14:06:24 crc kubenswrapper[4743]: I0122 14:06:24.796623 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26n2x\" (UniqueName: \"kubernetes.io/projected/ef9b8754-ae09-4ea6-ba23-88227365b34b-kube-api-access-26n2x\") pod \"nova-cell1-conductor-db-sync-7nqwj\" (UID: \"ef9b8754-ae09-4ea6-ba23-88227365b34b\") " pod="openstack/nova-cell1-conductor-db-sync-7nqwj" Jan 22 14:06:24 crc kubenswrapper[4743]: I0122 14:06:24.860536 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 22 14:06:24 crc kubenswrapper[4743]: I0122 14:06:24.903193 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef9b8754-ae09-4ea6-ba23-88227365b34b-config-data\") pod \"nova-cell1-conductor-db-sync-7nqwj\" (UID: \"ef9b8754-ae09-4ea6-ba23-88227365b34b\") " pod="openstack/nova-cell1-conductor-db-sync-7nqwj" Jan 22 14:06:24 crc kubenswrapper[4743]: I0122 14:06:24.903432 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef9b8754-ae09-4ea6-ba23-88227365b34b-scripts\") pod \"nova-cell1-conductor-db-sync-7nqwj\" (UID: \"ef9b8754-ae09-4ea6-ba23-88227365b34b\") " pod="openstack/nova-cell1-conductor-db-sync-7nqwj" Jan 22 14:06:24 crc kubenswrapper[4743]: I0122 14:06:24.903474 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef9b8754-ae09-4ea6-ba23-88227365b34b-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-7nqwj\" (UID: \"ef9b8754-ae09-4ea6-ba23-88227365b34b\") " 
pod="openstack/nova-cell1-conductor-db-sync-7nqwj" Jan 22 14:06:24 crc kubenswrapper[4743]: I0122 14:06:24.903543 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26n2x\" (UniqueName: \"kubernetes.io/projected/ef9b8754-ae09-4ea6-ba23-88227365b34b-kube-api-access-26n2x\") pod \"nova-cell1-conductor-db-sync-7nqwj\" (UID: \"ef9b8754-ae09-4ea6-ba23-88227365b34b\") " pod="openstack/nova-cell1-conductor-db-sync-7nqwj" Jan 22 14:06:24 crc kubenswrapper[4743]: I0122 14:06:24.908885 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef9b8754-ae09-4ea6-ba23-88227365b34b-config-data\") pod \"nova-cell1-conductor-db-sync-7nqwj\" (UID: \"ef9b8754-ae09-4ea6-ba23-88227365b34b\") " pod="openstack/nova-cell1-conductor-db-sync-7nqwj" Jan 22 14:06:24 crc kubenswrapper[4743]: I0122 14:06:24.909708 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef9b8754-ae09-4ea6-ba23-88227365b34b-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-7nqwj\" (UID: \"ef9b8754-ae09-4ea6-ba23-88227365b34b\") " pod="openstack/nova-cell1-conductor-db-sync-7nqwj" Jan 22 14:06:24 crc kubenswrapper[4743]: I0122 14:06:24.911756 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef9b8754-ae09-4ea6-ba23-88227365b34b-scripts\") pod \"nova-cell1-conductor-db-sync-7nqwj\" (UID: \"ef9b8754-ae09-4ea6-ba23-88227365b34b\") " pod="openstack/nova-cell1-conductor-db-sync-7nqwj" Jan 22 14:06:24 crc kubenswrapper[4743]: I0122 14:06:24.920337 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26n2x\" (UniqueName: \"kubernetes.io/projected/ef9b8754-ae09-4ea6-ba23-88227365b34b-kube-api-access-26n2x\") pod \"nova-cell1-conductor-db-sync-7nqwj\" (UID: \"ef9b8754-ae09-4ea6-ba23-88227365b34b\") " pod="openstack/nova-cell1-conductor-db-sync-7nqwj" Jan 22 14:06:25 crc kubenswrapper[4743]: I0122 14:06:25.038583 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 22 14:06:25 crc kubenswrapper[4743]: W0122 14:06:25.046366 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84f089c9_1a96_40ce_879d_4220b824f089.slice/crio-ad782ca6661d1f3047136b10add928497dd1fb6b287c05f714de1054f62f63e0 WatchSource:0}: Error finding container ad782ca6661d1f3047136b10add928497dd1fb6b287c05f714de1054f62f63e0: Status 404 returned error can't find the container with id ad782ca6661d1f3047136b10add928497dd1fb6b287c05f714de1054f62f63e0 Jan 22 14:06:25 crc kubenswrapper[4743]: I0122 14:06:25.098295 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7nqwj" Jan 22 14:06:25 crc kubenswrapper[4743]: W0122 14:06:25.145489 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod768af7f0_e632_457f_bcb9_9069ae72ba02.slice/crio-7263a4747e0b6a12075a60526a9998df04b5dae363ba867b01091adf000479f3 WatchSource:0}: Error finding container 7263a4747e0b6a12075a60526a9998df04b5dae363ba867b01091adf000479f3: Status 404 returned error can't find the container with id 7263a4747e0b6a12075a60526a9998df04b5dae363ba867b01091adf000479f3 Jan 22 14:06:25 crc kubenswrapper[4743]: I0122 14:06:25.145652 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-5b8wb"] Jan 22 14:06:25 crc kubenswrapper[4743]: I0122 14:06:25.443192 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-9bfnc" event={"ID":"68eb078f-0a0b-4463-98e7-fb2dc396ca6f","Type":"ContainerStarted","Data":"d44713eea626495a42e8b912c7d8fd7f5c1a4b00ab4eeb4457c90bea882a7ffe"} Jan 22 14:06:25 crc kubenswrapper[4743]: I0122 14:06:25.443510 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-9bfnc" event={"ID":"68eb078f-0a0b-4463-98e7-fb2dc396ca6f","Type":"ContainerStarted","Data":"6027c7076fe2bee479323579ebd5704d9f6b43551467aadaeb0c275f7236400f"} Jan 22 14:06:25 crc kubenswrapper[4743]: I0122 14:06:25.444610 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f16088c1-ebed-477f-a0d9-8499a083b248","Type":"ContainerStarted","Data":"513ef1dabc9ef883e9b8184738a4cabcd6169e8fa128a1893e29fd8960c9de7b"} Jan 22 14:06:25 crc kubenswrapper[4743]: I0122 14:06:25.446177 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-5b8wb" event={"ID":"768af7f0-e632-457f-bcb9-9069ae72ba02","Type":"ContainerStarted","Data":"7263a4747e0b6a12075a60526a9998df04b5dae363ba867b01091adf000479f3"} Jan 22 14:06:25 crc kubenswrapper[4743]: I0122 14:06:25.447189 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1e82d42e-0c6d-4ff4-a53e-171f14a28c90","Type":"ContainerStarted","Data":"82255815a580bc3c077198a265c2a43ad4dc2497a6d4567449393f4ab404fc35"} Jan 22 14:06:25 crc kubenswrapper[4743]: I0122 14:06:25.450528 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"84f089c9-1a96-40ce-879d-4220b824f089","Type":"ContainerStarted","Data":"ad782ca6661d1f3047136b10add928497dd1fb6b287c05f714de1054f62f63e0"} Jan 22 14:06:25 crc kubenswrapper[4743]: I0122 14:06:25.453393 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1bd751a6-2e9a-4ea9-863c-2d629f910470","Type":"ContainerStarted","Data":"db2314d5c24071b0104de08406cedaa81d51a5c9c627f036b54c5ce2ce65ea2f"} Jan 22 14:06:25 crc kubenswrapper[4743]: I0122 14:06:25.462438 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-9bfnc" podStartSLOduration=2.4624161940000002 podStartE2EDuration="2.462416194s" podCreationTimestamp="2026-01-22 14:06:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:06:25.456357408 +0000 UTC m=+1222.011400581" watchObservedRunningTime="2026-01-22 14:06:25.462416194 +0000 UTC m=+1222.017459357" Jan 22 14:06:25 crc 
kubenswrapper[4743]: I0122 14:06:25.581050 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7nqwj"] Jan 22 14:06:25 crc kubenswrapper[4743]: W0122 14:06:25.582070 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef9b8754_ae09_4ea6_ba23_88227365b34b.slice/crio-440b5e3e209ee6f098e5cb9a5353eb7fe8384ce748afbd192de9e0d8d50878ea WatchSource:0}: Error finding container 440b5e3e209ee6f098e5cb9a5353eb7fe8384ce748afbd192de9e0d8d50878ea: Status 404 returned error can't find the container with id 440b5e3e209ee6f098e5cb9a5353eb7fe8384ce748afbd192de9e0d8d50878ea Jan 22 14:06:26 crc kubenswrapper[4743]: I0122 14:06:26.462315 4743 generic.go:334] "Generic (PLEG): container finished" podID="768af7f0-e632-457f-bcb9-9069ae72ba02" containerID="4a5a2ec66c1bda18f25ff27a30a7dae8e42739b96181b2ec47508304aaa05b7f" exitCode=0 Jan 22 14:06:26 crc kubenswrapper[4743]: I0122 14:06:26.462419 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-5b8wb" event={"ID":"768af7f0-e632-457f-bcb9-9069ae72ba02","Type":"ContainerDied","Data":"4a5a2ec66c1bda18f25ff27a30a7dae8e42739b96181b2ec47508304aaa05b7f"} Jan 22 14:06:26 crc kubenswrapper[4743]: I0122 14:06:26.471612 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7nqwj" event={"ID":"ef9b8754-ae09-4ea6-ba23-88227365b34b","Type":"ContainerStarted","Data":"cf9992f13f04eeaf2a694880869e9157cce34bd8f95d73c2b9126bc213ca7068"} Jan 22 14:06:26 crc kubenswrapper[4743]: I0122 14:06:26.471652 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7nqwj" event={"ID":"ef9b8754-ae09-4ea6-ba23-88227365b34b","Type":"ContainerStarted","Data":"440b5e3e209ee6f098e5cb9a5353eb7fe8384ce748afbd192de9e0d8d50878ea"} Jan 22 14:06:26 crc kubenswrapper[4743]: I0122 14:06:26.519587 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-7nqwj" podStartSLOduration=2.5195686999999998 podStartE2EDuration="2.5195687s" podCreationTimestamp="2026-01-22 14:06:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:06:26.509723227 +0000 UTC m=+1223.064766390" watchObservedRunningTime="2026-01-22 14:06:26.5195687 +0000 UTC m=+1223.074611863" Jan 22 14:06:27 crc kubenswrapper[4743]: I0122 14:06:27.387926 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 22 14:06:27 crc kubenswrapper[4743]: I0122 14:06:27.401823 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 22 14:06:27 crc kubenswrapper[4743]: I0122 14:06:27.485131 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-5b8wb" event={"ID":"768af7f0-e632-457f-bcb9-9069ae72ba02","Type":"ContainerStarted","Data":"7debe843a7d0015e59f09f6e39567f75374acae96b618da3460f4d7f7f2c8eca"} Jan 22 14:06:27 crc kubenswrapper[4743]: I0122 14:06:27.485461 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-757b4f8459-5b8wb" Jan 22 14:06:27 crc kubenswrapper[4743]: I0122 14:06:27.518167 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-757b4f8459-5b8wb" podStartSLOduration=4.518147647 podStartE2EDuration="4.518147647s" 
podCreationTimestamp="2026-01-22 14:06:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:06:27.516619208 +0000 UTC m=+1224.071662381" watchObservedRunningTime="2026-01-22 14:06:27.518147647 +0000 UTC m=+1224.073190810" Jan 22 14:06:30 crc kubenswrapper[4743]: I0122 14:06:30.048973 4743 patch_prober.go:28] interesting pod/machine-config-daemon-hqgk7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 14:06:30 crc kubenswrapper[4743]: I0122 14:06:30.049354 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 14:06:32 crc kubenswrapper[4743]: I0122 14:06:32.548692 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f16088c1-ebed-477f-a0d9-8499a083b248","Type":"ContainerStarted","Data":"0fb893ed23887d820e7c9eff6c99f7b86c06618433d32e8bf392eb0af3d7270d"} Jan 22 14:06:32 crc kubenswrapper[4743]: I0122 14:06:32.556948 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="84f089c9-1a96-40ce-879d-4220b824f089" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://a55b1230b3baf65fa88e2160e58c0c8dc21d88aeeb22d648c9929e5e6fe548ca" gracePeriod=30 Jan 22 14:06:32 crc kubenswrapper[4743]: I0122 14:06:32.574848 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.674534072 podStartE2EDuration="9.574829369s" podCreationTimestamp="2026-01-22 14:06:23 +0000 UTC" firstStartedPulling="2026-01-22 14:06:25.048335478 +0000 UTC m=+1221.603378641" lastFinishedPulling="2026-01-22 14:06:31.948630775 +0000 UTC m=+1228.503673938" observedRunningTime="2026-01-22 14:06:32.572374683 +0000 UTC m=+1229.127417846" watchObservedRunningTime="2026-01-22 14:06:32.574829369 +0000 UTC m=+1229.129872532" Jan 22 14:06:33 crc kubenswrapper[4743]: I0122 14:06:33.568053 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1e82d42e-0c6d-4ff4-a53e-171f14a28c90","Type":"ContainerStarted","Data":"5bed4977cf605118098b2393db87eb616d93697f6207324a38cc3c1cdbe12edd"} Jan 22 14:06:33 crc kubenswrapper[4743]: I0122 14:06:33.568358 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1e82d42e-0c6d-4ff4-a53e-171f14a28c90","Type":"ContainerStarted","Data":"890fdb4ca049069c43bfc89b28fff135fb6f76ded8834015f665aeb79f0d1d20"} Jan 22 14:06:33 crc kubenswrapper[4743]: I0122 14:06:33.571003 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"84f089c9-1a96-40ce-879d-4220b824f089","Type":"ContainerStarted","Data":"a55b1230b3baf65fa88e2160e58c0c8dc21d88aeeb22d648c9929e5e6fe548ca"} Jan 22 14:06:33 crc kubenswrapper[4743]: I0122 14:06:33.573347 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"1bd751a6-2e9a-4ea9-863c-2d629f910470","Type":"ContainerStarted","Data":"52dcde200e88e923583bd4619df131bdbed6b5207a016b7316bfa9fbe0f9e68a"} Jan 22 14:06:33 crc kubenswrapper[4743]: I0122 14:06:33.576232 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f16088c1-ebed-477f-a0d9-8499a083b248","Type":"ContainerStarted","Data":"ca59b4fcf51e3c37f5f773c2c49aa2c856d7a70c57827f7bfd7e884c70eff983"} Jan 22 14:06:33 crc kubenswrapper[4743]: I0122 14:06:33.576337 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f16088c1-ebed-477f-a0d9-8499a083b248" containerName="nova-metadata-log" containerID="cri-o://0fb893ed23887d820e7c9eff6c99f7b86c06618433d32e8bf392eb0af3d7270d" gracePeriod=30 Jan 22 14:06:33 crc kubenswrapper[4743]: I0122 14:06:33.576673 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f16088c1-ebed-477f-a0d9-8499a083b248" containerName="nova-metadata-metadata" containerID="cri-o://ca59b4fcf51e3c37f5f773c2c49aa2c856d7a70c57827f7bfd7e884c70eff983" gracePeriod=30 Jan 22 14:06:33 crc kubenswrapper[4743]: I0122 14:06:33.601661 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.377278852 podStartE2EDuration="10.601634714s" podCreationTimestamp="2026-01-22 14:06:23 +0000 UTC" firstStartedPulling="2026-01-22 14:06:24.724328544 +0000 UTC m=+1221.279371707" lastFinishedPulling="2026-01-22 14:06:31.948684416 +0000 UTC m=+1228.503727569" observedRunningTime="2026-01-22 14:06:33.59330422 +0000 UTC m=+1230.148347383" watchObservedRunningTime="2026-01-22 14:06:33.601634714 +0000 UTC m=+1230.156677887" Jan 22 14:06:33 crc kubenswrapper[4743]: I0122 14:06:33.617637 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.502240321 podStartE2EDuration="10.617619064s" podCreationTimestamp="2026-01-22 14:06:23 +0000 UTC" firstStartedPulling="2026-01-22 14:06:24.834246818 +0000 UTC m=+1221.389289981" lastFinishedPulling="2026-01-22 14:06:31.949625561 +0000 UTC m=+1228.504668724" observedRunningTime="2026-01-22 14:06:33.613542375 +0000 UTC m=+1230.168585538" watchObservedRunningTime="2026-01-22 14:06:33.617619064 +0000 UTC m=+1230.172662227" Jan 22 14:06:33 crc kubenswrapper[4743]: I0122 14:06:33.642534 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.43123127 podStartE2EDuration="10.642517254s" podCreationTimestamp="2026-01-22 14:06:23 +0000 UTC" firstStartedPulling="2026-01-22 14:06:24.737230027 +0000 UTC m=+1221.292273180" lastFinishedPulling="2026-01-22 14:06:31.948516001 +0000 UTC m=+1228.503559164" observedRunningTime="2026-01-22 14:06:33.637097399 +0000 UTC m=+1230.192140582" watchObservedRunningTime="2026-01-22 14:06:33.642517254 +0000 UTC m=+1230.197560417" Jan 22 14:06:33 crc kubenswrapper[4743]: I0122 14:06:33.955261 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 22 14:06:33 crc kubenswrapper[4743]: I0122 14:06:33.955371 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 22 14:06:34 crc kubenswrapper[4743]: I0122 14:06:34.003047 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 22 14:06:34 crc kubenswrapper[4743]: I0122 
14:06:34.003126 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 22 14:06:34 crc kubenswrapper[4743]: I0122 14:06:34.200417 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 22 14:06:34 crc kubenswrapper[4743]: I0122 14:06:34.200489 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 22 14:06:34 crc kubenswrapper[4743]: I0122 14:06:34.249214 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 22 14:06:34 crc kubenswrapper[4743]: I0122 14:06:34.402054 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Jan 22 14:06:34 crc kubenswrapper[4743]: I0122 14:06:34.427129 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-757b4f8459-5b8wb" Jan 22 14:06:34 crc kubenswrapper[4743]: I0122 14:06:34.524915 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-wr2mc"] Jan 22 14:06:34 crc kubenswrapper[4743]: I0122 14:06:34.525493 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-wr2mc" podUID="081eea2f-bf2e-435b-bdfe-61b2311d7e10" containerName="dnsmasq-dns" containerID="cri-o://f683aef5f31dd4851d8dbb75c202dfbe1a7f3740660a38626bac089b531cd6b6" gracePeriod=10 Jan 22 14:06:34 crc kubenswrapper[4743]: I0122 14:06:34.618597 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 22 14:06:35 crc kubenswrapper[4743]: I0122 14:06:35.038041 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="1e82d42e-0c6d-4ff4-a53e-171f14a28c90" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.190:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 22 14:06:35 crc kubenswrapper[4743]: I0122 14:06:35.038094 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="1e82d42e-0c6d-4ff4-a53e-171f14a28c90" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.190:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 22 14:06:35 crc kubenswrapper[4743]: I0122 14:06:35.593983 4743 generic.go:334] "Generic (PLEG): container finished" podID="f16088c1-ebed-477f-a0d9-8499a083b248" containerID="0fb893ed23887d820e7c9eff6c99f7b86c06618433d32e8bf392eb0af3d7270d" exitCode=143 Jan 22 14:06:35 crc kubenswrapper[4743]: I0122 14:06:35.594109 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f16088c1-ebed-477f-a0d9-8499a083b248","Type":"ContainerDied","Data":"0fb893ed23887d820e7c9eff6c99f7b86c06618433d32e8bf392eb0af3d7270d"} Jan 22 14:06:36 crc kubenswrapper[4743]: I0122 14:06:36.344480 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5c9776ccc5-wr2mc" podUID="081eea2f-bf2e-435b-bdfe-61b2311d7e10" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.167:5353: connect: connection refused" Jan 22 14:06:36 crc kubenswrapper[4743]: I0122 14:06:36.608204 4743 generic.go:334] "Generic (PLEG): container finished" podID="f16088c1-ebed-477f-a0d9-8499a083b248" containerID="ca59b4fcf51e3c37f5f773c2c49aa2c856d7a70c57827f7bfd7e884c70eff983" exitCode=0 Jan 22 14:06:36 crc 
kubenswrapper[4743]: I0122 14:06:36.608263 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f16088c1-ebed-477f-a0d9-8499a083b248","Type":"ContainerDied","Data":"ca59b4fcf51e3c37f5f773c2c49aa2c856d7a70c57827f7bfd7e884c70eff983"} Jan 22 14:06:36 crc kubenswrapper[4743]: I0122 14:06:36.610200 4743 generic.go:334] "Generic (PLEG): container finished" podID="68eb078f-0a0b-4463-98e7-fb2dc396ca6f" containerID="d44713eea626495a42e8b912c7d8fd7f5c1a4b00ab4eeb4457c90bea882a7ffe" exitCode=0 Jan 22 14:06:36 crc kubenswrapper[4743]: I0122 14:06:36.610266 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-9bfnc" event={"ID":"68eb078f-0a0b-4463-98e7-fb2dc396ca6f","Type":"ContainerDied","Data":"d44713eea626495a42e8b912c7d8fd7f5c1a4b00ab4eeb4457c90bea882a7ffe"} Jan 22 14:06:36 crc kubenswrapper[4743]: I0122 14:06:36.614088 4743 generic.go:334] "Generic (PLEG): container finished" podID="081eea2f-bf2e-435b-bdfe-61b2311d7e10" containerID="f683aef5f31dd4851d8dbb75c202dfbe1a7f3740660a38626bac089b531cd6b6" exitCode=0 Jan 22 14:06:36 crc kubenswrapper[4743]: I0122 14:06:36.614180 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-wr2mc" event={"ID":"081eea2f-bf2e-435b-bdfe-61b2311d7e10","Type":"ContainerDied","Data":"f683aef5f31dd4851d8dbb75c202dfbe1a7f3740660a38626bac089b531cd6b6"} Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.110583 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.213968 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-9bfnc" Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.245281 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-wr2mc" Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.300148 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/081eea2f-bf2e-435b-bdfe-61b2311d7e10-config\") pod \"081eea2f-bf2e-435b-bdfe-61b2311d7e10\" (UID: \"081eea2f-bf2e-435b-bdfe-61b2311d7e10\") " Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.300205 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68eb078f-0a0b-4463-98e7-fb2dc396ca6f-scripts\") pod \"68eb078f-0a0b-4463-98e7-fb2dc396ca6f\" (UID: \"68eb078f-0a0b-4463-98e7-fb2dc396ca6f\") " Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.300263 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68eb078f-0a0b-4463-98e7-fb2dc396ca6f-config-data\") pod \"68eb078f-0a0b-4463-98e7-fb2dc396ca6f\" (UID: \"68eb078f-0a0b-4463-98e7-fb2dc396ca6f\") " Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.300347 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/081eea2f-bf2e-435b-bdfe-61b2311d7e10-ovsdbserver-sb\") pod \"081eea2f-bf2e-435b-bdfe-61b2311d7e10\" (UID: \"081eea2f-bf2e-435b-bdfe-61b2311d7e10\") " Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.300399 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/081eea2f-bf2e-435b-bdfe-61b2311d7e10-dns-svc\") pod \"081eea2f-bf2e-435b-bdfe-61b2311d7e10\" (UID: \"081eea2f-bf2e-435b-bdfe-61b2311d7e10\") " Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.300446 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68eb078f-0a0b-4463-98e7-fb2dc396ca6f-combined-ca-bundle\") pod \"68eb078f-0a0b-4463-98e7-fb2dc396ca6f\" (UID: \"68eb078f-0a0b-4463-98e7-fb2dc396ca6f\") " Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.300474 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/081eea2f-bf2e-435b-bdfe-61b2311d7e10-dns-swift-storage-0\") pod \"081eea2f-bf2e-435b-bdfe-61b2311d7e10\" (UID: \"081eea2f-bf2e-435b-bdfe-61b2311d7e10\") " Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.300535 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfhq9\" (UniqueName: \"kubernetes.io/projected/081eea2f-bf2e-435b-bdfe-61b2311d7e10-kube-api-access-bfhq9\") pod \"081eea2f-bf2e-435b-bdfe-61b2311d7e10\" (UID: \"081eea2f-bf2e-435b-bdfe-61b2311d7e10\") " Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.300573 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vh2kh\" (UniqueName: \"kubernetes.io/projected/68eb078f-0a0b-4463-98e7-fb2dc396ca6f-kube-api-access-vh2kh\") pod \"68eb078f-0a0b-4463-98e7-fb2dc396ca6f\" (UID: \"68eb078f-0a0b-4463-98e7-fb2dc396ca6f\") " Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.300646 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/081eea2f-bf2e-435b-bdfe-61b2311d7e10-ovsdbserver-nb\") pod \"081eea2f-bf2e-435b-bdfe-61b2311d7e10\" (UID: 
\"081eea2f-bf2e-435b-bdfe-61b2311d7e10\") " Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.311636 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68eb078f-0a0b-4463-98e7-fb2dc396ca6f-kube-api-access-vh2kh" (OuterVolumeSpecName: "kube-api-access-vh2kh") pod "68eb078f-0a0b-4463-98e7-fb2dc396ca6f" (UID: "68eb078f-0a0b-4463-98e7-fb2dc396ca6f"). InnerVolumeSpecName "kube-api-access-vh2kh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.318975 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/081eea2f-bf2e-435b-bdfe-61b2311d7e10-kube-api-access-bfhq9" (OuterVolumeSpecName: "kube-api-access-bfhq9") pod "081eea2f-bf2e-435b-bdfe-61b2311d7e10" (UID: "081eea2f-bf2e-435b-bdfe-61b2311d7e10"). InnerVolumeSpecName "kube-api-access-bfhq9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.319017 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68eb078f-0a0b-4463-98e7-fb2dc396ca6f-scripts" (OuterVolumeSpecName: "scripts") pod "68eb078f-0a0b-4463-98e7-fb2dc396ca6f" (UID: "68eb078f-0a0b-4463-98e7-fb2dc396ca6f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.339432 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68eb078f-0a0b-4463-98e7-fb2dc396ca6f-config-data" (OuterVolumeSpecName: "config-data") pod "68eb078f-0a0b-4463-98e7-fb2dc396ca6f" (UID: "68eb078f-0a0b-4463-98e7-fb2dc396ca6f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.349289 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.380043 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/081eea2f-bf2e-435b-bdfe-61b2311d7e10-config" (OuterVolumeSpecName: "config") pod "081eea2f-bf2e-435b-bdfe-61b2311d7e10" (UID: "081eea2f-bf2e-435b-bdfe-61b2311d7e10"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.380915 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/081eea2f-bf2e-435b-bdfe-61b2311d7e10-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "081eea2f-bf2e-435b-bdfe-61b2311d7e10" (UID: "081eea2f-bf2e-435b-bdfe-61b2311d7e10"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.384016 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68eb078f-0a0b-4463-98e7-fb2dc396ca6f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "68eb078f-0a0b-4463-98e7-fb2dc396ca6f" (UID: "68eb078f-0a0b-4463-98e7-fb2dc396ca6f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.389825 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/081eea2f-bf2e-435b-bdfe-61b2311d7e10-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "081eea2f-bf2e-435b-bdfe-61b2311d7e10" (UID: "081eea2f-bf2e-435b-bdfe-61b2311d7e10"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.403955 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkwhq\" (UniqueName: \"kubernetes.io/projected/f16088c1-ebed-477f-a0d9-8499a083b248-kube-api-access-xkwhq\") pod \"f16088c1-ebed-477f-a0d9-8499a083b248\" (UID: \"f16088c1-ebed-477f-a0d9-8499a083b248\") " Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.404078 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f16088c1-ebed-477f-a0d9-8499a083b248-combined-ca-bundle\") pod \"f16088c1-ebed-477f-a0d9-8499a083b248\" (UID: \"f16088c1-ebed-477f-a0d9-8499a083b248\") " Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.404368 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f16088c1-ebed-477f-a0d9-8499a083b248-config-data\") pod \"f16088c1-ebed-477f-a0d9-8499a083b248\" (UID: \"f16088c1-ebed-477f-a0d9-8499a083b248\") " Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.404527 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f16088c1-ebed-477f-a0d9-8499a083b248-logs\") pod \"f16088c1-ebed-477f-a0d9-8499a083b248\" (UID: \"f16088c1-ebed-477f-a0d9-8499a083b248\") " Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.405545 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/081eea2f-bf2e-435b-bdfe-61b2311d7e10-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.405568 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/081eea2f-bf2e-435b-bdfe-61b2311d7e10-config\") on node \"crc\" DevicePath \"\"" Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.405588 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68eb078f-0a0b-4463-98e7-fb2dc396ca6f-scripts\") on node \"crc\" DevicePath \"\"" Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.405600 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68eb078f-0a0b-4463-98e7-fb2dc396ca6f-config-data\") on node \"crc\" DevicePath \"\"" Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.405611 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/081eea2f-bf2e-435b-bdfe-61b2311d7e10-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.405626 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68eb078f-0a0b-4463-98e7-fb2dc396ca6f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.405637 4743 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-bfhq9\" (UniqueName: \"kubernetes.io/projected/081eea2f-bf2e-435b-bdfe-61b2311d7e10-kube-api-access-bfhq9\") on node \"crc\" DevicePath \"\"" Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.405652 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vh2kh\" (UniqueName: \"kubernetes.io/projected/68eb078f-0a0b-4463-98e7-fb2dc396ca6f-kube-api-access-vh2kh\") on node \"crc\" DevicePath \"\"" Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.406037 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f16088c1-ebed-477f-a0d9-8499a083b248-logs" (OuterVolumeSpecName: "logs") pod "f16088c1-ebed-477f-a0d9-8499a083b248" (UID: "f16088c1-ebed-477f-a0d9-8499a083b248"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.407125 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/081eea2f-bf2e-435b-bdfe-61b2311d7e10-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "081eea2f-bf2e-435b-bdfe-61b2311d7e10" (UID: "081eea2f-bf2e-435b-bdfe-61b2311d7e10"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.411694 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f16088c1-ebed-477f-a0d9-8499a083b248-kube-api-access-xkwhq" (OuterVolumeSpecName: "kube-api-access-xkwhq") pod "f16088c1-ebed-477f-a0d9-8499a083b248" (UID: "f16088c1-ebed-477f-a0d9-8499a083b248"). InnerVolumeSpecName "kube-api-access-xkwhq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.428294 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/081eea2f-bf2e-435b-bdfe-61b2311d7e10-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "081eea2f-bf2e-435b-bdfe-61b2311d7e10" (UID: "081eea2f-bf2e-435b-bdfe-61b2311d7e10"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.434732 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f16088c1-ebed-477f-a0d9-8499a083b248-config-data" (OuterVolumeSpecName: "config-data") pod "f16088c1-ebed-477f-a0d9-8499a083b248" (UID: "f16088c1-ebed-477f-a0d9-8499a083b248"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.440169 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f16088c1-ebed-477f-a0d9-8499a083b248-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f16088c1-ebed-477f-a0d9-8499a083b248" (UID: "f16088c1-ebed-477f-a0d9-8499a083b248"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.508528 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f16088c1-ebed-477f-a0d9-8499a083b248-config-data\") on node \"crc\" DevicePath \"\"" Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.509058 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/081eea2f-bf2e-435b-bdfe-61b2311d7e10-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.509165 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f16088c1-ebed-477f-a0d9-8499a083b248-logs\") on node \"crc\" DevicePath \"\"" Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.509232 4743 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/081eea2f-bf2e-435b-bdfe-61b2311d7e10-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.509295 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkwhq\" (UniqueName: \"kubernetes.io/projected/f16088c1-ebed-477f-a0d9-8499a083b248-kube-api-access-xkwhq\") on node \"crc\" DevicePath \"\"" Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.509356 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f16088c1-ebed-477f-a0d9-8499a083b248-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.631690 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.632314 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f16088c1-ebed-477f-a0d9-8499a083b248","Type":"ContainerDied","Data":"513ef1dabc9ef883e9b8184738a4cabcd6169e8fa128a1893e29fd8960c9de7b"} Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.632840 4743 scope.go:117] "RemoveContainer" containerID="ca59b4fcf51e3c37f5f773c2c49aa2c856d7a70c57827f7bfd7e884c70eff983" Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.633345 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-9bfnc" Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.633341 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-9bfnc" event={"ID":"68eb078f-0a0b-4463-98e7-fb2dc396ca6f","Type":"ContainerDied","Data":"6027c7076fe2bee479323579ebd5704d9f6b43551467aadaeb0c275f7236400f"} Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.633499 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6027c7076fe2bee479323579ebd5704d9f6b43551467aadaeb0c275f7236400f" Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.635637 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-wr2mc" Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.635585 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-wr2mc" event={"ID":"081eea2f-bf2e-435b-bdfe-61b2311d7e10","Type":"ContainerDied","Data":"cc18024b6e88e0a162afbfe4112fbaf16573a81a5939b62989207470a8e066c2"} Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.675203 4743 scope.go:117] "RemoveContainer" containerID="0fb893ed23887d820e7c9eff6c99f7b86c06618433d32e8bf392eb0af3d7270d" Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.701553 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-wr2mc"] Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.717367 4743 scope.go:117] "RemoveContainer" containerID="f683aef5f31dd4851d8dbb75c202dfbe1a7f3740660a38626bac089b531cd6b6" Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.722902 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-wr2mc"] Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.740933 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.752146 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.763417 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 22 14:06:38 crc kubenswrapper[4743]: E0122 14:06:38.763755 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68eb078f-0a0b-4463-98e7-fb2dc396ca6f" containerName="nova-manage" Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.763766 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="68eb078f-0a0b-4463-98e7-fb2dc396ca6f" containerName="nova-manage" Jan 22 14:06:38 crc kubenswrapper[4743]: E0122 14:06:38.763808 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="081eea2f-bf2e-435b-bdfe-61b2311d7e10" containerName="init" Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.763815 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="081eea2f-bf2e-435b-bdfe-61b2311d7e10" containerName="init" Jan 22 14:06:38 crc kubenswrapper[4743]: E0122 14:06:38.763858 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f16088c1-ebed-477f-a0d9-8499a083b248" containerName="nova-metadata-metadata" Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.763866 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f16088c1-ebed-477f-a0d9-8499a083b248" containerName="nova-metadata-metadata" Jan 22 14:06:38 crc kubenswrapper[4743]: E0122 14:06:38.763876 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f16088c1-ebed-477f-a0d9-8499a083b248" containerName="nova-metadata-log" Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.763883 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f16088c1-ebed-477f-a0d9-8499a083b248" containerName="nova-metadata-log" Jan 22 14:06:38 crc kubenswrapper[4743]: E0122 14:06:38.763900 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="081eea2f-bf2e-435b-bdfe-61b2311d7e10" containerName="dnsmasq-dns" Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.763906 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="081eea2f-bf2e-435b-bdfe-61b2311d7e10" containerName="dnsmasq-dns" Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.764216 4743 
memory_manager.go:354] "RemoveStaleState removing state" podUID="f16088c1-ebed-477f-a0d9-8499a083b248" containerName="nova-metadata-log" Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.764233 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="68eb078f-0a0b-4463-98e7-fb2dc396ca6f" containerName="nova-manage" Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.764248 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="081eea2f-bf2e-435b-bdfe-61b2311d7e10" containerName="dnsmasq-dns" Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.764259 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f16088c1-ebed-477f-a0d9-8499a083b248" containerName="nova-metadata-metadata" Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.765464 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.769465 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.769666 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.776320 4743 scope.go:117] "RemoveContainer" containerID="baaf16cd914646de7851d96e105c3d8faf6de71f0175e51a881efbd69e037df1" Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.788910 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.844953 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.845437 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1e82d42e-0c6d-4ff4-a53e-171f14a28c90" containerName="nova-api-log" containerID="cri-o://5bed4977cf605118098b2393db87eb616d93697f6207324a38cc3c1cdbe12edd" gracePeriod=30 Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.845868 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1e82d42e-0c6d-4ff4-a53e-171f14a28c90" containerName="nova-api-api" containerID="cri-o://890fdb4ca049069c43bfc89b28fff135fb6f76ded8834015f665aeb79f0d1d20" gracePeriod=30 Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.865057 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.865274 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="1bd751a6-2e9a-4ea9-863c-2d629f910470" containerName="nova-scheduler-scheduler" containerID="cri-o://52dcde200e88e923583bd4619df131bdbed6b5207a016b7316bfa9fbe0f9e68a" gracePeriod=30 Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.876185 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 22 14:06:38 crc kubenswrapper[4743]: E0122 14:06:38.876859 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data kube-api-access-wpcwd logs nova-metadata-tls-certs], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/nova-metadata-0" podUID="8c09b5ab-ecaa-4aa9-bfdf-055218f4814c" Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.916365 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c09b5ab-ecaa-4aa9-bfdf-055218f4814c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8c09b5ab-ecaa-4aa9-bfdf-055218f4814c\") " pod="openstack/nova-metadata-0" Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.916477 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c09b5ab-ecaa-4aa9-bfdf-055218f4814c-config-data\") pod \"nova-metadata-0\" (UID: \"8c09b5ab-ecaa-4aa9-bfdf-055218f4814c\") " pod="openstack/nova-metadata-0" Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.916519 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c09b5ab-ecaa-4aa9-bfdf-055218f4814c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8c09b5ab-ecaa-4aa9-bfdf-055218f4814c\") " pod="openstack/nova-metadata-0" Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.916611 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c09b5ab-ecaa-4aa9-bfdf-055218f4814c-logs\") pod \"nova-metadata-0\" (UID: \"8c09b5ab-ecaa-4aa9-bfdf-055218f4814c\") " pod="openstack/nova-metadata-0" Jan 22 14:06:38 crc kubenswrapper[4743]: I0122 14:06:38.916692 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpcwd\" (UniqueName: \"kubernetes.io/projected/8c09b5ab-ecaa-4aa9-bfdf-055218f4814c-kube-api-access-wpcwd\") pod \"nova-metadata-0\" (UID: \"8c09b5ab-ecaa-4aa9-bfdf-055218f4814c\") " pod="openstack/nova-metadata-0" Jan 22 14:06:39 crc kubenswrapper[4743]: I0122 14:06:39.018651 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c09b5ab-ecaa-4aa9-bfdf-055218f4814c-config-data\") pod \"nova-metadata-0\" (UID: \"8c09b5ab-ecaa-4aa9-bfdf-055218f4814c\") " pod="openstack/nova-metadata-0" Jan 22 14:06:39 crc kubenswrapper[4743]: I0122 14:06:39.019056 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c09b5ab-ecaa-4aa9-bfdf-055218f4814c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8c09b5ab-ecaa-4aa9-bfdf-055218f4814c\") " pod="openstack/nova-metadata-0" Jan 22 14:06:39 crc kubenswrapper[4743]: I0122 14:06:39.019245 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c09b5ab-ecaa-4aa9-bfdf-055218f4814c-logs\") pod \"nova-metadata-0\" (UID: \"8c09b5ab-ecaa-4aa9-bfdf-055218f4814c\") " pod="openstack/nova-metadata-0" Jan 22 14:06:39 crc kubenswrapper[4743]: I0122 14:06:39.019677 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c09b5ab-ecaa-4aa9-bfdf-055218f4814c-logs\") pod \"nova-metadata-0\" (UID: \"8c09b5ab-ecaa-4aa9-bfdf-055218f4814c\") " pod="openstack/nova-metadata-0" Jan 22 14:06:39 crc kubenswrapper[4743]: I0122 14:06:39.019920 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpcwd\" (UniqueName: \"kubernetes.io/projected/8c09b5ab-ecaa-4aa9-bfdf-055218f4814c-kube-api-access-wpcwd\") pod \"nova-metadata-0\" (UID: 
\"8c09b5ab-ecaa-4aa9-bfdf-055218f4814c\") " pod="openstack/nova-metadata-0" Jan 22 14:06:39 crc kubenswrapper[4743]: I0122 14:06:39.020378 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c09b5ab-ecaa-4aa9-bfdf-055218f4814c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8c09b5ab-ecaa-4aa9-bfdf-055218f4814c\") " pod="openstack/nova-metadata-0" Jan 22 14:06:39 crc kubenswrapper[4743]: I0122 14:06:39.023291 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c09b5ab-ecaa-4aa9-bfdf-055218f4814c-config-data\") pod \"nova-metadata-0\" (UID: \"8c09b5ab-ecaa-4aa9-bfdf-055218f4814c\") " pod="openstack/nova-metadata-0" Jan 22 14:06:39 crc kubenswrapper[4743]: I0122 14:06:39.023759 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c09b5ab-ecaa-4aa9-bfdf-055218f4814c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8c09b5ab-ecaa-4aa9-bfdf-055218f4814c\") " pod="openstack/nova-metadata-0" Jan 22 14:06:39 crc kubenswrapper[4743]: I0122 14:06:39.024526 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c09b5ab-ecaa-4aa9-bfdf-055218f4814c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8c09b5ab-ecaa-4aa9-bfdf-055218f4814c\") " pod="openstack/nova-metadata-0" Jan 22 14:06:39 crc kubenswrapper[4743]: I0122 14:06:39.047480 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpcwd\" (UniqueName: \"kubernetes.io/projected/8c09b5ab-ecaa-4aa9-bfdf-055218f4814c-kube-api-access-wpcwd\") pod \"nova-metadata-0\" (UID: \"8c09b5ab-ecaa-4aa9-bfdf-055218f4814c\") " pod="openstack/nova-metadata-0" Jan 22 14:06:39 crc kubenswrapper[4743]: E0122 14:06:39.202130 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="52dcde200e88e923583bd4619df131bdbed6b5207a016b7316bfa9fbe0f9e68a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 22 14:06:39 crc kubenswrapper[4743]: E0122 14:06:39.203626 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="52dcde200e88e923583bd4619df131bdbed6b5207a016b7316bfa9fbe0f9e68a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 22 14:06:39 crc kubenswrapper[4743]: E0122 14:06:39.206814 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="52dcde200e88e923583bd4619df131bdbed6b5207a016b7316bfa9fbe0f9e68a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 22 14:06:39 crc kubenswrapper[4743]: E0122 14:06:39.206976 4743 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="1bd751a6-2e9a-4ea9-863c-2d629f910470" containerName="nova-scheduler-scheduler" Jan 22 14:06:39 crc kubenswrapper[4743]: I0122 14:06:39.645890 4743 generic.go:334] "Generic 
(PLEG): container finished" podID="1e82d42e-0c6d-4ff4-a53e-171f14a28c90" containerID="5bed4977cf605118098b2393db87eb616d93697f6207324a38cc3c1cdbe12edd" exitCode=143 Jan 22 14:06:39 crc kubenswrapper[4743]: I0122 14:06:39.645993 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1e82d42e-0c6d-4ff4-a53e-171f14a28c90","Type":"ContainerDied","Data":"5bed4977cf605118098b2393db87eb616d93697f6207324a38cc3c1cdbe12edd"} Jan 22 14:06:39 crc kubenswrapper[4743]: I0122 14:06:39.648246 4743 generic.go:334] "Generic (PLEG): container finished" podID="1bd751a6-2e9a-4ea9-863c-2d629f910470" containerID="52dcde200e88e923583bd4619df131bdbed6b5207a016b7316bfa9fbe0f9e68a" exitCode=0 Jan 22 14:06:39 crc kubenswrapper[4743]: I0122 14:06:39.648369 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1bd751a6-2e9a-4ea9-863c-2d629f910470","Type":"ContainerDied","Data":"52dcde200e88e923583bd4619df131bdbed6b5207a016b7316bfa9fbe0f9e68a"} Jan 22 14:06:39 crc kubenswrapper[4743]: I0122 14:06:39.650690 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 22 14:06:39 crc kubenswrapper[4743]: I0122 14:06:39.667012 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 22 14:06:39 crc kubenswrapper[4743]: I0122 14:06:39.731207 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c09b5ab-ecaa-4aa9-bfdf-055218f4814c-combined-ca-bundle\") pod \"8c09b5ab-ecaa-4aa9-bfdf-055218f4814c\" (UID: \"8c09b5ab-ecaa-4aa9-bfdf-055218f4814c\") " Jan 22 14:06:39 crc kubenswrapper[4743]: I0122 14:06:39.731360 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c09b5ab-ecaa-4aa9-bfdf-055218f4814c-nova-metadata-tls-certs\") pod \"8c09b5ab-ecaa-4aa9-bfdf-055218f4814c\" (UID: \"8c09b5ab-ecaa-4aa9-bfdf-055218f4814c\") " Jan 22 14:06:39 crc kubenswrapper[4743]: I0122 14:06:39.731411 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c09b5ab-ecaa-4aa9-bfdf-055218f4814c-logs\") pod \"8c09b5ab-ecaa-4aa9-bfdf-055218f4814c\" (UID: \"8c09b5ab-ecaa-4aa9-bfdf-055218f4814c\") " Jan 22 14:06:39 crc kubenswrapper[4743]: I0122 14:06:39.731444 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpcwd\" (UniqueName: \"kubernetes.io/projected/8c09b5ab-ecaa-4aa9-bfdf-055218f4814c-kube-api-access-wpcwd\") pod \"8c09b5ab-ecaa-4aa9-bfdf-055218f4814c\" (UID: \"8c09b5ab-ecaa-4aa9-bfdf-055218f4814c\") " Jan 22 14:06:39 crc kubenswrapper[4743]: I0122 14:06:39.731498 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c09b5ab-ecaa-4aa9-bfdf-055218f4814c-config-data\") pod \"8c09b5ab-ecaa-4aa9-bfdf-055218f4814c\" (UID: \"8c09b5ab-ecaa-4aa9-bfdf-055218f4814c\") " Jan 22 14:06:39 crc kubenswrapper[4743]: I0122 14:06:39.732164 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c09b5ab-ecaa-4aa9-bfdf-055218f4814c-logs" (OuterVolumeSpecName: "logs") pod "8c09b5ab-ecaa-4aa9-bfdf-055218f4814c" (UID: "8c09b5ab-ecaa-4aa9-bfdf-055218f4814c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:06:39 crc kubenswrapper[4743]: I0122 14:06:39.732509 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c09b5ab-ecaa-4aa9-bfdf-055218f4814c-logs\") on node \"crc\" DevicePath \"\"" Jan 22 14:06:39 crc kubenswrapper[4743]: I0122 14:06:39.737022 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c09b5ab-ecaa-4aa9-bfdf-055218f4814c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8c09b5ab-ecaa-4aa9-bfdf-055218f4814c" (UID: "8c09b5ab-ecaa-4aa9-bfdf-055218f4814c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:06:39 crc kubenswrapper[4743]: I0122 14:06:39.738105 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c09b5ab-ecaa-4aa9-bfdf-055218f4814c-config-data" (OuterVolumeSpecName: "config-data") pod "8c09b5ab-ecaa-4aa9-bfdf-055218f4814c" (UID: "8c09b5ab-ecaa-4aa9-bfdf-055218f4814c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:06:39 crc kubenswrapper[4743]: I0122 14:06:39.738326 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c09b5ab-ecaa-4aa9-bfdf-055218f4814c-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "8c09b5ab-ecaa-4aa9-bfdf-055218f4814c" (UID: "8c09b5ab-ecaa-4aa9-bfdf-055218f4814c"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:06:39 crc kubenswrapper[4743]: I0122 14:06:39.758162 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c09b5ab-ecaa-4aa9-bfdf-055218f4814c-kube-api-access-wpcwd" (OuterVolumeSpecName: "kube-api-access-wpcwd") pod "8c09b5ab-ecaa-4aa9-bfdf-055218f4814c" (UID: "8c09b5ab-ecaa-4aa9-bfdf-055218f4814c"). InnerVolumeSpecName "kube-api-access-wpcwd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:06:39 crc kubenswrapper[4743]: I0122 14:06:39.759963 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="081eea2f-bf2e-435b-bdfe-61b2311d7e10" path="/var/lib/kubelet/pods/081eea2f-bf2e-435b-bdfe-61b2311d7e10/volumes" Jan 22 14:06:39 crc kubenswrapper[4743]: I0122 14:06:39.760583 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f16088c1-ebed-477f-a0d9-8499a083b248" path="/var/lib/kubelet/pods/f16088c1-ebed-477f-a0d9-8499a083b248/volumes" Jan 22 14:06:39 crc kubenswrapper[4743]: I0122 14:06:39.778759 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 22 14:06:39 crc kubenswrapper[4743]: I0122 14:06:39.833987 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kpqj\" (UniqueName: \"kubernetes.io/projected/1bd751a6-2e9a-4ea9-863c-2d629f910470-kube-api-access-5kpqj\") pod \"1bd751a6-2e9a-4ea9-863c-2d629f910470\" (UID: \"1bd751a6-2e9a-4ea9-863c-2d629f910470\") " Jan 22 14:06:39 crc kubenswrapper[4743]: I0122 14:06:39.834122 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bd751a6-2e9a-4ea9-863c-2d629f910470-config-data\") pod \"1bd751a6-2e9a-4ea9-863c-2d629f910470\" (UID: \"1bd751a6-2e9a-4ea9-863c-2d629f910470\") " Jan 22 14:06:39 crc kubenswrapper[4743]: I0122 14:06:39.834213 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bd751a6-2e9a-4ea9-863c-2d629f910470-combined-ca-bundle\") pod \"1bd751a6-2e9a-4ea9-863c-2d629f910470\" (UID: \"1bd751a6-2e9a-4ea9-863c-2d629f910470\") " Jan 22 14:06:39 crc kubenswrapper[4743]: I0122 14:06:39.834768 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c09b5ab-ecaa-4aa9-bfdf-055218f4814c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 14:06:39 crc kubenswrapper[4743]: I0122 14:06:39.834805 4743 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c09b5ab-ecaa-4aa9-bfdf-055218f4814c-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 22 14:06:39 crc kubenswrapper[4743]: I0122 14:06:39.834822 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpcwd\" (UniqueName: \"kubernetes.io/projected/8c09b5ab-ecaa-4aa9-bfdf-055218f4814c-kube-api-access-wpcwd\") on node \"crc\" DevicePath \"\"" Jan 22 14:06:39 crc kubenswrapper[4743]: I0122 14:06:39.834834 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c09b5ab-ecaa-4aa9-bfdf-055218f4814c-config-data\") on node \"crc\" DevicePath \"\"" Jan 22 14:06:39 crc kubenswrapper[4743]: I0122 14:06:39.837702 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bd751a6-2e9a-4ea9-863c-2d629f910470-kube-api-access-5kpqj" (OuterVolumeSpecName: "kube-api-access-5kpqj") pod "1bd751a6-2e9a-4ea9-863c-2d629f910470" (UID: "1bd751a6-2e9a-4ea9-863c-2d629f910470"). InnerVolumeSpecName "kube-api-access-5kpqj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:06:39 crc kubenswrapper[4743]: I0122 14:06:39.860907 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bd751a6-2e9a-4ea9-863c-2d629f910470-config-data" (OuterVolumeSpecName: "config-data") pod "1bd751a6-2e9a-4ea9-863c-2d629f910470" (UID: "1bd751a6-2e9a-4ea9-863c-2d629f910470"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:06:39 crc kubenswrapper[4743]: I0122 14:06:39.862923 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bd751a6-2e9a-4ea9-863c-2d629f910470-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1bd751a6-2e9a-4ea9-863c-2d629f910470" (UID: "1bd751a6-2e9a-4ea9-863c-2d629f910470"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:06:39 crc kubenswrapper[4743]: I0122 14:06:39.936596 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bd751a6-2e9a-4ea9-863c-2d629f910470-config-data\") on node \"crc\" DevicePath \"\"" Jan 22 14:06:39 crc kubenswrapper[4743]: I0122 14:06:39.936636 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bd751a6-2e9a-4ea9-863c-2d629f910470-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 14:06:39 crc kubenswrapper[4743]: I0122 14:06:39.936650 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kpqj\" (UniqueName: \"kubernetes.io/projected/1bd751a6-2e9a-4ea9-863c-2d629f910470-kube-api-access-5kpqj\") on node \"crc\" DevicePath \"\"" Jan 22 14:06:40 crc kubenswrapper[4743]: I0122 14:06:40.660035 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 22 14:06:40 crc kubenswrapper[4743]: I0122 14:06:40.660046 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1bd751a6-2e9a-4ea9-863c-2d629f910470","Type":"ContainerDied","Data":"db2314d5c24071b0104de08406cedaa81d51a5c9c627f036b54c5ce2ce65ea2f"} Jan 22 14:06:40 crc kubenswrapper[4743]: I0122 14:06:40.660079 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 22 14:06:40 crc kubenswrapper[4743]: I0122 14:06:40.660106 4743 scope.go:117] "RemoveContainer" containerID="52dcde200e88e923583bd4619df131bdbed6b5207a016b7316bfa9fbe0f9e68a" Jan 22 14:06:40 crc kubenswrapper[4743]: I0122 14:06:40.781205 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 22 14:06:40 crc kubenswrapper[4743]: I0122 14:06:40.814213 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 22 14:06:40 crc kubenswrapper[4743]: I0122 14:06:40.829377 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 22 14:06:40 crc kubenswrapper[4743]: I0122 14:06:40.838031 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Jan 22 14:06:40 crc kubenswrapper[4743]: I0122 14:06:40.845310 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 22 14:06:40 crc kubenswrapper[4743]: E0122 14:06:40.846162 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bd751a6-2e9a-4ea9-863c-2d629f910470" containerName="nova-scheduler-scheduler" Jan 22 14:06:40 crc kubenswrapper[4743]: I0122 14:06:40.846190 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bd751a6-2e9a-4ea9-863c-2d629f910470" containerName="nova-scheduler-scheduler" Jan 22 14:06:40 crc kubenswrapper[4743]: I0122 14:06:40.846382 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bd751a6-2e9a-4ea9-863c-2d629f910470" containerName="nova-scheduler-scheduler" Jan 22 14:06:40 crc kubenswrapper[4743]: I0122 14:06:40.847597 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 22 14:06:40 crc kubenswrapper[4743]: I0122 14:06:40.851851 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jan 22 14:06:40 crc kubenswrapper[4743]: I0122 14:06:40.851955 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 22 14:06:40 crc kubenswrapper[4743]: I0122 14:06:40.861075 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 22 14:06:40 crc kubenswrapper[4743]: I0122 14:06:40.870994 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 22 14:06:40 crc kubenswrapper[4743]: I0122 14:06:40.874093 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 22 14:06:40 crc kubenswrapper[4743]: I0122 14:06:40.874751 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 22 14:06:40 crc kubenswrapper[4743]: I0122 14:06:40.886323 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 22 14:06:40 crc kubenswrapper[4743]: I0122 14:06:40.963301 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6523260a-d41b-43ec-a358-316d51466edd-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6523260a-d41b-43ec-a358-316d51466edd\") " pod="openstack/nova-metadata-0" Jan 22 14:06:40 crc kubenswrapper[4743]: I0122 14:06:40.963343 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6523260a-d41b-43ec-a358-316d51466edd-logs\") pod \"nova-metadata-0\" (UID: \"6523260a-d41b-43ec-a358-316d51466edd\") " pod="openstack/nova-metadata-0" Jan 22 14:06:40 crc kubenswrapper[4743]: I0122 14:06:40.963365 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6523260a-d41b-43ec-a358-316d51466edd-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6523260a-d41b-43ec-a358-316d51466edd\") " pod="openstack/nova-metadata-0" Jan 22 14:06:40 crc kubenswrapper[4743]: I0122 14:06:40.963385 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzsgs\" (UniqueName: \"kubernetes.io/projected/6523260a-d41b-43ec-a358-316d51466edd-kube-api-access-qzsgs\") pod \"nova-metadata-0\" (UID: \"6523260a-d41b-43ec-a358-316d51466edd\") " pod="openstack/nova-metadata-0" Jan 22 14:06:40 crc kubenswrapper[4743]: I0122 14:06:40.963470 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxffb\" (UniqueName: \"kubernetes.io/projected/95a3397b-b84b-4519-8543-3aac6cb34f49-kube-api-access-qxffb\") pod \"nova-scheduler-0\" (UID: \"95a3397b-b84b-4519-8543-3aac6cb34f49\") " pod="openstack/nova-scheduler-0" Jan 22 14:06:40 crc kubenswrapper[4743]: I0122 14:06:40.963525 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95a3397b-b84b-4519-8543-3aac6cb34f49-config-data\") pod \"nova-scheduler-0\" (UID: \"95a3397b-b84b-4519-8543-3aac6cb34f49\") " pod="openstack/nova-scheduler-0" Jan 22 14:06:40 crc kubenswrapper[4743]: 
I0122 14:06:40.963588 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95a3397b-b84b-4519-8543-3aac6cb34f49-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"95a3397b-b84b-4519-8543-3aac6cb34f49\") " pod="openstack/nova-scheduler-0" Jan 22 14:06:40 crc kubenswrapper[4743]: I0122 14:06:40.963611 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6523260a-d41b-43ec-a358-316d51466edd-config-data\") pod \"nova-metadata-0\" (UID: \"6523260a-d41b-43ec-a358-316d51466edd\") " pod="openstack/nova-metadata-0" Jan 22 14:06:41 crc kubenswrapper[4743]: I0122 14:06:41.065607 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6523260a-d41b-43ec-a358-316d51466edd-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6523260a-d41b-43ec-a358-316d51466edd\") " pod="openstack/nova-metadata-0" Jan 22 14:06:41 crc kubenswrapper[4743]: I0122 14:06:41.065653 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6523260a-d41b-43ec-a358-316d51466edd-logs\") pod \"nova-metadata-0\" (UID: \"6523260a-d41b-43ec-a358-316d51466edd\") " pod="openstack/nova-metadata-0" Jan 22 14:06:41 crc kubenswrapper[4743]: I0122 14:06:41.065675 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6523260a-d41b-43ec-a358-316d51466edd-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6523260a-d41b-43ec-a358-316d51466edd\") " pod="openstack/nova-metadata-0" Jan 22 14:06:41 crc kubenswrapper[4743]: I0122 14:06:41.065692 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzsgs\" (UniqueName: \"kubernetes.io/projected/6523260a-d41b-43ec-a358-316d51466edd-kube-api-access-qzsgs\") pod \"nova-metadata-0\" (UID: \"6523260a-d41b-43ec-a358-316d51466edd\") " pod="openstack/nova-metadata-0" Jan 22 14:06:41 crc kubenswrapper[4743]: I0122 14:06:41.065747 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxffb\" (UniqueName: \"kubernetes.io/projected/95a3397b-b84b-4519-8543-3aac6cb34f49-kube-api-access-qxffb\") pod \"nova-scheduler-0\" (UID: \"95a3397b-b84b-4519-8543-3aac6cb34f49\") " pod="openstack/nova-scheduler-0" Jan 22 14:06:41 crc kubenswrapper[4743]: I0122 14:06:41.065770 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95a3397b-b84b-4519-8543-3aac6cb34f49-config-data\") pod \"nova-scheduler-0\" (UID: \"95a3397b-b84b-4519-8543-3aac6cb34f49\") " pod="openstack/nova-scheduler-0" Jan 22 14:06:41 crc kubenswrapper[4743]: I0122 14:06:41.065820 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95a3397b-b84b-4519-8543-3aac6cb34f49-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"95a3397b-b84b-4519-8543-3aac6cb34f49\") " pod="openstack/nova-scheduler-0" Jan 22 14:06:41 crc kubenswrapper[4743]: I0122 14:06:41.065839 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6523260a-d41b-43ec-a358-316d51466edd-config-data\") pod 
\"nova-metadata-0\" (UID: \"6523260a-d41b-43ec-a358-316d51466edd\") " pod="openstack/nova-metadata-0" Jan 22 14:06:41 crc kubenswrapper[4743]: I0122 14:06:41.066178 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6523260a-d41b-43ec-a358-316d51466edd-logs\") pod \"nova-metadata-0\" (UID: \"6523260a-d41b-43ec-a358-316d51466edd\") " pod="openstack/nova-metadata-0" Jan 22 14:06:41 crc kubenswrapper[4743]: I0122 14:06:41.071849 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95a3397b-b84b-4519-8543-3aac6cb34f49-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"95a3397b-b84b-4519-8543-3aac6cb34f49\") " pod="openstack/nova-scheduler-0" Jan 22 14:06:41 crc kubenswrapper[4743]: I0122 14:06:41.071862 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6523260a-d41b-43ec-a358-316d51466edd-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6523260a-d41b-43ec-a358-316d51466edd\") " pod="openstack/nova-metadata-0" Jan 22 14:06:41 crc kubenswrapper[4743]: I0122 14:06:41.072761 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6523260a-d41b-43ec-a358-316d51466edd-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6523260a-d41b-43ec-a358-316d51466edd\") " pod="openstack/nova-metadata-0" Jan 22 14:06:41 crc kubenswrapper[4743]: I0122 14:06:41.074027 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6523260a-d41b-43ec-a358-316d51466edd-config-data\") pod \"nova-metadata-0\" (UID: \"6523260a-d41b-43ec-a358-316d51466edd\") " pod="openstack/nova-metadata-0" Jan 22 14:06:41 crc kubenswrapper[4743]: I0122 14:06:41.076612 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95a3397b-b84b-4519-8543-3aac6cb34f49-config-data\") pod \"nova-scheduler-0\" (UID: \"95a3397b-b84b-4519-8543-3aac6cb34f49\") " pod="openstack/nova-scheduler-0" Jan 22 14:06:41 crc kubenswrapper[4743]: I0122 14:06:41.094897 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzsgs\" (UniqueName: \"kubernetes.io/projected/6523260a-d41b-43ec-a358-316d51466edd-kube-api-access-qzsgs\") pod \"nova-metadata-0\" (UID: \"6523260a-d41b-43ec-a358-316d51466edd\") " pod="openstack/nova-metadata-0" Jan 22 14:06:41 crc kubenswrapper[4743]: I0122 14:06:41.094957 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxffb\" (UniqueName: \"kubernetes.io/projected/95a3397b-b84b-4519-8543-3aac6cb34f49-kube-api-access-qxffb\") pod \"nova-scheduler-0\" (UID: \"95a3397b-b84b-4519-8543-3aac6cb34f49\") " pod="openstack/nova-scheduler-0" Jan 22 14:06:41 crc kubenswrapper[4743]: I0122 14:06:41.169352 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 22 14:06:41 crc kubenswrapper[4743]: I0122 14:06:41.195156 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 22 14:06:41 crc kubenswrapper[4743]: I0122 14:06:41.702186 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 22 14:06:41 crc kubenswrapper[4743]: W0122 14:06:41.702578 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95a3397b_b84b_4519_8543_3aac6cb34f49.slice/crio-41328f8c87c6d361aa2d91ac9220f802aab3cacaba2a1e50c5247b445e8051c7 WatchSource:0}: Error finding container 41328f8c87c6d361aa2d91ac9220f802aab3cacaba2a1e50c5247b445e8051c7: Status 404 returned error can't find the container with id 41328f8c87c6d361aa2d91ac9220f802aab3cacaba2a1e50c5247b445e8051c7 Jan 22 14:06:41 crc kubenswrapper[4743]: I0122 14:06:41.781730 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bd751a6-2e9a-4ea9-863c-2d629f910470" path="/var/lib/kubelet/pods/1bd751a6-2e9a-4ea9-863c-2d629f910470/volumes" Jan 22 14:06:41 crc kubenswrapper[4743]: I0122 14:06:41.782468 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c09b5ab-ecaa-4aa9-bfdf-055218f4814c" path="/var/lib/kubelet/pods/8c09b5ab-ecaa-4aa9-bfdf-055218f4814c/volumes" Jan 22 14:06:41 crc kubenswrapper[4743]: I0122 14:06:41.783114 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 22 14:06:41 crc kubenswrapper[4743]: I0122 14:06:41.925331 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 22 14:06:41 crc kubenswrapper[4743]: I0122 14:06:41.925945 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="4bc6739c-92bc-4cee-b3ae-5e178073cf0f" containerName="kube-state-metrics" containerID="cri-o://8a9277a97a83e9bcc0250a45662bd79e2e3456b6000758c0cfd4824f6e60342a" gracePeriod=30 Jan 22 14:06:42 crc kubenswrapper[4743]: I0122 14:06:42.689399 4743 generic.go:334] "Generic (PLEG): container finished" podID="1e82d42e-0c6d-4ff4-a53e-171f14a28c90" containerID="890fdb4ca049069c43bfc89b28fff135fb6f76ded8834015f665aeb79f0d1d20" exitCode=0 Jan 22 14:06:42 crc kubenswrapper[4743]: I0122 14:06:42.689760 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1e82d42e-0c6d-4ff4-a53e-171f14a28c90","Type":"ContainerDied","Data":"890fdb4ca049069c43bfc89b28fff135fb6f76ded8834015f665aeb79f0d1d20"} Jan 22 14:06:42 crc kubenswrapper[4743]: I0122 14:06:42.693561 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6523260a-d41b-43ec-a358-316d51466edd","Type":"ContainerStarted","Data":"665e71d5edb78c15d1596cc807737958fd8aea3fd58c96068ef8d8676d4c990c"} Jan 22 14:06:42 crc kubenswrapper[4743]: I0122 14:06:42.696867 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"95a3397b-b84b-4519-8543-3aac6cb34f49","Type":"ContainerStarted","Data":"41328f8c87c6d361aa2d91ac9220f802aab3cacaba2a1e50c5247b445e8051c7"} Jan 22 14:06:42 crc kubenswrapper[4743]: I0122 14:06:42.700779 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4bc6739c-92bc-4cee-b3ae-5e178073cf0f","Type":"ContainerDied","Data":"8a9277a97a83e9bcc0250a45662bd79e2e3456b6000758c0cfd4824f6e60342a"} Jan 22 14:06:42 crc kubenswrapper[4743]: I0122 14:06:42.700690 4743 generic.go:334] "Generic (PLEG): container finished" podID="4bc6739c-92bc-4cee-b3ae-5e178073cf0f" 
containerID="8a9277a97a83e9bcc0250a45662bd79e2e3456b6000758c0cfd4824f6e60342a" exitCode=2 Jan 22 14:06:42 crc kubenswrapper[4743]: I0122 14:06:42.950291 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 22 14:06:43 crc kubenswrapper[4743]: I0122 14:06:43.001350 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xrkp\" (UniqueName: \"kubernetes.io/projected/4bc6739c-92bc-4cee-b3ae-5e178073cf0f-kube-api-access-2xrkp\") pod \"4bc6739c-92bc-4cee-b3ae-5e178073cf0f\" (UID: \"4bc6739c-92bc-4cee-b3ae-5e178073cf0f\") " Jan 22 14:06:43 crc kubenswrapper[4743]: I0122 14:06:43.007036 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bc6739c-92bc-4cee-b3ae-5e178073cf0f-kube-api-access-2xrkp" (OuterVolumeSpecName: "kube-api-access-2xrkp") pod "4bc6739c-92bc-4cee-b3ae-5e178073cf0f" (UID: "4bc6739c-92bc-4cee-b3ae-5e178073cf0f"). InnerVolumeSpecName "kube-api-access-2xrkp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:06:43 crc kubenswrapper[4743]: I0122 14:06:43.104169 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2xrkp\" (UniqueName: \"kubernetes.io/projected/4bc6739c-92bc-4cee-b3ae-5e178073cf0f-kube-api-access-2xrkp\") on node \"crc\" DevicePath \"\"" Jan 22 14:06:43 crc kubenswrapper[4743]: I0122 14:06:43.314381 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 22 14:06:43 crc kubenswrapper[4743]: I0122 14:06:43.408269 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e82d42e-0c6d-4ff4-a53e-171f14a28c90-logs\") pod \"1e82d42e-0c6d-4ff4-a53e-171f14a28c90\" (UID: \"1e82d42e-0c6d-4ff4-a53e-171f14a28c90\") " Jan 22 14:06:43 crc kubenswrapper[4743]: I0122 14:06:43.408323 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e82d42e-0c6d-4ff4-a53e-171f14a28c90-config-data\") pod \"1e82d42e-0c6d-4ff4-a53e-171f14a28c90\" (UID: \"1e82d42e-0c6d-4ff4-a53e-171f14a28c90\") " Jan 22 14:06:43 crc kubenswrapper[4743]: I0122 14:06:43.408437 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e82d42e-0c6d-4ff4-a53e-171f14a28c90-combined-ca-bundle\") pod \"1e82d42e-0c6d-4ff4-a53e-171f14a28c90\" (UID: \"1e82d42e-0c6d-4ff4-a53e-171f14a28c90\") " Jan 22 14:06:43 crc kubenswrapper[4743]: I0122 14:06:43.408541 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c29pq\" (UniqueName: \"kubernetes.io/projected/1e82d42e-0c6d-4ff4-a53e-171f14a28c90-kube-api-access-c29pq\") pod \"1e82d42e-0c6d-4ff4-a53e-171f14a28c90\" (UID: \"1e82d42e-0c6d-4ff4-a53e-171f14a28c90\") " Jan 22 14:06:43 crc kubenswrapper[4743]: I0122 14:06:43.422577 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e82d42e-0c6d-4ff4-a53e-171f14a28c90-kube-api-access-c29pq" (OuterVolumeSpecName: "kube-api-access-c29pq") pod "1e82d42e-0c6d-4ff4-a53e-171f14a28c90" (UID: "1e82d42e-0c6d-4ff4-a53e-171f14a28c90"). InnerVolumeSpecName "kube-api-access-c29pq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:06:43 crc kubenswrapper[4743]: I0122 14:06:43.440087 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e82d42e-0c6d-4ff4-a53e-171f14a28c90-config-data" (OuterVolumeSpecName: "config-data") pod "1e82d42e-0c6d-4ff4-a53e-171f14a28c90" (UID: "1e82d42e-0c6d-4ff4-a53e-171f14a28c90"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:06:43 crc kubenswrapper[4743]: I0122 14:06:43.446207 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e82d42e-0c6d-4ff4-a53e-171f14a28c90-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1e82d42e-0c6d-4ff4-a53e-171f14a28c90" (UID: "1e82d42e-0c6d-4ff4-a53e-171f14a28c90"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:06:43 crc kubenswrapper[4743]: I0122 14:06:43.511401 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e82d42e-0c6d-4ff4-a53e-171f14a28c90-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 14:06:43 crc kubenswrapper[4743]: I0122 14:06:43.511437 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c29pq\" (UniqueName: \"kubernetes.io/projected/1e82d42e-0c6d-4ff4-a53e-171f14a28c90-kube-api-access-c29pq\") on node \"crc\" DevicePath \"\"" Jan 22 14:06:43 crc kubenswrapper[4743]: I0122 14:06:43.511448 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e82d42e-0c6d-4ff4-a53e-171f14a28c90-config-data\") on node \"crc\" DevicePath \"\"" Jan 22 14:06:43 crc kubenswrapper[4743]: I0122 14:06:43.523867 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e82d42e-0c6d-4ff4-a53e-171f14a28c90-logs" (OuterVolumeSpecName: "logs") pod "1e82d42e-0c6d-4ff4-a53e-171f14a28c90" (UID: "1e82d42e-0c6d-4ff4-a53e-171f14a28c90"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:06:43 crc kubenswrapper[4743]: I0122 14:06:43.613434 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e82d42e-0c6d-4ff4-a53e-171f14a28c90-logs\") on node \"crc\" DevicePath \"\"" Jan 22 14:06:43 crc kubenswrapper[4743]: I0122 14:06:43.709393 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6523260a-d41b-43ec-a358-316d51466edd","Type":"ContainerStarted","Data":"18b3a6676633c2c186136314fec35bea50959c2e066a0b5269034fab138a0c59"} Jan 22 14:06:43 crc kubenswrapper[4743]: I0122 14:06:43.709432 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6523260a-d41b-43ec-a358-316d51466edd","Type":"ContainerStarted","Data":"62ae8d536864303c0d6ab8daa42f01db669eb963bd43bd01f0d76fa75b50e1b2"} Jan 22 14:06:43 crc kubenswrapper[4743]: I0122 14:06:43.710465 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"95a3397b-b84b-4519-8543-3aac6cb34f49","Type":"ContainerStarted","Data":"137f04123729de75737d5e24287b51fde6e757bda9ebd16d779c957ca49fa24c"} Jan 22 14:06:43 crc kubenswrapper[4743]: I0122 14:06:43.711710 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 22 14:06:43 crc kubenswrapper[4743]: I0122 14:06:43.711715 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4bc6739c-92bc-4cee-b3ae-5e178073cf0f","Type":"ContainerDied","Data":"487f496ebf7e65f179617d35c1c2b6dc13e1ce12dc23046a7640e584cb297499"} Jan 22 14:06:43 crc kubenswrapper[4743]: I0122 14:06:43.711840 4743 scope.go:117] "RemoveContainer" containerID="8a9277a97a83e9bcc0250a45662bd79e2e3456b6000758c0cfd4824f6e60342a" Jan 22 14:06:43 crc kubenswrapper[4743]: I0122 14:06:43.713586 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1e82d42e-0c6d-4ff4-a53e-171f14a28c90","Type":"ContainerDied","Data":"82255815a580bc3c077198a265c2a43ad4dc2497a6d4567449393f4ab404fc35"} Jan 22 14:06:43 crc kubenswrapper[4743]: I0122 14:06:43.713721 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 22 14:06:43 crc kubenswrapper[4743]: I0122 14:06:43.748613 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.748596362 podStartE2EDuration="3.748596362s" podCreationTimestamp="2026-01-22 14:06:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:06:43.732345 +0000 UTC m=+1240.287388163" watchObservedRunningTime="2026-01-22 14:06:43.748596362 +0000 UTC m=+1240.303639525" Jan 22 14:06:43 crc kubenswrapper[4743]: I0122 14:06:43.764909 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.764889425 podStartE2EDuration="3.764889425s" podCreationTimestamp="2026-01-22 14:06:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:06:43.758583743 +0000 UTC m=+1240.313626916" watchObservedRunningTime="2026-01-22 14:06:43.764889425 +0000 UTC m=+1240.319932588" Jan 22 14:06:43 crc kubenswrapper[4743]: I0122 14:06:43.792586 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 22 14:06:43 crc kubenswrapper[4743]: I0122 14:06:43.802883 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 22 14:06:43 crc kubenswrapper[4743]: I0122 14:06:43.812858 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 22 14:06:43 crc kubenswrapper[4743]: I0122 14:06:43.821584 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 22 14:06:43 crc kubenswrapper[4743]: I0122 14:06:43.831405 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Jan 22 14:06:43 crc kubenswrapper[4743]: E0122 14:06:43.832000 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e82d42e-0c6d-4ff4-a53e-171f14a28c90" containerName="nova-api-log" Jan 22 14:06:43 crc kubenswrapper[4743]: I0122 14:06:43.832022 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e82d42e-0c6d-4ff4-a53e-171f14a28c90" containerName="nova-api-log" Jan 22 14:06:43 crc kubenswrapper[4743]: E0122 14:06:43.832037 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bc6739c-92bc-4cee-b3ae-5e178073cf0f" containerName="kube-state-metrics" Jan 22 14:06:43 crc kubenswrapper[4743]: I0122 14:06:43.832046 4743 
state_mem.go:107] "Deleted CPUSet assignment" podUID="4bc6739c-92bc-4cee-b3ae-5e178073cf0f" containerName="kube-state-metrics" Jan 22 14:06:43 crc kubenswrapper[4743]: E0122 14:06:43.832075 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e82d42e-0c6d-4ff4-a53e-171f14a28c90" containerName="nova-api-api" Jan 22 14:06:43 crc kubenswrapper[4743]: I0122 14:06:43.832085 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e82d42e-0c6d-4ff4-a53e-171f14a28c90" containerName="nova-api-api" Jan 22 14:06:43 crc kubenswrapper[4743]: I0122 14:06:43.832297 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e82d42e-0c6d-4ff4-a53e-171f14a28c90" containerName="nova-api-log" Jan 22 14:06:43 crc kubenswrapper[4743]: I0122 14:06:43.832331 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e82d42e-0c6d-4ff4-a53e-171f14a28c90" containerName="nova-api-api" Jan 22 14:06:43 crc kubenswrapper[4743]: I0122 14:06:43.832346 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bc6739c-92bc-4cee-b3ae-5e178073cf0f" containerName="kube-state-metrics" Jan 22 14:06:43 crc kubenswrapper[4743]: I0122 14:06:43.833115 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 22 14:06:43 crc kubenswrapper[4743]: I0122 14:06:43.836093 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Jan 22 14:06:43 crc kubenswrapper[4743]: I0122 14:06:43.836137 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Jan 22 14:06:43 crc kubenswrapper[4743]: I0122 14:06:43.853277 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 22 14:06:43 crc kubenswrapper[4743]: I0122 14:06:43.867530 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 22 14:06:43 crc kubenswrapper[4743]: I0122 14:06:43.869523 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 22 14:06:43 crc kubenswrapper[4743]: I0122 14:06:43.875075 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 22 14:06:43 crc kubenswrapper[4743]: I0122 14:06:43.879169 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 22 14:06:43 crc kubenswrapper[4743]: I0122 14:06:43.919265 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/771a7aa1-9d00-4ad7-90be-b3cc3edd39ee-config-data\") pod \"nova-api-0\" (UID: \"771a7aa1-9d00-4ad7-90be-b3cc3edd39ee\") " pod="openstack/nova-api-0" Jan 22 14:06:43 crc kubenswrapper[4743]: I0122 14:06:43.919319 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/771a7aa1-9d00-4ad7-90be-b3cc3edd39ee-logs\") pod \"nova-api-0\" (UID: \"771a7aa1-9d00-4ad7-90be-b3cc3edd39ee\") " pod="openstack/nova-api-0" Jan 22 14:06:43 crc kubenswrapper[4743]: I0122 14:06:43.919345 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/492b4d6f-25ef-41b4-9aa8-876d9baaaf13-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"492b4d6f-25ef-41b4-9aa8-876d9baaaf13\") " pod="openstack/kube-state-metrics-0" Jan 22 14:06:43 crc kubenswrapper[4743]: I0122 14:06:43.919388 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6swb\" (UniqueName: \"kubernetes.io/projected/771a7aa1-9d00-4ad7-90be-b3cc3edd39ee-kube-api-access-h6swb\") pod \"nova-api-0\" (UID: \"771a7aa1-9d00-4ad7-90be-b3cc3edd39ee\") " pod="openstack/nova-api-0" Jan 22 14:06:43 crc kubenswrapper[4743]: I0122 14:06:43.919440 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zprtm\" (UniqueName: \"kubernetes.io/projected/492b4d6f-25ef-41b4-9aa8-876d9baaaf13-kube-api-access-zprtm\") pod \"kube-state-metrics-0\" (UID: \"492b4d6f-25ef-41b4-9aa8-876d9baaaf13\") " pod="openstack/kube-state-metrics-0" Jan 22 14:06:43 crc kubenswrapper[4743]: I0122 14:06:43.919460 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/492b4d6f-25ef-41b4-9aa8-876d9baaaf13-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"492b4d6f-25ef-41b4-9aa8-876d9baaaf13\") " pod="openstack/kube-state-metrics-0" Jan 22 14:06:43 crc kubenswrapper[4743]: I0122 14:06:43.919483 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/492b4d6f-25ef-41b4-9aa8-876d9baaaf13-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"492b4d6f-25ef-41b4-9aa8-876d9baaaf13\") " pod="openstack/kube-state-metrics-0" Jan 22 14:06:43 crc kubenswrapper[4743]: I0122 14:06:43.919538 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/771a7aa1-9d00-4ad7-90be-b3cc3edd39ee-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"771a7aa1-9d00-4ad7-90be-b3cc3edd39ee\") " pod="openstack/nova-api-0" Jan 22 14:06:43 crc kubenswrapper[4743]: I0122 14:06:43.926678 4743 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 22 14:06:43 crc kubenswrapper[4743]: I0122 14:06:43.926967 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0af4ea6f-e824-4b5a-925c-699d0d342d5b" containerName="ceilometer-central-agent" containerID="cri-o://8fbb5299db2bcf722b02a3d31129df8c08d7c2c8ede399481905d93f9e3f69dc" gracePeriod=30 Jan 22 14:06:43 crc kubenswrapper[4743]: I0122 14:06:43.926983 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0af4ea6f-e824-4b5a-925c-699d0d342d5b" containerName="proxy-httpd" containerID="cri-o://f846f6de528d60f980ec7c1de86c17d87a4fbf947c2258b0014fe2b82465d57c" gracePeriod=30 Jan 22 14:06:43 crc kubenswrapper[4743]: I0122 14:06:43.927052 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0af4ea6f-e824-4b5a-925c-699d0d342d5b" containerName="sg-core" containerID="cri-o://64e8eed1a4fa3ea055afc93b12f350be61e7bc8bc6755eab54e90b3fa299429f" gracePeriod=30 Jan 22 14:06:43 crc kubenswrapper[4743]: I0122 14:06:43.927098 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0af4ea6f-e824-4b5a-925c-699d0d342d5b" containerName="ceilometer-notification-agent" containerID="cri-o://2532b8c4498f157815ce844d739c6429f24622ee8533293b97bfb332b84a6f51" gracePeriod=30 Jan 22 14:06:43 crc kubenswrapper[4743]: I0122 14:06:43.966608 4743 scope.go:117] "RemoveContainer" containerID="890fdb4ca049069c43bfc89b28fff135fb6f76ded8834015f665aeb79f0d1d20" Jan 22 14:06:43 crc kubenswrapper[4743]: I0122 14:06:43.996026 4743 scope.go:117] "RemoveContainer" containerID="5bed4977cf605118098b2393db87eb616d93697f6207324a38cc3c1cdbe12edd" Jan 22 14:06:44 crc kubenswrapper[4743]: I0122 14:06:44.021501 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/492b4d6f-25ef-41b4-9aa8-876d9baaaf13-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"492b4d6f-25ef-41b4-9aa8-876d9baaaf13\") " pod="openstack/kube-state-metrics-0" Jan 22 14:06:44 crc kubenswrapper[4743]: I0122 14:06:44.021579 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6swb\" (UniqueName: \"kubernetes.io/projected/771a7aa1-9d00-4ad7-90be-b3cc3edd39ee-kube-api-access-h6swb\") pod \"nova-api-0\" (UID: \"771a7aa1-9d00-4ad7-90be-b3cc3edd39ee\") " pod="openstack/nova-api-0" Jan 22 14:06:44 crc kubenswrapper[4743]: I0122 14:06:44.021678 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zprtm\" (UniqueName: \"kubernetes.io/projected/492b4d6f-25ef-41b4-9aa8-876d9baaaf13-kube-api-access-zprtm\") pod \"kube-state-metrics-0\" (UID: \"492b4d6f-25ef-41b4-9aa8-876d9baaaf13\") " pod="openstack/kube-state-metrics-0" Jan 22 14:06:44 crc kubenswrapper[4743]: I0122 14:06:44.021707 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/492b4d6f-25ef-41b4-9aa8-876d9baaaf13-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"492b4d6f-25ef-41b4-9aa8-876d9baaaf13\") " pod="openstack/kube-state-metrics-0" Jan 22 14:06:44 crc kubenswrapper[4743]: I0122 14:06:44.021732 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" 
(UniqueName: \"kubernetes.io/secret/492b4d6f-25ef-41b4-9aa8-876d9baaaf13-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"492b4d6f-25ef-41b4-9aa8-876d9baaaf13\") " pod="openstack/kube-state-metrics-0" Jan 22 14:06:44 crc kubenswrapper[4743]: I0122 14:06:44.021840 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/771a7aa1-9d00-4ad7-90be-b3cc3edd39ee-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"771a7aa1-9d00-4ad7-90be-b3cc3edd39ee\") " pod="openstack/nova-api-0" Jan 22 14:06:44 crc kubenswrapper[4743]: I0122 14:06:44.021899 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/771a7aa1-9d00-4ad7-90be-b3cc3edd39ee-config-data\") pod \"nova-api-0\" (UID: \"771a7aa1-9d00-4ad7-90be-b3cc3edd39ee\") " pod="openstack/nova-api-0" Jan 22 14:06:44 crc kubenswrapper[4743]: I0122 14:06:44.021938 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/771a7aa1-9d00-4ad7-90be-b3cc3edd39ee-logs\") pod \"nova-api-0\" (UID: \"771a7aa1-9d00-4ad7-90be-b3cc3edd39ee\") " pod="openstack/nova-api-0" Jan 22 14:06:44 crc kubenswrapper[4743]: I0122 14:06:44.022354 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/771a7aa1-9d00-4ad7-90be-b3cc3edd39ee-logs\") pod \"nova-api-0\" (UID: \"771a7aa1-9d00-4ad7-90be-b3cc3edd39ee\") " pod="openstack/nova-api-0" Jan 22 14:06:44 crc kubenswrapper[4743]: I0122 14:06:44.026196 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/492b4d6f-25ef-41b4-9aa8-876d9baaaf13-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"492b4d6f-25ef-41b4-9aa8-876d9baaaf13\") " pod="openstack/kube-state-metrics-0" Jan 22 14:06:44 crc kubenswrapper[4743]: I0122 14:06:44.026227 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/771a7aa1-9d00-4ad7-90be-b3cc3edd39ee-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"771a7aa1-9d00-4ad7-90be-b3cc3edd39ee\") " pod="openstack/nova-api-0" Jan 22 14:06:44 crc kubenswrapper[4743]: I0122 14:06:44.027296 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/492b4d6f-25ef-41b4-9aa8-876d9baaaf13-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"492b4d6f-25ef-41b4-9aa8-876d9baaaf13\") " pod="openstack/kube-state-metrics-0" Jan 22 14:06:44 crc kubenswrapper[4743]: I0122 14:06:44.028030 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/771a7aa1-9d00-4ad7-90be-b3cc3edd39ee-config-data\") pod \"nova-api-0\" (UID: \"771a7aa1-9d00-4ad7-90be-b3cc3edd39ee\") " pod="openstack/nova-api-0" Jan 22 14:06:44 crc kubenswrapper[4743]: I0122 14:06:44.028367 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/492b4d6f-25ef-41b4-9aa8-876d9baaaf13-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"492b4d6f-25ef-41b4-9aa8-876d9baaaf13\") " pod="openstack/kube-state-metrics-0" Jan 22 14:06:44 crc kubenswrapper[4743]: I0122 14:06:44.039115 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-h6swb\" (UniqueName: \"kubernetes.io/projected/771a7aa1-9d00-4ad7-90be-b3cc3edd39ee-kube-api-access-h6swb\") pod \"nova-api-0\" (UID: \"771a7aa1-9d00-4ad7-90be-b3cc3edd39ee\") " pod="openstack/nova-api-0" Jan 22 14:06:44 crc kubenswrapper[4743]: I0122 14:06:44.041514 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zprtm\" (UniqueName: \"kubernetes.io/projected/492b4d6f-25ef-41b4-9aa8-876d9baaaf13-kube-api-access-zprtm\") pod \"kube-state-metrics-0\" (UID: \"492b4d6f-25ef-41b4-9aa8-876d9baaaf13\") " pod="openstack/kube-state-metrics-0" Jan 22 14:06:44 crc kubenswrapper[4743]: I0122 14:06:44.149496 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 22 14:06:44 crc kubenswrapper[4743]: I0122 14:06:44.198559 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 22 14:06:44 crc kubenswrapper[4743]: I0122 14:06:44.689536 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 22 14:06:44 crc kubenswrapper[4743]: I0122 14:06:44.726779 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"492b4d6f-25ef-41b4-9aa8-876d9baaaf13","Type":"ContainerStarted","Data":"e0a778d0f82c5d12826c0ab2c8e95419f6850f4a5150587fb6055ae48422a305"} Jan 22 14:06:44 crc kubenswrapper[4743]: I0122 14:06:44.729887 4743 generic.go:334] "Generic (PLEG): container finished" podID="0af4ea6f-e824-4b5a-925c-699d0d342d5b" containerID="f846f6de528d60f980ec7c1de86c17d87a4fbf947c2258b0014fe2b82465d57c" exitCode=0 Jan 22 14:06:44 crc kubenswrapper[4743]: I0122 14:06:44.729914 4743 generic.go:334] "Generic (PLEG): container finished" podID="0af4ea6f-e824-4b5a-925c-699d0d342d5b" containerID="64e8eed1a4fa3ea055afc93b12f350be61e7bc8bc6755eab54e90b3fa299429f" exitCode=2 Jan 22 14:06:44 crc kubenswrapper[4743]: I0122 14:06:44.729944 4743 generic.go:334] "Generic (PLEG): container finished" podID="0af4ea6f-e824-4b5a-925c-699d0d342d5b" containerID="8fbb5299db2bcf722b02a3d31129df8c08d7c2c8ede399481905d93f9e3f69dc" exitCode=0 Jan 22 14:06:44 crc kubenswrapper[4743]: I0122 14:06:44.730948 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0af4ea6f-e824-4b5a-925c-699d0d342d5b","Type":"ContainerDied","Data":"f846f6de528d60f980ec7c1de86c17d87a4fbf947c2258b0014fe2b82465d57c"} Jan 22 14:06:44 crc kubenswrapper[4743]: I0122 14:06:44.731022 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0af4ea6f-e824-4b5a-925c-699d0d342d5b","Type":"ContainerDied","Data":"64e8eed1a4fa3ea055afc93b12f350be61e7bc8bc6755eab54e90b3fa299429f"} Jan 22 14:06:44 crc kubenswrapper[4743]: I0122 14:06:44.731058 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0af4ea6f-e824-4b5a-925c-699d0d342d5b","Type":"ContainerDied","Data":"8fbb5299db2bcf722b02a3d31129df8c08d7c2c8ede399481905d93f9e3f69dc"} Jan 22 14:06:44 crc kubenswrapper[4743]: I0122 14:06:44.753844 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 22 14:06:45 crc kubenswrapper[4743]: I0122 14:06:45.742984 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"492b4d6f-25ef-41b4-9aa8-876d9baaaf13","Type":"ContainerStarted","Data":"350483a736a0a965831faa38a91d522cddcda48ddbba86ceb6867308a479b2c4"} Jan 22 14:06:45 crc 
kubenswrapper[4743]: I0122 14:06:45.743555 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Jan 22 14:06:45 crc kubenswrapper[4743]: I0122 14:06:45.745034 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"771a7aa1-9d00-4ad7-90be-b3cc3edd39ee","Type":"ContainerStarted","Data":"27b04441ed1bf32084b8cad77b1bd366f72d49d7a8e31c49ca20d6f8ede82a79"} Jan 22 14:06:45 crc kubenswrapper[4743]: I0122 14:06:45.745062 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"771a7aa1-9d00-4ad7-90be-b3cc3edd39ee","Type":"ContainerStarted","Data":"e2f8b1962d8baaa258eb44b4676018a927e840d36112c23c0673c4dfbdfdbe4f"} Jan 22 14:06:45 crc kubenswrapper[4743]: I0122 14:06:45.745074 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"771a7aa1-9d00-4ad7-90be-b3cc3edd39ee","Type":"ContainerStarted","Data":"c4b4d1b8981e6067a8b97a7e801a16d149445601307e672e1cd4cd24424ebe1e"} Jan 22 14:06:45 crc kubenswrapper[4743]: I0122 14:06:45.759905 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e82d42e-0c6d-4ff4-a53e-171f14a28c90" path="/var/lib/kubelet/pods/1e82d42e-0c6d-4ff4-a53e-171f14a28c90/volumes" Jan 22 14:06:45 crc kubenswrapper[4743]: I0122 14:06:45.761349 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bc6739c-92bc-4cee-b3ae-5e178073cf0f" path="/var/lib/kubelet/pods/4bc6739c-92bc-4cee-b3ae-5e178073cf0f/volumes" Jan 22 14:06:45 crc kubenswrapper[4743]: I0122 14:06:45.764778 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.2635063029999998 podStartE2EDuration="2.764748713s" podCreationTimestamp="2026-01-22 14:06:43 +0000 UTC" firstStartedPulling="2026-01-22 14:06:44.692867257 +0000 UTC m=+1241.247910420" lastFinishedPulling="2026-01-22 14:06:45.194109667 +0000 UTC m=+1241.749152830" observedRunningTime="2026-01-22 14:06:45.759209192 +0000 UTC m=+1242.314252355" watchObservedRunningTime="2026-01-22 14:06:45.764748713 +0000 UTC m=+1242.319791886" Jan 22 14:06:45 crc kubenswrapper[4743]: I0122 14:06:45.813303 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.813272543 podStartE2EDuration="2.813272543s" podCreationTimestamp="2026-01-22 14:06:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:06:45.801255616 +0000 UTC m=+1242.356298799" watchObservedRunningTime="2026-01-22 14:06:45.813272543 +0000 UTC m=+1242.368315706" Jan 22 14:06:46 crc kubenswrapper[4743]: I0122 14:06:46.170442 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 22 14:06:46 crc kubenswrapper[4743]: I0122 14:06:46.170525 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 22 14:06:46 crc kubenswrapper[4743]: I0122 14:06:46.195328 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 22 14:06:46 crc kubenswrapper[4743]: I0122 14:06:46.757342 4743 generic.go:334] "Generic (PLEG): container finished" podID="0af4ea6f-e824-4b5a-925c-699d0d342d5b" containerID="2532b8c4498f157815ce844d739c6429f24622ee8533293b97bfb332b84a6f51" exitCode=0 Jan 22 14:06:46 crc kubenswrapper[4743]: I0122 14:06:46.758538 4743 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0af4ea6f-e824-4b5a-925c-699d0d342d5b","Type":"ContainerDied","Data":"2532b8c4498f157815ce844d739c6429f24622ee8533293b97bfb332b84a6f51"} Jan 22 14:06:46 crc kubenswrapper[4743]: I0122 14:06:46.893646 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 22 14:06:46 crc kubenswrapper[4743]: I0122 14:06:46.976393 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0af4ea6f-e824-4b5a-925c-699d0d342d5b-log-httpd\") pod \"0af4ea6f-e824-4b5a-925c-699d0d342d5b\" (UID: \"0af4ea6f-e824-4b5a-925c-699d0d342d5b\") " Jan 22 14:06:46 crc kubenswrapper[4743]: I0122 14:06:46.976476 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gh9m\" (UniqueName: \"kubernetes.io/projected/0af4ea6f-e824-4b5a-925c-699d0d342d5b-kube-api-access-4gh9m\") pod \"0af4ea6f-e824-4b5a-925c-699d0d342d5b\" (UID: \"0af4ea6f-e824-4b5a-925c-699d0d342d5b\") " Jan 22 14:06:46 crc kubenswrapper[4743]: I0122 14:06:46.976516 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0af4ea6f-e824-4b5a-925c-699d0d342d5b-scripts\") pod \"0af4ea6f-e824-4b5a-925c-699d0d342d5b\" (UID: \"0af4ea6f-e824-4b5a-925c-699d0d342d5b\") " Jan 22 14:06:46 crc kubenswrapper[4743]: I0122 14:06:46.976579 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0af4ea6f-e824-4b5a-925c-699d0d342d5b-run-httpd\") pod \"0af4ea6f-e824-4b5a-925c-699d0d342d5b\" (UID: \"0af4ea6f-e824-4b5a-925c-699d0d342d5b\") " Jan 22 14:06:46 crc kubenswrapper[4743]: I0122 14:06:46.976674 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0af4ea6f-e824-4b5a-925c-699d0d342d5b-combined-ca-bundle\") pod \"0af4ea6f-e824-4b5a-925c-699d0d342d5b\" (UID: \"0af4ea6f-e824-4b5a-925c-699d0d342d5b\") " Jan 22 14:06:46 crc kubenswrapper[4743]: I0122 14:06:46.976700 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0af4ea6f-e824-4b5a-925c-699d0d342d5b-sg-core-conf-yaml\") pod \"0af4ea6f-e824-4b5a-925c-699d0d342d5b\" (UID: \"0af4ea6f-e824-4b5a-925c-699d0d342d5b\") " Jan 22 14:06:46 crc kubenswrapper[4743]: I0122 14:06:46.976729 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0af4ea6f-e824-4b5a-925c-699d0d342d5b-config-data\") pod \"0af4ea6f-e824-4b5a-925c-699d0d342d5b\" (UID: \"0af4ea6f-e824-4b5a-925c-699d0d342d5b\") " Jan 22 14:06:46 crc kubenswrapper[4743]: I0122 14:06:46.976972 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0af4ea6f-e824-4b5a-925c-699d0d342d5b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0af4ea6f-e824-4b5a-925c-699d0d342d5b" (UID: "0af4ea6f-e824-4b5a-925c-699d0d342d5b"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:06:46 crc kubenswrapper[4743]: I0122 14:06:46.977639 4743 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0af4ea6f-e824-4b5a-925c-699d0d342d5b-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 22 14:06:46 crc kubenswrapper[4743]: I0122 14:06:46.978634 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0af4ea6f-e824-4b5a-925c-699d0d342d5b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0af4ea6f-e824-4b5a-925c-699d0d342d5b" (UID: "0af4ea6f-e824-4b5a-925c-699d0d342d5b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:06:46 crc kubenswrapper[4743]: I0122 14:06:46.982064 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0af4ea6f-e824-4b5a-925c-699d0d342d5b-scripts" (OuterVolumeSpecName: "scripts") pod "0af4ea6f-e824-4b5a-925c-699d0d342d5b" (UID: "0af4ea6f-e824-4b5a-925c-699d0d342d5b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:06:46 crc kubenswrapper[4743]: I0122 14:06:46.982582 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0af4ea6f-e824-4b5a-925c-699d0d342d5b-kube-api-access-4gh9m" (OuterVolumeSpecName: "kube-api-access-4gh9m") pod "0af4ea6f-e824-4b5a-925c-699d0d342d5b" (UID: "0af4ea6f-e824-4b5a-925c-699d0d342d5b"). InnerVolumeSpecName "kube-api-access-4gh9m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:06:47 crc kubenswrapper[4743]: I0122 14:06:47.006955 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0af4ea6f-e824-4b5a-925c-699d0d342d5b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0af4ea6f-e824-4b5a-925c-699d0d342d5b" (UID: "0af4ea6f-e824-4b5a-925c-699d0d342d5b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:06:47 crc kubenswrapper[4743]: I0122 14:06:47.064054 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0af4ea6f-e824-4b5a-925c-699d0d342d5b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0af4ea6f-e824-4b5a-925c-699d0d342d5b" (UID: "0af4ea6f-e824-4b5a-925c-699d0d342d5b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:06:47 crc kubenswrapper[4743]: I0122 14:06:47.079026 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0af4ea6f-e824-4b5a-925c-699d0d342d5b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 14:06:47 crc kubenswrapper[4743]: I0122 14:06:47.079054 4743 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0af4ea6f-e824-4b5a-925c-699d0d342d5b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 22 14:06:47 crc kubenswrapper[4743]: I0122 14:06:47.079063 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4gh9m\" (UniqueName: \"kubernetes.io/projected/0af4ea6f-e824-4b5a-925c-699d0d342d5b-kube-api-access-4gh9m\") on node \"crc\" DevicePath \"\"" Jan 22 14:06:47 crc kubenswrapper[4743]: I0122 14:06:47.079074 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0af4ea6f-e824-4b5a-925c-699d0d342d5b-scripts\") on node \"crc\" DevicePath \"\"" Jan 22 14:06:47 crc kubenswrapper[4743]: I0122 14:06:47.079084 4743 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0af4ea6f-e824-4b5a-925c-699d0d342d5b-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 22 14:06:47 crc kubenswrapper[4743]: I0122 14:06:47.081150 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0af4ea6f-e824-4b5a-925c-699d0d342d5b-config-data" (OuterVolumeSpecName: "config-data") pod "0af4ea6f-e824-4b5a-925c-699d0d342d5b" (UID: "0af4ea6f-e824-4b5a-925c-699d0d342d5b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:06:47 crc kubenswrapper[4743]: I0122 14:06:47.180423 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0af4ea6f-e824-4b5a-925c-699d0d342d5b-config-data\") on node \"crc\" DevicePath \"\"" Jan 22 14:06:47 crc kubenswrapper[4743]: I0122 14:06:47.771192 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0af4ea6f-e824-4b5a-925c-699d0d342d5b","Type":"ContainerDied","Data":"eb6dbb54f0f1365355e5259c86caa77e1749322f54e0c6fd41f4a1ce565fabdf"} Jan 22 14:06:47 crc kubenswrapper[4743]: I0122 14:06:47.771304 4743 scope.go:117] "RemoveContainer" containerID="f846f6de528d60f980ec7c1de86c17d87a4fbf947c2258b0014fe2b82465d57c" Jan 22 14:06:47 crc kubenswrapper[4743]: I0122 14:06:47.771422 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 22 14:06:47 crc kubenswrapper[4743]: I0122 14:06:47.779720 4743 generic.go:334] "Generic (PLEG): container finished" podID="ef9b8754-ae09-4ea6-ba23-88227365b34b" containerID="cf9992f13f04eeaf2a694880869e9157cce34bd8f95d73c2b9126bc213ca7068" exitCode=0 Jan 22 14:06:47 crc kubenswrapper[4743]: I0122 14:06:47.779845 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7nqwj" event={"ID":"ef9b8754-ae09-4ea6-ba23-88227365b34b","Type":"ContainerDied","Data":"cf9992f13f04eeaf2a694880869e9157cce34bd8f95d73c2b9126bc213ca7068"} Jan 22 14:06:47 crc kubenswrapper[4743]: I0122 14:06:47.803177 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 22 14:06:47 crc kubenswrapper[4743]: I0122 14:06:47.831218 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 22 14:06:47 crc kubenswrapper[4743]: I0122 14:06:47.835442 4743 scope.go:117] "RemoveContainer" containerID="64e8eed1a4fa3ea055afc93b12f350be61e7bc8bc6755eab54e90b3fa299429f" Jan 22 14:06:47 crc kubenswrapper[4743]: I0122 14:06:47.852970 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 22 14:06:47 crc kubenswrapper[4743]: E0122 14:06:47.853363 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0af4ea6f-e824-4b5a-925c-699d0d342d5b" containerName="ceilometer-notification-agent" Jan 22 14:06:47 crc kubenswrapper[4743]: I0122 14:06:47.853387 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="0af4ea6f-e824-4b5a-925c-699d0d342d5b" containerName="ceilometer-notification-agent" Jan 22 14:06:47 crc kubenswrapper[4743]: E0122 14:06:47.853425 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0af4ea6f-e824-4b5a-925c-699d0d342d5b" containerName="proxy-httpd" Jan 22 14:06:47 crc kubenswrapper[4743]: I0122 14:06:47.853433 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="0af4ea6f-e824-4b5a-925c-699d0d342d5b" containerName="proxy-httpd" Jan 22 14:06:47 crc kubenswrapper[4743]: E0122 14:06:47.853460 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0af4ea6f-e824-4b5a-925c-699d0d342d5b" containerName="ceilometer-central-agent" Jan 22 14:06:47 crc kubenswrapper[4743]: I0122 14:06:47.853468 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="0af4ea6f-e824-4b5a-925c-699d0d342d5b" containerName="ceilometer-central-agent" Jan 22 14:06:47 crc kubenswrapper[4743]: E0122 14:06:47.853482 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0af4ea6f-e824-4b5a-925c-699d0d342d5b" containerName="sg-core" Jan 22 14:06:47 crc kubenswrapper[4743]: I0122 14:06:47.853489 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="0af4ea6f-e824-4b5a-925c-699d0d342d5b" containerName="sg-core" Jan 22 14:06:47 crc kubenswrapper[4743]: I0122 14:06:47.853658 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="0af4ea6f-e824-4b5a-925c-699d0d342d5b" containerName="proxy-httpd" Jan 22 14:06:47 crc kubenswrapper[4743]: I0122 14:06:47.853670 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="0af4ea6f-e824-4b5a-925c-699d0d342d5b" containerName="ceilometer-central-agent" Jan 22 14:06:47 crc kubenswrapper[4743]: I0122 14:06:47.853681 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="0af4ea6f-e824-4b5a-925c-699d0d342d5b" containerName="ceilometer-notification-agent" Jan 22 14:06:47 crc kubenswrapper[4743]: I0122 14:06:47.853697 4743 
memory_manager.go:354] "RemoveStaleState removing state" podUID="0af4ea6f-e824-4b5a-925c-699d0d342d5b" containerName="sg-core" Jan 22 14:06:47 crc kubenswrapper[4743]: I0122 14:06:47.858782 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 22 14:06:47 crc kubenswrapper[4743]: I0122 14:06:47.861637 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 22 14:06:47 crc kubenswrapper[4743]: I0122 14:06:47.861863 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 22 14:06:47 crc kubenswrapper[4743]: I0122 14:06:47.861866 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 22 14:06:47 crc kubenswrapper[4743]: I0122 14:06:47.867143 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 22 14:06:47 crc kubenswrapper[4743]: I0122 14:06:47.880125 4743 scope.go:117] "RemoveContainer" containerID="2532b8c4498f157815ce844d739c6429f24622ee8533293b97bfb332b84a6f51" Jan 22 14:06:47 crc kubenswrapper[4743]: I0122 14:06:47.895693 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebc9c4ee-71ef-4cf1-ad65-847b5427901d-config-data\") pod \"ceilometer-0\" (UID: \"ebc9c4ee-71ef-4cf1-ad65-847b5427901d\") " pod="openstack/ceilometer-0" Jan 22 14:06:47 crc kubenswrapper[4743]: I0122 14:06:47.895943 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebc9c4ee-71ef-4cf1-ad65-847b5427901d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ebc9c4ee-71ef-4cf1-ad65-847b5427901d\") " pod="openstack/ceilometer-0" Jan 22 14:06:47 crc kubenswrapper[4743]: I0122 14:06:47.896249 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ebc9c4ee-71ef-4cf1-ad65-847b5427901d-run-httpd\") pod \"ceilometer-0\" (UID: \"ebc9c4ee-71ef-4cf1-ad65-847b5427901d\") " pod="openstack/ceilometer-0" Jan 22 14:06:47 crc kubenswrapper[4743]: I0122 14:06:47.896302 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ebc9c4ee-71ef-4cf1-ad65-847b5427901d-log-httpd\") pod \"ceilometer-0\" (UID: \"ebc9c4ee-71ef-4cf1-ad65-847b5427901d\") " pod="openstack/ceilometer-0" Jan 22 14:06:47 crc kubenswrapper[4743]: I0122 14:06:47.896395 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebc9c4ee-71ef-4cf1-ad65-847b5427901d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ebc9c4ee-71ef-4cf1-ad65-847b5427901d\") " pod="openstack/ceilometer-0" Jan 22 14:06:47 crc kubenswrapper[4743]: I0122 14:06:47.896431 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxmjg\" (UniqueName: \"kubernetes.io/projected/ebc9c4ee-71ef-4cf1-ad65-847b5427901d-kube-api-access-gxmjg\") pod \"ceilometer-0\" (UID: \"ebc9c4ee-71ef-4cf1-ad65-847b5427901d\") " pod="openstack/ceilometer-0" Jan 22 14:06:47 crc kubenswrapper[4743]: I0122 14:06:47.896464 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/ebc9c4ee-71ef-4cf1-ad65-847b5427901d-scripts\") pod \"ceilometer-0\" (UID: \"ebc9c4ee-71ef-4cf1-ad65-847b5427901d\") " pod="openstack/ceilometer-0" Jan 22 14:06:47 crc kubenswrapper[4743]: I0122 14:06:47.896484 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ebc9c4ee-71ef-4cf1-ad65-847b5427901d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ebc9c4ee-71ef-4cf1-ad65-847b5427901d\") " pod="openstack/ceilometer-0" Jan 22 14:06:47 crc kubenswrapper[4743]: I0122 14:06:47.905237 4743 scope.go:117] "RemoveContainer" containerID="8fbb5299db2bcf722b02a3d31129df8c08d7c2c8ede399481905d93f9e3f69dc" Jan 22 14:06:47 crc kubenswrapper[4743]: I0122 14:06:47.998266 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ebc9c4ee-71ef-4cf1-ad65-847b5427901d-scripts\") pod \"ceilometer-0\" (UID: \"ebc9c4ee-71ef-4cf1-ad65-847b5427901d\") " pod="openstack/ceilometer-0" Jan 22 14:06:47 crc kubenswrapper[4743]: I0122 14:06:47.998319 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ebc9c4ee-71ef-4cf1-ad65-847b5427901d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ebc9c4ee-71ef-4cf1-ad65-847b5427901d\") " pod="openstack/ceilometer-0" Jan 22 14:06:47 crc kubenswrapper[4743]: I0122 14:06:47.998383 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebc9c4ee-71ef-4cf1-ad65-847b5427901d-config-data\") pod \"ceilometer-0\" (UID: \"ebc9c4ee-71ef-4cf1-ad65-847b5427901d\") " pod="openstack/ceilometer-0" Jan 22 14:06:47 crc kubenswrapper[4743]: I0122 14:06:47.998455 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebc9c4ee-71ef-4cf1-ad65-847b5427901d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ebc9c4ee-71ef-4cf1-ad65-847b5427901d\") " pod="openstack/ceilometer-0" Jan 22 14:06:47 crc kubenswrapper[4743]: I0122 14:06:47.998479 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ebc9c4ee-71ef-4cf1-ad65-847b5427901d-run-httpd\") pod \"ceilometer-0\" (UID: \"ebc9c4ee-71ef-4cf1-ad65-847b5427901d\") " pod="openstack/ceilometer-0" Jan 22 14:06:47 crc kubenswrapper[4743]: I0122 14:06:47.998554 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ebc9c4ee-71ef-4cf1-ad65-847b5427901d-log-httpd\") pod \"ceilometer-0\" (UID: \"ebc9c4ee-71ef-4cf1-ad65-847b5427901d\") " pod="openstack/ceilometer-0" Jan 22 14:06:47 crc kubenswrapper[4743]: I0122 14:06:47.998611 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebc9c4ee-71ef-4cf1-ad65-847b5427901d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ebc9c4ee-71ef-4cf1-ad65-847b5427901d\") " pod="openstack/ceilometer-0" Jan 22 14:06:47 crc kubenswrapper[4743]: I0122 14:06:47.998646 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxmjg\" (UniqueName: \"kubernetes.io/projected/ebc9c4ee-71ef-4cf1-ad65-847b5427901d-kube-api-access-gxmjg\") pod \"ceilometer-0\" (UID: \"ebc9c4ee-71ef-4cf1-ad65-847b5427901d\") " 
pod="openstack/ceilometer-0" Jan 22 14:06:48 crc kubenswrapper[4743]: I0122 14:06:47.999877 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ebc9c4ee-71ef-4cf1-ad65-847b5427901d-log-httpd\") pod \"ceilometer-0\" (UID: \"ebc9c4ee-71ef-4cf1-ad65-847b5427901d\") " pod="openstack/ceilometer-0" Jan 22 14:06:48 crc kubenswrapper[4743]: I0122 14:06:47.999936 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ebc9c4ee-71ef-4cf1-ad65-847b5427901d-run-httpd\") pod \"ceilometer-0\" (UID: \"ebc9c4ee-71ef-4cf1-ad65-847b5427901d\") " pod="openstack/ceilometer-0" Jan 22 14:06:48 crc kubenswrapper[4743]: I0122 14:06:48.003119 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ebc9c4ee-71ef-4cf1-ad65-847b5427901d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ebc9c4ee-71ef-4cf1-ad65-847b5427901d\") " pod="openstack/ceilometer-0" Jan 22 14:06:48 crc kubenswrapper[4743]: I0122 14:06:48.003984 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebc9c4ee-71ef-4cf1-ad65-847b5427901d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ebc9c4ee-71ef-4cf1-ad65-847b5427901d\") " pod="openstack/ceilometer-0" Jan 22 14:06:48 crc kubenswrapper[4743]: I0122 14:06:48.004389 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ebc9c4ee-71ef-4cf1-ad65-847b5427901d-scripts\") pod \"ceilometer-0\" (UID: \"ebc9c4ee-71ef-4cf1-ad65-847b5427901d\") " pod="openstack/ceilometer-0" Jan 22 14:06:48 crc kubenswrapper[4743]: I0122 14:06:48.008536 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebc9c4ee-71ef-4cf1-ad65-847b5427901d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ebc9c4ee-71ef-4cf1-ad65-847b5427901d\") " pod="openstack/ceilometer-0" Jan 22 14:06:48 crc kubenswrapper[4743]: I0122 14:06:48.012238 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebc9c4ee-71ef-4cf1-ad65-847b5427901d-config-data\") pod \"ceilometer-0\" (UID: \"ebc9c4ee-71ef-4cf1-ad65-847b5427901d\") " pod="openstack/ceilometer-0" Jan 22 14:06:48 crc kubenswrapper[4743]: I0122 14:06:48.027261 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxmjg\" (UniqueName: \"kubernetes.io/projected/ebc9c4ee-71ef-4cf1-ad65-847b5427901d-kube-api-access-gxmjg\") pod \"ceilometer-0\" (UID: \"ebc9c4ee-71ef-4cf1-ad65-847b5427901d\") " pod="openstack/ceilometer-0" Jan 22 14:06:48 crc kubenswrapper[4743]: I0122 14:06:48.187413 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 22 14:06:48 crc kubenswrapper[4743]: I0122 14:06:48.664954 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 22 14:06:48 crc kubenswrapper[4743]: I0122 14:06:48.797691 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ebc9c4ee-71ef-4cf1-ad65-847b5427901d","Type":"ContainerStarted","Data":"89caeccae2ac2e4f025165cd1360ccdc8aa2d1ecf626af576df836e51aa376d6"} Jan 22 14:06:49 crc kubenswrapper[4743]: I0122 14:06:49.158420 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7nqwj" Jan 22 14:06:49 crc kubenswrapper[4743]: I0122 14:06:49.221636 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef9b8754-ae09-4ea6-ba23-88227365b34b-combined-ca-bundle\") pod \"ef9b8754-ae09-4ea6-ba23-88227365b34b\" (UID: \"ef9b8754-ae09-4ea6-ba23-88227365b34b\") " Jan 22 14:06:49 crc kubenswrapper[4743]: I0122 14:06:49.221826 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26n2x\" (UniqueName: \"kubernetes.io/projected/ef9b8754-ae09-4ea6-ba23-88227365b34b-kube-api-access-26n2x\") pod \"ef9b8754-ae09-4ea6-ba23-88227365b34b\" (UID: \"ef9b8754-ae09-4ea6-ba23-88227365b34b\") " Jan 22 14:06:49 crc kubenswrapper[4743]: I0122 14:06:49.221846 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef9b8754-ae09-4ea6-ba23-88227365b34b-config-data\") pod \"ef9b8754-ae09-4ea6-ba23-88227365b34b\" (UID: \"ef9b8754-ae09-4ea6-ba23-88227365b34b\") " Jan 22 14:06:49 crc kubenswrapper[4743]: I0122 14:06:49.221935 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef9b8754-ae09-4ea6-ba23-88227365b34b-scripts\") pod \"ef9b8754-ae09-4ea6-ba23-88227365b34b\" (UID: \"ef9b8754-ae09-4ea6-ba23-88227365b34b\") " Jan 22 14:06:49 crc kubenswrapper[4743]: I0122 14:06:49.227057 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef9b8754-ae09-4ea6-ba23-88227365b34b-scripts" (OuterVolumeSpecName: "scripts") pod "ef9b8754-ae09-4ea6-ba23-88227365b34b" (UID: "ef9b8754-ae09-4ea6-ba23-88227365b34b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:06:49 crc kubenswrapper[4743]: I0122 14:06:49.227281 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef9b8754-ae09-4ea6-ba23-88227365b34b-kube-api-access-26n2x" (OuterVolumeSpecName: "kube-api-access-26n2x") pod "ef9b8754-ae09-4ea6-ba23-88227365b34b" (UID: "ef9b8754-ae09-4ea6-ba23-88227365b34b"). InnerVolumeSpecName "kube-api-access-26n2x". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:06:49 crc kubenswrapper[4743]: I0122 14:06:49.247313 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef9b8754-ae09-4ea6-ba23-88227365b34b-config-data" (OuterVolumeSpecName: "config-data") pod "ef9b8754-ae09-4ea6-ba23-88227365b34b" (UID: "ef9b8754-ae09-4ea6-ba23-88227365b34b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:06:49 crc kubenswrapper[4743]: I0122 14:06:49.251328 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef9b8754-ae09-4ea6-ba23-88227365b34b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ef9b8754-ae09-4ea6-ba23-88227365b34b" (UID: "ef9b8754-ae09-4ea6-ba23-88227365b34b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:06:49 crc kubenswrapper[4743]: I0122 14:06:49.326385 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef9b8754-ae09-4ea6-ba23-88227365b34b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 14:06:49 crc kubenswrapper[4743]: I0122 14:06:49.326427 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26n2x\" (UniqueName: \"kubernetes.io/projected/ef9b8754-ae09-4ea6-ba23-88227365b34b-kube-api-access-26n2x\") on node \"crc\" DevicePath \"\"" Jan 22 14:06:49 crc kubenswrapper[4743]: I0122 14:06:49.326440 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef9b8754-ae09-4ea6-ba23-88227365b34b-config-data\") on node \"crc\" DevicePath \"\"" Jan 22 14:06:49 crc kubenswrapper[4743]: I0122 14:06:49.326449 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef9b8754-ae09-4ea6-ba23-88227365b34b-scripts\") on node \"crc\" DevicePath \"\"" Jan 22 14:06:49 crc kubenswrapper[4743]: I0122 14:06:49.758576 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0af4ea6f-e824-4b5a-925c-699d0d342d5b" path="/var/lib/kubelet/pods/0af4ea6f-e824-4b5a-925c-699d0d342d5b/volumes" Jan 22 14:06:49 crc kubenswrapper[4743]: I0122 14:06:49.814233 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ebc9c4ee-71ef-4cf1-ad65-847b5427901d","Type":"ContainerStarted","Data":"95d7bc50520d1fe3e00eb74b25917b20e1c3fb3219d62fedd3e7bbbca28cf032"} Jan 22 14:06:49 crc kubenswrapper[4743]: I0122 14:06:49.817349 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7nqwj" event={"ID":"ef9b8754-ae09-4ea6-ba23-88227365b34b","Type":"ContainerDied","Data":"440b5e3e209ee6f098e5cb9a5353eb7fe8384ce748afbd192de9e0d8d50878ea"} Jan 22 14:06:49 crc kubenswrapper[4743]: I0122 14:06:49.817384 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="440b5e3e209ee6f098e5cb9a5353eb7fe8384ce748afbd192de9e0d8d50878ea" Jan 22 14:06:49 crc kubenswrapper[4743]: I0122 14:06:49.817467 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7nqwj" Jan 22 14:06:49 crc kubenswrapper[4743]: E0122 14:06:49.887469 4743 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef9b8754_ae09_4ea6_ba23_88227365b34b.slice\": RecentStats: unable to find data in memory cache]" Jan 22 14:06:49 crc kubenswrapper[4743]: I0122 14:06:49.907803 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 22 14:06:49 crc kubenswrapper[4743]: E0122 14:06:49.908405 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef9b8754-ae09-4ea6-ba23-88227365b34b" containerName="nova-cell1-conductor-db-sync" Jan 22 14:06:49 crc kubenswrapper[4743]: I0122 14:06:49.908422 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef9b8754-ae09-4ea6-ba23-88227365b34b" containerName="nova-cell1-conductor-db-sync" Jan 22 14:06:49 crc kubenswrapper[4743]: I0122 14:06:49.908579 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef9b8754-ae09-4ea6-ba23-88227365b34b" containerName="nova-cell1-conductor-db-sync" Jan 22 14:06:49 crc kubenswrapper[4743]: I0122 14:06:49.909085 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 22 14:06:49 crc kubenswrapper[4743]: I0122 14:06:49.909161 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 22 14:06:49 crc kubenswrapper[4743]: I0122 14:06:49.920470 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Jan 22 14:06:49 crc kubenswrapper[4743]: I0122 14:06:49.945890 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v45dk\" (UniqueName: \"kubernetes.io/projected/3b26ff8e-b36c-47d8-8d74-da49485ec363-kube-api-access-v45dk\") pod \"nova-cell1-conductor-0\" (UID: \"3b26ff8e-b36c-47d8-8d74-da49485ec363\") " pod="openstack/nova-cell1-conductor-0" Jan 22 14:06:49 crc kubenswrapper[4743]: I0122 14:06:49.945947 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b26ff8e-b36c-47d8-8d74-da49485ec363-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"3b26ff8e-b36c-47d8-8d74-da49485ec363\") " pod="openstack/nova-cell1-conductor-0" Jan 22 14:06:49 crc kubenswrapper[4743]: I0122 14:06:49.945985 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b26ff8e-b36c-47d8-8d74-da49485ec363-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"3b26ff8e-b36c-47d8-8d74-da49485ec363\") " pod="openstack/nova-cell1-conductor-0" Jan 22 14:06:50 crc kubenswrapper[4743]: I0122 14:06:50.050824 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b26ff8e-b36c-47d8-8d74-da49485ec363-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"3b26ff8e-b36c-47d8-8d74-da49485ec363\") " pod="openstack/nova-cell1-conductor-0" Jan 22 14:06:50 crc kubenswrapper[4743]: I0122 14:06:50.050967 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v45dk\" (UniqueName: 
\"kubernetes.io/projected/3b26ff8e-b36c-47d8-8d74-da49485ec363-kube-api-access-v45dk\") pod \"nova-cell1-conductor-0\" (UID: \"3b26ff8e-b36c-47d8-8d74-da49485ec363\") " pod="openstack/nova-cell1-conductor-0" Jan 22 14:06:50 crc kubenswrapper[4743]: I0122 14:06:50.051008 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b26ff8e-b36c-47d8-8d74-da49485ec363-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"3b26ff8e-b36c-47d8-8d74-da49485ec363\") " pod="openstack/nova-cell1-conductor-0" Jan 22 14:06:50 crc kubenswrapper[4743]: I0122 14:06:50.058510 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b26ff8e-b36c-47d8-8d74-da49485ec363-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"3b26ff8e-b36c-47d8-8d74-da49485ec363\") " pod="openstack/nova-cell1-conductor-0" Jan 22 14:06:50 crc kubenswrapper[4743]: I0122 14:06:50.064973 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b26ff8e-b36c-47d8-8d74-da49485ec363-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"3b26ff8e-b36c-47d8-8d74-da49485ec363\") " pod="openstack/nova-cell1-conductor-0" Jan 22 14:06:50 crc kubenswrapper[4743]: I0122 14:06:50.069986 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v45dk\" (UniqueName: \"kubernetes.io/projected/3b26ff8e-b36c-47d8-8d74-da49485ec363-kube-api-access-v45dk\") pod \"nova-cell1-conductor-0\" (UID: \"3b26ff8e-b36c-47d8-8d74-da49485ec363\") " pod="openstack/nova-cell1-conductor-0" Jan 22 14:06:50 crc kubenswrapper[4743]: I0122 14:06:50.234330 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 22 14:06:50 crc kubenswrapper[4743]: I0122 14:06:50.657773 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 22 14:06:50 crc kubenswrapper[4743]: W0122 14:06:50.660669 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b26ff8e_b36c_47d8_8d74_da49485ec363.slice/crio-80e776e8f80291fe0f7fb3c6629b63fa52c34bfdc07e6e78367a1496a1be8d56 WatchSource:0}: Error finding container 80e776e8f80291fe0f7fb3c6629b63fa52c34bfdc07e6e78367a1496a1be8d56: Status 404 returned error can't find the container with id 80e776e8f80291fe0f7fb3c6629b63fa52c34bfdc07e6e78367a1496a1be8d56 Jan 22 14:06:50 crc kubenswrapper[4743]: I0122 14:06:50.827651 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"3b26ff8e-b36c-47d8-8d74-da49485ec363","Type":"ContainerStarted","Data":"80e776e8f80291fe0f7fb3c6629b63fa52c34bfdc07e6e78367a1496a1be8d56"} Jan 22 14:06:51 crc kubenswrapper[4743]: I0122 14:06:51.170469 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 22 14:06:51 crc kubenswrapper[4743]: I0122 14:06:51.170863 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 22 14:06:51 crc kubenswrapper[4743]: I0122 14:06:51.196370 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 22 14:06:51 crc kubenswrapper[4743]: I0122 14:06:51.231842 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 22 14:06:51 crc kubenswrapper[4743]: I0122 14:06:51.838691 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"3b26ff8e-b36c-47d8-8d74-da49485ec363","Type":"ContainerStarted","Data":"f5df7dedd0d9f1aa831a20f108cd906c38af5a2b2e4882cdf86da713775158fb"} Jan 22 14:06:51 crc kubenswrapper[4743]: I0122 14:06:51.838853 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Jan 22 14:06:51 crc kubenswrapper[4743]: I0122 14:06:51.841668 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ebc9c4ee-71ef-4cf1-ad65-847b5427901d","Type":"ContainerStarted","Data":"b3800fb0272e27c6acc669e64597a03e3ab738a113fc14a205eb2bfec57222e2"} Jan 22 14:06:51 crc kubenswrapper[4743]: I0122 14:06:51.881838 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.881804423 podStartE2EDuration="2.881804423s" podCreationTimestamp="2026-01-22 14:06:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:06:51.856865605 +0000 UTC m=+1248.411908778" watchObservedRunningTime="2026-01-22 14:06:51.881804423 +0000 UTC m=+1248.436847606" Jan 22 14:06:51 crc kubenswrapper[4743]: I0122 14:06:51.925935 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 22 14:06:52 crc kubenswrapper[4743]: I0122 14:06:52.187045 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="6523260a-d41b-43ec-a358-316d51466edd" containerName="nova-metadata-metadata" probeResult="failure" output="Get 
\"https://10.217.0.197:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 22 14:06:52 crc kubenswrapper[4743]: I0122 14:06:52.187114 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="6523260a-d41b-43ec-a358-316d51466edd" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.197:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 22 14:06:52 crc kubenswrapper[4743]: I0122 14:06:52.868304 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ebc9c4ee-71ef-4cf1-ad65-847b5427901d","Type":"ContainerStarted","Data":"69e6000846416c79e6041c0a9a0c8fecf462a8b4e9ae57443e45199d6cc25dae"} Jan 22 14:06:53 crc kubenswrapper[4743]: I0122 14:06:53.877264 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ebc9c4ee-71ef-4cf1-ad65-847b5427901d","Type":"ContainerStarted","Data":"9408c25d8221d8c16c0a3766e0ae01905374b4a41e53563499e96256349ad2cd"} Jan 22 14:06:53 crc kubenswrapper[4743]: I0122 14:06:53.877713 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 22 14:06:53 crc kubenswrapper[4743]: I0122 14:06:53.915595 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.476917641 podStartE2EDuration="6.915571484s" podCreationTimestamp="2026-01-22 14:06:47 +0000 UTC" firstStartedPulling="2026-01-22 14:06:48.663600456 +0000 UTC m=+1245.218643629" lastFinishedPulling="2026-01-22 14:06:53.102254309 +0000 UTC m=+1249.657297472" observedRunningTime="2026-01-22 14:06:53.908168413 +0000 UTC m=+1250.463211586" watchObservedRunningTime="2026-01-22 14:06:53.915571484 +0000 UTC m=+1250.470614647" Jan 22 14:06:54 crc kubenswrapper[4743]: I0122 14:06:54.164044 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Jan 22 14:06:54 crc kubenswrapper[4743]: I0122 14:06:54.198799 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 22 14:06:54 crc kubenswrapper[4743]: I0122 14:06:54.199084 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 22 14:06:55 crc kubenswrapper[4743]: I0122 14:06:55.270630 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Jan 22 14:06:55 crc kubenswrapper[4743]: I0122 14:06:55.281975 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="771a7aa1-9d00-4ad7-90be-b3cc3edd39ee" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.200:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 22 14:06:55 crc kubenswrapper[4743]: I0122 14:06:55.281983 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="771a7aa1-9d00-4ad7-90be-b3cc3edd39ee" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.200:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 22 14:07:00 crc kubenswrapper[4743]: I0122 14:07:00.049035 4743 patch_prober.go:28] interesting pod/machine-config-daemon-hqgk7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 14:07:00 crc kubenswrapper[4743]: I0122 14:07:00.049726 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 14:07:01 crc kubenswrapper[4743]: I0122 14:07:01.176809 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 22 14:07:01 crc kubenswrapper[4743]: I0122 14:07:01.177484 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 22 14:07:01 crc kubenswrapper[4743]: I0122 14:07:01.184537 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 22 14:07:01 crc kubenswrapper[4743]: I0122 14:07:01.949907 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 22 14:07:02 crc kubenswrapper[4743]: I0122 14:07:02.960034 4743 generic.go:334] "Generic (PLEG): container finished" podID="84f089c9-1a96-40ce-879d-4220b824f089" containerID="a55b1230b3baf65fa88e2160e58c0c8dc21d88aeeb22d648c9929e5e6fe548ca" exitCode=137 Jan 22 14:07:02 crc kubenswrapper[4743]: I0122 14:07:02.960090 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"84f089c9-1a96-40ce-879d-4220b824f089","Type":"ContainerDied","Data":"a55b1230b3baf65fa88e2160e58c0c8dc21d88aeeb22d648c9929e5e6fe548ca"} Jan 22 14:07:02 crc kubenswrapper[4743]: I0122 14:07:02.960453 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"84f089c9-1a96-40ce-879d-4220b824f089","Type":"ContainerDied","Data":"ad782ca6661d1f3047136b10add928497dd1fb6b287c05f714de1054f62f63e0"} Jan 22 14:07:02 crc kubenswrapper[4743]: I0122 14:07:02.960522 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad782ca6661d1f3047136b10add928497dd1fb6b287c05f714de1054f62f63e0" Jan 22 14:07:02 crc kubenswrapper[4743]: I0122 14:07:02.960648 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 22 14:07:03 crc kubenswrapper[4743]: I0122 14:07:03.151527 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgddd\" (UniqueName: \"kubernetes.io/projected/84f089c9-1a96-40ce-879d-4220b824f089-kube-api-access-lgddd\") pod \"84f089c9-1a96-40ce-879d-4220b824f089\" (UID: \"84f089c9-1a96-40ce-879d-4220b824f089\") " Jan 22 14:07:03 crc kubenswrapper[4743]: I0122 14:07:03.151589 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84f089c9-1a96-40ce-879d-4220b824f089-config-data\") pod \"84f089c9-1a96-40ce-879d-4220b824f089\" (UID: \"84f089c9-1a96-40ce-879d-4220b824f089\") " Jan 22 14:07:03 crc kubenswrapper[4743]: I0122 14:07:03.151673 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84f089c9-1a96-40ce-879d-4220b824f089-combined-ca-bundle\") pod \"84f089c9-1a96-40ce-879d-4220b824f089\" (UID: \"84f089c9-1a96-40ce-879d-4220b824f089\") " Jan 22 14:07:03 crc kubenswrapper[4743]: I0122 14:07:03.156548 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84f089c9-1a96-40ce-879d-4220b824f089-kube-api-access-lgddd" (OuterVolumeSpecName: "kube-api-access-lgddd") pod "84f089c9-1a96-40ce-879d-4220b824f089" (UID: "84f089c9-1a96-40ce-879d-4220b824f089"). InnerVolumeSpecName "kube-api-access-lgddd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:07:03 crc kubenswrapper[4743]: I0122 14:07:03.177624 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84f089c9-1a96-40ce-879d-4220b824f089-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "84f089c9-1a96-40ce-879d-4220b824f089" (UID: "84f089c9-1a96-40ce-879d-4220b824f089"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:07:03 crc kubenswrapper[4743]: I0122 14:07:03.183886 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84f089c9-1a96-40ce-879d-4220b824f089-config-data" (OuterVolumeSpecName: "config-data") pod "84f089c9-1a96-40ce-879d-4220b824f089" (UID: "84f089c9-1a96-40ce-879d-4220b824f089"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:07:03 crc kubenswrapper[4743]: I0122 14:07:03.254458 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lgddd\" (UniqueName: \"kubernetes.io/projected/84f089c9-1a96-40ce-879d-4220b824f089-kube-api-access-lgddd\") on node \"crc\" DevicePath \"\"" Jan 22 14:07:03 crc kubenswrapper[4743]: I0122 14:07:03.254499 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84f089c9-1a96-40ce-879d-4220b824f089-config-data\") on node \"crc\" DevicePath \"\"" Jan 22 14:07:03 crc kubenswrapper[4743]: I0122 14:07:03.254510 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84f089c9-1a96-40ce-879d-4220b824f089-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 14:07:03 crc kubenswrapper[4743]: I0122 14:07:03.970517 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 22 14:07:03 crc kubenswrapper[4743]: I0122 14:07:03.996011 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 22 14:07:04 crc kubenswrapper[4743]: I0122 14:07:04.004724 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 22 14:07:04 crc kubenswrapper[4743]: I0122 14:07:04.012574 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 22 14:07:04 crc kubenswrapper[4743]: E0122 14:07:04.012947 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84f089c9-1a96-40ce-879d-4220b824f089" containerName="nova-cell1-novncproxy-novncproxy" Jan 22 14:07:04 crc kubenswrapper[4743]: I0122 14:07:04.012966 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="84f089c9-1a96-40ce-879d-4220b824f089" containerName="nova-cell1-novncproxy-novncproxy" Jan 22 14:07:04 crc kubenswrapper[4743]: I0122 14:07:04.013200 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="84f089c9-1a96-40ce-879d-4220b824f089" containerName="nova-cell1-novncproxy-novncproxy" Jan 22 14:07:04 crc kubenswrapper[4743]: I0122 14:07:04.013845 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 22 14:07:04 crc kubenswrapper[4743]: I0122 14:07:04.019276 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Jan 22 14:07:04 crc kubenswrapper[4743]: I0122 14:07:04.019444 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Jan 22 14:07:04 crc kubenswrapper[4743]: I0122 14:07:04.019493 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Jan 22 14:07:04 crc kubenswrapper[4743]: I0122 14:07:04.028386 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 22 14:07:04 crc kubenswrapper[4743]: I0122 14:07:04.084185 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41d94ed2-d3ad-4fbc-9ece-a7fb65eab7ac-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"41d94ed2-d3ad-4fbc-9ece-a7fb65eab7ac\") " pod="openstack/nova-cell1-novncproxy-0" Jan 22 14:07:04 crc kubenswrapper[4743]: I0122 14:07:04.084244 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/41d94ed2-d3ad-4fbc-9ece-a7fb65eab7ac-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"41d94ed2-d3ad-4fbc-9ece-a7fb65eab7ac\") " pod="openstack/nova-cell1-novncproxy-0" Jan 22 14:07:04 crc kubenswrapper[4743]: I0122 14:07:04.084326 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/41d94ed2-d3ad-4fbc-9ece-a7fb65eab7ac-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"41d94ed2-d3ad-4fbc-9ece-a7fb65eab7ac\") " pod="openstack/nova-cell1-novncproxy-0" Jan 22 14:07:04 crc kubenswrapper[4743]: I0122 14:07:04.084403 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcsnp\" (UniqueName: 
\"kubernetes.io/projected/41d94ed2-d3ad-4fbc-9ece-a7fb65eab7ac-kube-api-access-rcsnp\") pod \"nova-cell1-novncproxy-0\" (UID: \"41d94ed2-d3ad-4fbc-9ece-a7fb65eab7ac\") " pod="openstack/nova-cell1-novncproxy-0" Jan 22 14:07:04 crc kubenswrapper[4743]: I0122 14:07:04.084450 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41d94ed2-d3ad-4fbc-9ece-a7fb65eab7ac-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"41d94ed2-d3ad-4fbc-9ece-a7fb65eab7ac\") " pod="openstack/nova-cell1-novncproxy-0" Jan 22 14:07:04 crc kubenswrapper[4743]: I0122 14:07:04.186288 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41d94ed2-d3ad-4fbc-9ece-a7fb65eab7ac-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"41d94ed2-d3ad-4fbc-9ece-a7fb65eab7ac\") " pod="openstack/nova-cell1-novncproxy-0" Jan 22 14:07:04 crc kubenswrapper[4743]: I0122 14:07:04.186603 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/41d94ed2-d3ad-4fbc-9ece-a7fb65eab7ac-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"41d94ed2-d3ad-4fbc-9ece-a7fb65eab7ac\") " pod="openstack/nova-cell1-novncproxy-0" Jan 22 14:07:04 crc kubenswrapper[4743]: I0122 14:07:04.186735 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/41d94ed2-d3ad-4fbc-9ece-a7fb65eab7ac-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"41d94ed2-d3ad-4fbc-9ece-a7fb65eab7ac\") " pod="openstack/nova-cell1-novncproxy-0" Jan 22 14:07:04 crc kubenswrapper[4743]: I0122 14:07:04.186876 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcsnp\" (UniqueName: \"kubernetes.io/projected/41d94ed2-d3ad-4fbc-9ece-a7fb65eab7ac-kube-api-access-rcsnp\") pod \"nova-cell1-novncproxy-0\" (UID: \"41d94ed2-d3ad-4fbc-9ece-a7fb65eab7ac\") " pod="openstack/nova-cell1-novncproxy-0" Jan 22 14:07:04 crc kubenswrapper[4743]: I0122 14:07:04.187007 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41d94ed2-d3ad-4fbc-9ece-a7fb65eab7ac-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"41d94ed2-d3ad-4fbc-9ece-a7fb65eab7ac\") " pod="openstack/nova-cell1-novncproxy-0" Jan 22 14:07:04 crc kubenswrapper[4743]: I0122 14:07:04.190470 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/41d94ed2-d3ad-4fbc-9ece-a7fb65eab7ac-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"41d94ed2-d3ad-4fbc-9ece-a7fb65eab7ac\") " pod="openstack/nova-cell1-novncproxy-0" Jan 22 14:07:04 crc kubenswrapper[4743]: I0122 14:07:04.191418 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/41d94ed2-d3ad-4fbc-9ece-a7fb65eab7ac-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"41d94ed2-d3ad-4fbc-9ece-a7fb65eab7ac\") " pod="openstack/nova-cell1-novncproxy-0" Jan 22 14:07:04 crc kubenswrapper[4743]: I0122 14:07:04.191451 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41d94ed2-d3ad-4fbc-9ece-a7fb65eab7ac-config-data\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"41d94ed2-d3ad-4fbc-9ece-a7fb65eab7ac\") " pod="openstack/nova-cell1-novncproxy-0" Jan 22 14:07:04 crc kubenswrapper[4743]: I0122 14:07:04.195783 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41d94ed2-d3ad-4fbc-9ece-a7fb65eab7ac-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"41d94ed2-d3ad-4fbc-9ece-a7fb65eab7ac\") " pod="openstack/nova-cell1-novncproxy-0" Jan 22 14:07:04 crc kubenswrapper[4743]: I0122 14:07:04.206081 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 22 14:07:04 crc kubenswrapper[4743]: I0122 14:07:04.206748 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 22 14:07:04 crc kubenswrapper[4743]: I0122 14:07:04.208016 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 22 14:07:04 crc kubenswrapper[4743]: I0122 14:07:04.214552 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 22 14:07:04 crc kubenswrapper[4743]: I0122 14:07:04.219175 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcsnp\" (UniqueName: \"kubernetes.io/projected/41d94ed2-d3ad-4fbc-9ece-a7fb65eab7ac-kube-api-access-rcsnp\") pod \"nova-cell1-novncproxy-0\" (UID: \"41d94ed2-d3ad-4fbc-9ece-a7fb65eab7ac\") " pod="openstack/nova-cell1-novncproxy-0" Jan 22 14:07:04 crc kubenswrapper[4743]: I0122 14:07:04.348645 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 22 14:07:04 crc kubenswrapper[4743]: W0122 14:07:04.861850 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41d94ed2_d3ad_4fbc_9ece_a7fb65eab7ac.slice/crio-669f375cb76c6db58d0eb62374fb71709d7f356ac0706434d62bb47eab257b2e WatchSource:0}: Error finding container 669f375cb76c6db58d0eb62374fb71709d7f356ac0706434d62bb47eab257b2e: Status 404 returned error can't find the container with id 669f375cb76c6db58d0eb62374fb71709d7f356ac0706434d62bb47eab257b2e Jan 22 14:07:04 crc kubenswrapper[4743]: I0122 14:07:04.867395 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 22 14:07:04 crc kubenswrapper[4743]: I0122 14:07:04.979566 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"41d94ed2-d3ad-4fbc-9ece-a7fb65eab7ac","Type":"ContainerStarted","Data":"669f375cb76c6db58d0eb62374fb71709d7f356ac0706434d62bb47eab257b2e"} Jan 22 14:07:04 crc kubenswrapper[4743]: I0122 14:07:04.980519 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 22 14:07:04 crc kubenswrapper[4743]: I0122 14:07:04.990161 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 22 14:07:05 crc kubenswrapper[4743]: I0122 14:07:05.156422 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-k6jlz"] Jan 22 14:07:05 crc kubenswrapper[4743]: I0122 14:07:05.158257 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-k6jlz" Jan 22 14:07:05 crc kubenswrapper[4743]: I0122 14:07:05.171808 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-k6jlz"] Jan 22 14:07:05 crc kubenswrapper[4743]: I0122 14:07:05.310226 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e5de46ef-12da-4f4c-b19a-4b713069a048-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-k6jlz\" (UID: \"e5de46ef-12da-4f4c-b19a-4b713069a048\") " pod="openstack/dnsmasq-dns-89c5cd4d5-k6jlz" Jan 22 14:07:05 crc kubenswrapper[4743]: I0122 14:07:05.310318 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxgzs\" (UniqueName: \"kubernetes.io/projected/e5de46ef-12da-4f4c-b19a-4b713069a048-kube-api-access-zxgzs\") pod \"dnsmasq-dns-89c5cd4d5-k6jlz\" (UID: \"e5de46ef-12da-4f4c-b19a-4b713069a048\") " pod="openstack/dnsmasq-dns-89c5cd4d5-k6jlz" Jan 22 14:07:05 crc kubenswrapper[4743]: I0122 14:07:05.310389 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5de46ef-12da-4f4c-b19a-4b713069a048-config\") pod \"dnsmasq-dns-89c5cd4d5-k6jlz\" (UID: \"e5de46ef-12da-4f4c-b19a-4b713069a048\") " pod="openstack/dnsmasq-dns-89c5cd4d5-k6jlz" Jan 22 14:07:05 crc kubenswrapper[4743]: I0122 14:07:05.310500 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e5de46ef-12da-4f4c-b19a-4b713069a048-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-k6jlz\" (UID: \"e5de46ef-12da-4f4c-b19a-4b713069a048\") " pod="openstack/dnsmasq-dns-89c5cd4d5-k6jlz" Jan 22 14:07:05 crc kubenswrapper[4743]: I0122 14:07:05.310549 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5de46ef-12da-4f4c-b19a-4b713069a048-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-k6jlz\" (UID: \"e5de46ef-12da-4f4c-b19a-4b713069a048\") " pod="openstack/dnsmasq-dns-89c5cd4d5-k6jlz" Jan 22 14:07:05 crc kubenswrapper[4743]: I0122 14:07:05.310583 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e5de46ef-12da-4f4c-b19a-4b713069a048-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-k6jlz\" (UID: \"e5de46ef-12da-4f4c-b19a-4b713069a048\") " pod="openstack/dnsmasq-dns-89c5cd4d5-k6jlz" Jan 22 14:07:05 crc kubenswrapper[4743]: I0122 14:07:05.412535 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e5de46ef-12da-4f4c-b19a-4b713069a048-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-k6jlz\" (UID: \"e5de46ef-12da-4f4c-b19a-4b713069a048\") " pod="openstack/dnsmasq-dns-89c5cd4d5-k6jlz" Jan 22 14:07:05 crc kubenswrapper[4743]: I0122 14:07:05.412596 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxgzs\" (UniqueName: \"kubernetes.io/projected/e5de46ef-12da-4f4c-b19a-4b713069a048-kube-api-access-zxgzs\") pod \"dnsmasq-dns-89c5cd4d5-k6jlz\" (UID: \"e5de46ef-12da-4f4c-b19a-4b713069a048\") " pod="openstack/dnsmasq-dns-89c5cd4d5-k6jlz" Jan 22 14:07:05 crc kubenswrapper[4743]: I0122 14:07:05.412631 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5de46ef-12da-4f4c-b19a-4b713069a048-config\") pod \"dnsmasq-dns-89c5cd4d5-k6jlz\" (UID: \"e5de46ef-12da-4f4c-b19a-4b713069a048\") " pod="openstack/dnsmasq-dns-89c5cd4d5-k6jlz" Jan 22 14:07:05 crc kubenswrapper[4743]: I0122 14:07:05.412697 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e5de46ef-12da-4f4c-b19a-4b713069a048-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-k6jlz\" (UID: \"e5de46ef-12da-4f4c-b19a-4b713069a048\") " pod="openstack/dnsmasq-dns-89c5cd4d5-k6jlz" Jan 22 14:07:05 crc kubenswrapper[4743]: I0122 14:07:05.412719 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5de46ef-12da-4f4c-b19a-4b713069a048-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-k6jlz\" (UID: \"e5de46ef-12da-4f4c-b19a-4b713069a048\") " pod="openstack/dnsmasq-dns-89c5cd4d5-k6jlz" Jan 22 14:07:05 crc kubenswrapper[4743]: I0122 14:07:05.412742 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e5de46ef-12da-4f4c-b19a-4b713069a048-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-k6jlz\" (UID: \"e5de46ef-12da-4f4c-b19a-4b713069a048\") " pod="openstack/dnsmasq-dns-89c5cd4d5-k6jlz" Jan 22 14:07:05 crc kubenswrapper[4743]: I0122 14:07:05.413698 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e5de46ef-12da-4f4c-b19a-4b713069a048-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-k6jlz\" (UID: \"e5de46ef-12da-4f4c-b19a-4b713069a048\") " pod="openstack/dnsmasq-dns-89c5cd4d5-k6jlz" Jan 22 14:07:05 crc kubenswrapper[4743]: I0122 14:07:05.413717 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e5de46ef-12da-4f4c-b19a-4b713069a048-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-k6jlz\" (UID: \"e5de46ef-12da-4f4c-b19a-4b713069a048\") " pod="openstack/dnsmasq-dns-89c5cd4d5-k6jlz" Jan 22 14:07:05 crc kubenswrapper[4743]: I0122 14:07:05.413722 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e5de46ef-12da-4f4c-b19a-4b713069a048-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-k6jlz\" (UID: \"e5de46ef-12da-4f4c-b19a-4b713069a048\") " pod="openstack/dnsmasq-dns-89c5cd4d5-k6jlz" Jan 22 14:07:05 crc kubenswrapper[4743]: I0122 14:07:05.413937 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5de46ef-12da-4f4c-b19a-4b713069a048-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-k6jlz\" (UID: \"e5de46ef-12da-4f4c-b19a-4b713069a048\") " pod="openstack/dnsmasq-dns-89c5cd4d5-k6jlz" Jan 22 14:07:05 crc kubenswrapper[4743]: I0122 14:07:05.413960 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5de46ef-12da-4f4c-b19a-4b713069a048-config\") pod \"dnsmasq-dns-89c5cd4d5-k6jlz\" (UID: \"e5de46ef-12da-4f4c-b19a-4b713069a048\") " pod="openstack/dnsmasq-dns-89c5cd4d5-k6jlz" Jan 22 14:07:05 crc kubenswrapper[4743]: I0122 14:07:05.433237 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxgzs\" (UniqueName: 
\"kubernetes.io/projected/e5de46ef-12da-4f4c-b19a-4b713069a048-kube-api-access-zxgzs\") pod \"dnsmasq-dns-89c5cd4d5-k6jlz\" (UID: \"e5de46ef-12da-4f4c-b19a-4b713069a048\") " pod="openstack/dnsmasq-dns-89c5cd4d5-k6jlz" Jan 22 14:07:05 crc kubenswrapper[4743]: I0122 14:07:05.484188 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-k6jlz" Jan 22 14:07:05 crc kubenswrapper[4743]: I0122 14:07:05.760888 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84f089c9-1a96-40ce-879d-4220b824f089" path="/var/lib/kubelet/pods/84f089c9-1a96-40ce-879d-4220b824f089/volumes" Jan 22 14:07:05 crc kubenswrapper[4743]: I0122 14:07:05.954136 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-k6jlz"] Jan 22 14:07:05 crc kubenswrapper[4743]: I0122 14:07:05.993901 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"41d94ed2-d3ad-4fbc-9ece-a7fb65eab7ac","Type":"ContainerStarted","Data":"bb404e8f055b9330e5c7f53a5adb04a39b6eacf1a633a9d77234510556b6bb95"} Jan 22 14:07:06 crc kubenswrapper[4743]: I0122 14:07:06.000923 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-k6jlz" event={"ID":"e5de46ef-12da-4f4c-b19a-4b713069a048","Type":"ContainerStarted","Data":"073519c788e686dc087db85d4644ce670a7dfae3bb8fb4c12448d866cf367d79"} Jan 22 14:07:06 crc kubenswrapper[4743]: I0122 14:07:06.020416 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.020391648 podStartE2EDuration="3.020391648s" podCreationTimestamp="2026-01-22 14:07:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:07:06.010077598 +0000 UTC m=+1262.565120771" watchObservedRunningTime="2026-01-22 14:07:06.020391648 +0000 UTC m=+1262.575434811" Jan 22 14:07:07 crc kubenswrapper[4743]: I0122 14:07:07.010693 4743 generic.go:334] "Generic (PLEG): container finished" podID="e5de46ef-12da-4f4c-b19a-4b713069a048" containerID="8ddbd5a03184b45455d0741be7f5f1609cc82cd39f28e1a6197b659ab1dfe019" exitCode=0 Jan 22 14:07:07 crc kubenswrapper[4743]: I0122 14:07:07.013048 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-k6jlz" event={"ID":"e5de46ef-12da-4f4c-b19a-4b713069a048","Type":"ContainerDied","Data":"8ddbd5a03184b45455d0741be7f5f1609cc82cd39f28e1a6197b659ab1dfe019"} Jan 22 14:07:07 crc kubenswrapper[4743]: I0122 14:07:07.195350 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 22 14:07:07 crc kubenswrapper[4743]: I0122 14:07:07.195930 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ebc9c4ee-71ef-4cf1-ad65-847b5427901d" containerName="ceilometer-central-agent" containerID="cri-o://95d7bc50520d1fe3e00eb74b25917b20e1c3fb3219d62fedd3e7bbbca28cf032" gracePeriod=30 Jan 22 14:07:07 crc kubenswrapper[4743]: I0122 14:07:07.195963 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ebc9c4ee-71ef-4cf1-ad65-847b5427901d" containerName="proxy-httpd" containerID="cri-o://9408c25d8221d8c16c0a3766e0ae01905374b4a41e53563499e96256349ad2cd" gracePeriod=30 Jan 22 14:07:07 crc kubenswrapper[4743]: I0122 14:07:07.196015 4743 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="ebc9c4ee-71ef-4cf1-ad65-847b5427901d" containerName="sg-core" containerID="cri-o://69e6000846416c79e6041c0a9a0c8fecf462a8b4e9ae57443e45199d6cc25dae" gracePeriod=30 Jan 22 14:07:07 crc kubenswrapper[4743]: I0122 14:07:07.196086 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ebc9c4ee-71ef-4cf1-ad65-847b5427901d" containerName="ceilometer-notification-agent" containerID="cri-o://b3800fb0272e27c6acc669e64597a03e3ab738a113fc14a205eb2bfec57222e2" gracePeriod=30 Jan 22 14:07:07 crc kubenswrapper[4743]: I0122 14:07:07.202768 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="ebc9c4ee-71ef-4cf1-ad65-847b5427901d" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.201:3000/\": EOF" Jan 22 14:07:07 crc kubenswrapper[4743]: I0122 14:07:07.461358 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 22 14:07:08 crc kubenswrapper[4743]: I0122 14:07:08.022649 4743 generic.go:334] "Generic (PLEG): container finished" podID="ebc9c4ee-71ef-4cf1-ad65-847b5427901d" containerID="9408c25d8221d8c16c0a3766e0ae01905374b4a41e53563499e96256349ad2cd" exitCode=0 Jan 22 14:07:08 crc kubenswrapper[4743]: I0122 14:07:08.022689 4743 generic.go:334] "Generic (PLEG): container finished" podID="ebc9c4ee-71ef-4cf1-ad65-847b5427901d" containerID="69e6000846416c79e6041c0a9a0c8fecf462a8b4e9ae57443e45199d6cc25dae" exitCode=2 Jan 22 14:07:08 crc kubenswrapper[4743]: I0122 14:07:08.022700 4743 generic.go:334] "Generic (PLEG): container finished" podID="ebc9c4ee-71ef-4cf1-ad65-847b5427901d" containerID="95d7bc50520d1fe3e00eb74b25917b20e1c3fb3219d62fedd3e7bbbca28cf032" exitCode=0 Jan 22 14:07:08 crc kubenswrapper[4743]: I0122 14:07:08.022737 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ebc9c4ee-71ef-4cf1-ad65-847b5427901d","Type":"ContainerDied","Data":"9408c25d8221d8c16c0a3766e0ae01905374b4a41e53563499e96256349ad2cd"} Jan 22 14:07:08 crc kubenswrapper[4743]: I0122 14:07:08.022803 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ebc9c4ee-71ef-4cf1-ad65-847b5427901d","Type":"ContainerDied","Data":"69e6000846416c79e6041c0a9a0c8fecf462a8b4e9ae57443e45199d6cc25dae"} Jan 22 14:07:08 crc kubenswrapper[4743]: I0122 14:07:08.022817 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ebc9c4ee-71ef-4cf1-ad65-847b5427901d","Type":"ContainerDied","Data":"95d7bc50520d1fe3e00eb74b25917b20e1c3fb3219d62fedd3e7bbbca28cf032"} Jan 22 14:07:08 crc kubenswrapper[4743]: I0122 14:07:08.025735 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-k6jlz" event={"ID":"e5de46ef-12da-4f4c-b19a-4b713069a048","Type":"ContainerStarted","Data":"c2543fa63f31119243a81e984c51aa071546d2e5f3a03db67d15ec30a80eadc9"} Jan 22 14:07:08 crc kubenswrapper[4743]: I0122 14:07:08.025959 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="771a7aa1-9d00-4ad7-90be-b3cc3edd39ee" containerName="nova-api-log" containerID="cri-o://e2f8b1962d8baaa258eb44b4676018a927e840d36112c23c0673c4dfbdfdbe4f" gracePeriod=30 Jan 22 14:07:08 crc kubenswrapper[4743]: I0122 14:07:08.026053 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="771a7aa1-9d00-4ad7-90be-b3cc3edd39ee" containerName="nova-api-api" 
containerID="cri-o://27b04441ed1bf32084b8cad77b1bd366f72d49d7a8e31c49ca20d6f8ede82a79" gracePeriod=30 Jan 22 14:07:08 crc kubenswrapper[4743]: I0122 14:07:08.059162 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-89c5cd4d5-k6jlz" podStartSLOduration=3.059144845 podStartE2EDuration="3.059144845s" podCreationTimestamp="2026-01-22 14:07:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:07:08.051757714 +0000 UTC m=+1264.606800877" watchObservedRunningTime="2026-01-22 14:07:08.059144845 +0000 UTC m=+1264.614188008" Jan 22 14:07:09 crc kubenswrapper[4743]: I0122 14:07:09.036134 4743 generic.go:334] "Generic (PLEG): container finished" podID="771a7aa1-9d00-4ad7-90be-b3cc3edd39ee" containerID="e2f8b1962d8baaa258eb44b4676018a927e840d36112c23c0673c4dfbdfdbe4f" exitCode=143 Jan 22 14:07:09 crc kubenswrapper[4743]: I0122 14:07:09.036200 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"771a7aa1-9d00-4ad7-90be-b3cc3edd39ee","Type":"ContainerDied","Data":"e2f8b1962d8baaa258eb44b4676018a927e840d36112c23c0673c4dfbdfdbe4f"} Jan 22 14:07:09 crc kubenswrapper[4743]: I0122 14:07:09.036758 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-89c5cd4d5-k6jlz" Jan 22 14:07:09 crc kubenswrapper[4743]: I0122 14:07:09.349480 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Jan 22 14:07:10 crc kubenswrapper[4743]: E0122 14:07:10.341063 4743 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84f089c9_1a96_40ce_879d_4220b824f089.slice\": RecentStats: unable to find data in memory cache]" Jan 22 14:07:11 crc kubenswrapper[4743]: I0122 14:07:11.625359 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 22 14:07:11 crc kubenswrapper[4743]: I0122 14:07:11.791277 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/771a7aa1-9d00-4ad7-90be-b3cc3edd39ee-logs\") pod \"771a7aa1-9d00-4ad7-90be-b3cc3edd39ee\" (UID: \"771a7aa1-9d00-4ad7-90be-b3cc3edd39ee\") " Jan 22 14:07:11 crc kubenswrapper[4743]: I0122 14:07:11.791365 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/771a7aa1-9d00-4ad7-90be-b3cc3edd39ee-config-data\") pod \"771a7aa1-9d00-4ad7-90be-b3cc3edd39ee\" (UID: \"771a7aa1-9d00-4ad7-90be-b3cc3edd39ee\") " Jan 22 14:07:11 crc kubenswrapper[4743]: I0122 14:07:11.791391 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/771a7aa1-9d00-4ad7-90be-b3cc3edd39ee-combined-ca-bundle\") pod \"771a7aa1-9d00-4ad7-90be-b3cc3edd39ee\" (UID: \"771a7aa1-9d00-4ad7-90be-b3cc3edd39ee\") " Jan 22 14:07:11 crc kubenswrapper[4743]: I0122 14:07:11.791539 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6swb\" (UniqueName: \"kubernetes.io/projected/771a7aa1-9d00-4ad7-90be-b3cc3edd39ee-kube-api-access-h6swb\") pod \"771a7aa1-9d00-4ad7-90be-b3cc3edd39ee\" (UID: \"771a7aa1-9d00-4ad7-90be-b3cc3edd39ee\") " Jan 22 14:07:11 crc kubenswrapper[4743]: I0122 14:07:11.791956 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/771a7aa1-9d00-4ad7-90be-b3cc3edd39ee-logs" (OuterVolumeSpecName: "logs") pod "771a7aa1-9d00-4ad7-90be-b3cc3edd39ee" (UID: "771a7aa1-9d00-4ad7-90be-b3cc3edd39ee"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:07:11 crc kubenswrapper[4743]: I0122 14:07:11.813423 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/771a7aa1-9d00-4ad7-90be-b3cc3edd39ee-kube-api-access-h6swb" (OuterVolumeSpecName: "kube-api-access-h6swb") pod "771a7aa1-9d00-4ad7-90be-b3cc3edd39ee" (UID: "771a7aa1-9d00-4ad7-90be-b3cc3edd39ee"). InnerVolumeSpecName "kube-api-access-h6swb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:07:11 crc kubenswrapper[4743]: I0122 14:07:11.821495 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/771a7aa1-9d00-4ad7-90be-b3cc3edd39ee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "771a7aa1-9d00-4ad7-90be-b3cc3edd39ee" (UID: "771a7aa1-9d00-4ad7-90be-b3cc3edd39ee"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:07:11 crc kubenswrapper[4743]: I0122 14:07:11.823006 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/771a7aa1-9d00-4ad7-90be-b3cc3edd39ee-config-data" (OuterVolumeSpecName: "config-data") pod "771a7aa1-9d00-4ad7-90be-b3cc3edd39ee" (UID: "771a7aa1-9d00-4ad7-90be-b3cc3edd39ee"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:07:11 crc kubenswrapper[4743]: I0122 14:07:11.893841 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6swb\" (UniqueName: \"kubernetes.io/projected/771a7aa1-9d00-4ad7-90be-b3cc3edd39ee-kube-api-access-h6swb\") on node \"crc\" DevicePath \"\"" Jan 22 14:07:11 crc kubenswrapper[4743]: I0122 14:07:11.893881 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/771a7aa1-9d00-4ad7-90be-b3cc3edd39ee-logs\") on node \"crc\" DevicePath \"\"" Jan 22 14:07:11 crc kubenswrapper[4743]: I0122 14:07:11.893893 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/771a7aa1-9d00-4ad7-90be-b3cc3edd39ee-config-data\") on node \"crc\" DevicePath \"\"" Jan 22 14:07:11 crc kubenswrapper[4743]: I0122 14:07:11.893905 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/771a7aa1-9d00-4ad7-90be-b3cc3edd39ee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 14:07:12 crc kubenswrapper[4743]: I0122 14:07:12.074535 4743 generic.go:334] "Generic (PLEG): container finished" podID="771a7aa1-9d00-4ad7-90be-b3cc3edd39ee" containerID="27b04441ed1bf32084b8cad77b1bd366f72d49d7a8e31c49ca20d6f8ede82a79" exitCode=0 Jan 22 14:07:12 crc kubenswrapper[4743]: I0122 14:07:12.074576 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"771a7aa1-9d00-4ad7-90be-b3cc3edd39ee","Type":"ContainerDied","Data":"27b04441ed1bf32084b8cad77b1bd366f72d49d7a8e31c49ca20d6f8ede82a79"} Jan 22 14:07:12 crc kubenswrapper[4743]: I0122 14:07:12.074603 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"771a7aa1-9d00-4ad7-90be-b3cc3edd39ee","Type":"ContainerDied","Data":"c4b4d1b8981e6067a8b97a7e801a16d149445601307e672e1cd4cd24424ebe1e"} Jan 22 14:07:12 crc kubenswrapper[4743]: I0122 14:07:12.074622 4743 scope.go:117] "RemoveContainer" containerID="27b04441ed1bf32084b8cad77b1bd366f72d49d7a8e31c49ca20d6f8ede82a79" Jan 22 14:07:12 crc kubenswrapper[4743]: I0122 14:07:12.074762 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 22 14:07:12 crc kubenswrapper[4743]: I0122 14:07:12.097727 4743 scope.go:117] "RemoveContainer" containerID="e2f8b1962d8baaa258eb44b4676018a927e840d36112c23c0673c4dfbdfdbe4f" Jan 22 14:07:12 crc kubenswrapper[4743]: I0122 14:07:12.107987 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 22 14:07:12 crc kubenswrapper[4743]: I0122 14:07:12.125778 4743 scope.go:117] "RemoveContainer" containerID="27b04441ed1bf32084b8cad77b1bd366f72d49d7a8e31c49ca20d6f8ede82a79" Jan 22 14:07:12 crc kubenswrapper[4743]: E0122 14:07:12.126303 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27b04441ed1bf32084b8cad77b1bd366f72d49d7a8e31c49ca20d6f8ede82a79\": container with ID starting with 27b04441ed1bf32084b8cad77b1bd366f72d49d7a8e31c49ca20d6f8ede82a79 not found: ID does not exist" containerID="27b04441ed1bf32084b8cad77b1bd366f72d49d7a8e31c49ca20d6f8ede82a79" Jan 22 14:07:12 crc kubenswrapper[4743]: I0122 14:07:12.126335 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27b04441ed1bf32084b8cad77b1bd366f72d49d7a8e31c49ca20d6f8ede82a79"} err="failed to get container status \"27b04441ed1bf32084b8cad77b1bd366f72d49d7a8e31c49ca20d6f8ede82a79\": rpc error: code = NotFound desc = could not find container \"27b04441ed1bf32084b8cad77b1bd366f72d49d7a8e31c49ca20d6f8ede82a79\": container with ID starting with 27b04441ed1bf32084b8cad77b1bd366f72d49d7a8e31c49ca20d6f8ede82a79 not found: ID does not exist" Jan 22 14:07:12 crc kubenswrapper[4743]: I0122 14:07:12.126354 4743 scope.go:117] "RemoveContainer" containerID="e2f8b1962d8baaa258eb44b4676018a927e840d36112c23c0673c4dfbdfdbe4f" Jan 22 14:07:12 crc kubenswrapper[4743]: E0122 14:07:12.126613 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2f8b1962d8baaa258eb44b4676018a927e840d36112c23c0673c4dfbdfdbe4f\": container with ID starting with e2f8b1962d8baaa258eb44b4676018a927e840d36112c23c0673c4dfbdfdbe4f not found: ID does not exist" containerID="e2f8b1962d8baaa258eb44b4676018a927e840d36112c23c0673c4dfbdfdbe4f" Jan 22 14:07:12 crc kubenswrapper[4743]: I0122 14:07:12.126636 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2f8b1962d8baaa258eb44b4676018a927e840d36112c23c0673c4dfbdfdbe4f"} err="failed to get container status \"e2f8b1962d8baaa258eb44b4676018a927e840d36112c23c0673c4dfbdfdbe4f\": rpc error: code = NotFound desc = could not find container \"e2f8b1962d8baaa258eb44b4676018a927e840d36112c23c0673c4dfbdfdbe4f\": container with ID starting with e2f8b1962d8baaa258eb44b4676018a927e840d36112c23c0673c4dfbdfdbe4f not found: ID does not exist" Jan 22 14:07:12 crc kubenswrapper[4743]: I0122 14:07:12.127699 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 22 14:07:12 crc kubenswrapper[4743]: I0122 14:07:12.140474 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 22 14:07:12 crc kubenswrapper[4743]: E0122 14:07:12.141010 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="771a7aa1-9d00-4ad7-90be-b3cc3edd39ee" containerName="nova-api-log" Jan 22 14:07:12 crc kubenswrapper[4743]: I0122 14:07:12.141036 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="771a7aa1-9d00-4ad7-90be-b3cc3edd39ee" containerName="nova-api-log" Jan 22 14:07:12 crc 
kubenswrapper[4743]: E0122 14:07:12.141074 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="771a7aa1-9d00-4ad7-90be-b3cc3edd39ee" containerName="nova-api-api" Jan 22 14:07:12 crc kubenswrapper[4743]: I0122 14:07:12.141083 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="771a7aa1-9d00-4ad7-90be-b3cc3edd39ee" containerName="nova-api-api" Jan 22 14:07:12 crc kubenswrapper[4743]: I0122 14:07:12.141294 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="771a7aa1-9d00-4ad7-90be-b3cc3edd39ee" containerName="nova-api-api" Jan 22 14:07:12 crc kubenswrapper[4743]: I0122 14:07:12.141333 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="771a7aa1-9d00-4ad7-90be-b3cc3edd39ee" containerName="nova-api-log" Jan 22 14:07:12 crc kubenswrapper[4743]: I0122 14:07:12.142575 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 22 14:07:12 crc kubenswrapper[4743]: I0122 14:07:12.148669 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Jan 22 14:07:12 crc kubenswrapper[4743]: I0122 14:07:12.150650 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 22 14:07:12 crc kubenswrapper[4743]: I0122 14:07:12.152372 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Jan 22 14:07:12 crc kubenswrapper[4743]: I0122 14:07:12.154298 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 22 14:07:12 crc kubenswrapper[4743]: I0122 14:07:12.300697 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b19b071-6f53-4563-b680-ead42caf7b3b-public-tls-certs\") pod \"nova-api-0\" (UID: \"5b19b071-6f53-4563-b680-ead42caf7b3b\") " pod="openstack/nova-api-0" Jan 22 14:07:12 crc kubenswrapper[4743]: I0122 14:07:12.300812 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b19b071-6f53-4563-b680-ead42caf7b3b-logs\") pod \"nova-api-0\" (UID: \"5b19b071-6f53-4563-b680-ead42caf7b3b\") " pod="openstack/nova-api-0" Jan 22 14:07:12 crc kubenswrapper[4743]: I0122 14:07:12.300939 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b19b071-6f53-4563-b680-ead42caf7b3b-config-data\") pod \"nova-api-0\" (UID: \"5b19b071-6f53-4563-b680-ead42caf7b3b\") " pod="openstack/nova-api-0" Jan 22 14:07:12 crc kubenswrapper[4743]: I0122 14:07:12.300989 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b19b071-6f53-4563-b680-ead42caf7b3b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"5b19b071-6f53-4563-b680-ead42caf7b3b\") " pod="openstack/nova-api-0" Jan 22 14:07:12 crc kubenswrapper[4743]: I0122 14:07:12.301056 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b19b071-6f53-4563-b680-ead42caf7b3b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5b19b071-6f53-4563-b680-ead42caf7b3b\") " pod="openstack/nova-api-0" Jan 22 14:07:12 crc kubenswrapper[4743]: I0122 14:07:12.301091 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hx9th\" (UniqueName: \"kubernetes.io/projected/5b19b071-6f53-4563-b680-ead42caf7b3b-kube-api-access-hx9th\") pod \"nova-api-0\" (UID: \"5b19b071-6f53-4563-b680-ead42caf7b3b\") " pod="openstack/nova-api-0" Jan 22 14:07:12 crc kubenswrapper[4743]: I0122 14:07:12.403100 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b19b071-6f53-4563-b680-ead42caf7b3b-logs\") pod \"nova-api-0\" (UID: \"5b19b071-6f53-4563-b680-ead42caf7b3b\") " pod="openstack/nova-api-0" Jan 22 14:07:12 crc kubenswrapper[4743]: I0122 14:07:12.403585 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b19b071-6f53-4563-b680-ead42caf7b3b-config-data\") pod \"nova-api-0\" (UID: \"5b19b071-6f53-4563-b680-ead42caf7b3b\") " pod="openstack/nova-api-0" Jan 22 14:07:12 crc kubenswrapper[4743]: I0122 14:07:12.403574 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b19b071-6f53-4563-b680-ead42caf7b3b-logs\") pod \"nova-api-0\" (UID: \"5b19b071-6f53-4563-b680-ead42caf7b3b\") " pod="openstack/nova-api-0" Jan 22 14:07:12 crc kubenswrapper[4743]: I0122 14:07:12.403629 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b19b071-6f53-4563-b680-ead42caf7b3b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"5b19b071-6f53-4563-b680-ead42caf7b3b\") " pod="openstack/nova-api-0" Jan 22 14:07:12 crc kubenswrapper[4743]: I0122 14:07:12.403690 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b19b071-6f53-4563-b680-ead42caf7b3b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5b19b071-6f53-4563-b680-ead42caf7b3b\") " pod="openstack/nova-api-0" Jan 22 14:07:12 crc kubenswrapper[4743]: I0122 14:07:12.403752 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hx9th\" (UniqueName: \"kubernetes.io/projected/5b19b071-6f53-4563-b680-ead42caf7b3b-kube-api-access-hx9th\") pod \"nova-api-0\" (UID: \"5b19b071-6f53-4563-b680-ead42caf7b3b\") " pod="openstack/nova-api-0" Jan 22 14:07:12 crc kubenswrapper[4743]: I0122 14:07:12.403843 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b19b071-6f53-4563-b680-ead42caf7b3b-public-tls-certs\") pod \"nova-api-0\" (UID: \"5b19b071-6f53-4563-b680-ead42caf7b3b\") " pod="openstack/nova-api-0" Jan 22 14:07:12 crc kubenswrapper[4743]: I0122 14:07:12.409153 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b19b071-6f53-4563-b680-ead42caf7b3b-public-tls-certs\") pod \"nova-api-0\" (UID: \"5b19b071-6f53-4563-b680-ead42caf7b3b\") " pod="openstack/nova-api-0" Jan 22 14:07:12 crc kubenswrapper[4743]: I0122 14:07:12.409930 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b19b071-6f53-4563-b680-ead42caf7b3b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5b19b071-6f53-4563-b680-ead42caf7b3b\") " pod="openstack/nova-api-0" Jan 22 14:07:12 crc kubenswrapper[4743]: I0122 14:07:12.417475 4743 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b19b071-6f53-4563-b680-ead42caf7b3b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"5b19b071-6f53-4563-b680-ead42caf7b3b\") " pod="openstack/nova-api-0" Jan 22 14:07:12 crc kubenswrapper[4743]: I0122 14:07:12.419721 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b19b071-6f53-4563-b680-ead42caf7b3b-config-data\") pod \"nova-api-0\" (UID: \"5b19b071-6f53-4563-b680-ead42caf7b3b\") " pod="openstack/nova-api-0" Jan 22 14:07:12 crc kubenswrapper[4743]: I0122 14:07:12.423394 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hx9th\" (UniqueName: \"kubernetes.io/projected/5b19b071-6f53-4563-b680-ead42caf7b3b-kube-api-access-hx9th\") pod \"nova-api-0\" (UID: \"5b19b071-6f53-4563-b680-ead42caf7b3b\") " pod="openstack/nova-api-0" Jan 22 14:07:12 crc kubenswrapper[4743]: I0122 14:07:12.468013 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 22 14:07:12 crc kubenswrapper[4743]: I0122 14:07:12.651339 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 22 14:07:12 crc kubenswrapper[4743]: I0122 14:07:12.714836 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ebc9c4ee-71ef-4cf1-ad65-847b5427901d-log-httpd\") pod \"ebc9c4ee-71ef-4cf1-ad65-847b5427901d\" (UID: \"ebc9c4ee-71ef-4cf1-ad65-847b5427901d\") " Jan 22 14:07:12 crc kubenswrapper[4743]: I0122 14:07:12.714900 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ebc9c4ee-71ef-4cf1-ad65-847b5427901d-scripts\") pod \"ebc9c4ee-71ef-4cf1-ad65-847b5427901d\" (UID: \"ebc9c4ee-71ef-4cf1-ad65-847b5427901d\") " Jan 22 14:07:12 crc kubenswrapper[4743]: I0122 14:07:12.714927 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebc9c4ee-71ef-4cf1-ad65-847b5427901d-config-data\") pod \"ebc9c4ee-71ef-4cf1-ad65-847b5427901d\" (UID: \"ebc9c4ee-71ef-4cf1-ad65-847b5427901d\") " Jan 22 14:07:12 crc kubenswrapper[4743]: I0122 14:07:12.715013 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ebc9c4ee-71ef-4cf1-ad65-847b5427901d-sg-core-conf-yaml\") pod \"ebc9c4ee-71ef-4cf1-ad65-847b5427901d\" (UID: \"ebc9c4ee-71ef-4cf1-ad65-847b5427901d\") " Jan 22 14:07:12 crc kubenswrapper[4743]: I0122 14:07:12.715049 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxmjg\" (UniqueName: \"kubernetes.io/projected/ebc9c4ee-71ef-4cf1-ad65-847b5427901d-kube-api-access-gxmjg\") pod \"ebc9c4ee-71ef-4cf1-ad65-847b5427901d\" (UID: \"ebc9c4ee-71ef-4cf1-ad65-847b5427901d\") " Jan 22 14:07:12 crc kubenswrapper[4743]: I0122 14:07:12.715083 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebc9c4ee-71ef-4cf1-ad65-847b5427901d-ceilometer-tls-certs\") pod \"ebc9c4ee-71ef-4cf1-ad65-847b5427901d\" (UID: \"ebc9c4ee-71ef-4cf1-ad65-847b5427901d\") " Jan 22 14:07:12 crc kubenswrapper[4743]: I0122 14:07:12.715106 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/ebc9c4ee-71ef-4cf1-ad65-847b5427901d-combined-ca-bundle\") pod \"ebc9c4ee-71ef-4cf1-ad65-847b5427901d\" (UID: \"ebc9c4ee-71ef-4cf1-ad65-847b5427901d\") " Jan 22 14:07:12 crc kubenswrapper[4743]: I0122 14:07:12.715190 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ebc9c4ee-71ef-4cf1-ad65-847b5427901d-run-httpd\") pod \"ebc9c4ee-71ef-4cf1-ad65-847b5427901d\" (UID: \"ebc9c4ee-71ef-4cf1-ad65-847b5427901d\") " Jan 22 14:07:12 crc kubenswrapper[4743]: I0122 14:07:12.715862 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebc9c4ee-71ef-4cf1-ad65-847b5427901d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ebc9c4ee-71ef-4cf1-ad65-847b5427901d" (UID: "ebc9c4ee-71ef-4cf1-ad65-847b5427901d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:07:12 crc kubenswrapper[4743]: I0122 14:07:12.716296 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebc9c4ee-71ef-4cf1-ad65-847b5427901d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ebc9c4ee-71ef-4cf1-ad65-847b5427901d" (UID: "ebc9c4ee-71ef-4cf1-ad65-847b5427901d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:07:12 crc kubenswrapper[4743]: I0122 14:07:12.716363 4743 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ebc9c4ee-71ef-4cf1-ad65-847b5427901d-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 22 14:07:12 crc kubenswrapper[4743]: I0122 14:07:12.719519 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebc9c4ee-71ef-4cf1-ad65-847b5427901d-kube-api-access-gxmjg" (OuterVolumeSpecName: "kube-api-access-gxmjg") pod "ebc9c4ee-71ef-4cf1-ad65-847b5427901d" (UID: "ebc9c4ee-71ef-4cf1-ad65-847b5427901d"). InnerVolumeSpecName "kube-api-access-gxmjg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:07:12 crc kubenswrapper[4743]: I0122 14:07:12.720941 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebc9c4ee-71ef-4cf1-ad65-847b5427901d-scripts" (OuterVolumeSpecName: "scripts") pod "ebc9c4ee-71ef-4cf1-ad65-847b5427901d" (UID: "ebc9c4ee-71ef-4cf1-ad65-847b5427901d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:07:12 crc kubenswrapper[4743]: I0122 14:07:12.743615 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebc9c4ee-71ef-4cf1-ad65-847b5427901d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ebc9c4ee-71ef-4cf1-ad65-847b5427901d" (UID: "ebc9c4ee-71ef-4cf1-ad65-847b5427901d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:07:12 crc kubenswrapper[4743]: I0122 14:07:12.782183 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebc9c4ee-71ef-4cf1-ad65-847b5427901d-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "ebc9c4ee-71ef-4cf1-ad65-847b5427901d" (UID: "ebc9c4ee-71ef-4cf1-ad65-847b5427901d"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:07:12 crc kubenswrapper[4743]: I0122 14:07:12.792405 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebc9c4ee-71ef-4cf1-ad65-847b5427901d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ebc9c4ee-71ef-4cf1-ad65-847b5427901d" (UID: "ebc9c4ee-71ef-4cf1-ad65-847b5427901d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:07:12 crc kubenswrapper[4743]: I0122 14:07:12.813858 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebc9c4ee-71ef-4cf1-ad65-847b5427901d-config-data" (OuterVolumeSpecName: "config-data") pod "ebc9c4ee-71ef-4cf1-ad65-847b5427901d" (UID: "ebc9c4ee-71ef-4cf1-ad65-847b5427901d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:07:12 crc kubenswrapper[4743]: I0122 14:07:12.818196 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ebc9c4ee-71ef-4cf1-ad65-847b5427901d-scripts\") on node \"crc\" DevicePath \"\"" Jan 22 14:07:12 crc kubenswrapper[4743]: I0122 14:07:12.818231 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebc9c4ee-71ef-4cf1-ad65-847b5427901d-config-data\") on node \"crc\" DevicePath \"\"" Jan 22 14:07:12 crc kubenswrapper[4743]: I0122 14:07:12.818246 4743 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ebc9c4ee-71ef-4cf1-ad65-847b5427901d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 22 14:07:12 crc kubenswrapper[4743]: I0122 14:07:12.818261 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxmjg\" (UniqueName: \"kubernetes.io/projected/ebc9c4ee-71ef-4cf1-ad65-847b5427901d-kube-api-access-gxmjg\") on node \"crc\" DevicePath \"\"" Jan 22 14:07:12 crc kubenswrapper[4743]: I0122 14:07:12.818274 4743 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebc9c4ee-71ef-4cf1-ad65-847b5427901d-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 22 14:07:12 crc kubenswrapper[4743]: I0122 14:07:12.818286 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebc9c4ee-71ef-4cf1-ad65-847b5427901d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 14:07:12 crc kubenswrapper[4743]: I0122 14:07:12.818298 4743 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ebc9c4ee-71ef-4cf1-ad65-847b5427901d-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 22 14:07:12 crc kubenswrapper[4743]: I0122 14:07:12.928282 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 22 14:07:12 crc kubenswrapper[4743]: W0122 14:07:12.930504 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b19b071_6f53_4563_b680_ead42caf7b3b.slice/crio-24c17d6c029f76f1f4d8cc23ce29fc27169a2f21d79558b2e3a92d70ba77b90a WatchSource:0}: Error finding container 24c17d6c029f76f1f4d8cc23ce29fc27169a2f21d79558b2e3a92d70ba77b90a: Status 404 returned error can't find the container with id 24c17d6c029f76f1f4d8cc23ce29fc27169a2f21d79558b2e3a92d70ba77b90a Jan 22 14:07:13 crc kubenswrapper[4743]: I0122 14:07:13.089445 4743 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5b19b071-6f53-4563-b680-ead42caf7b3b","Type":"ContainerStarted","Data":"fae973dc542ff70efbb81373f2a5a891fd9ec30052f8b55785736520905ed83f"} Jan 22 14:07:13 crc kubenswrapper[4743]: I0122 14:07:13.089492 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5b19b071-6f53-4563-b680-ead42caf7b3b","Type":"ContainerStarted","Data":"24c17d6c029f76f1f4d8cc23ce29fc27169a2f21d79558b2e3a92d70ba77b90a"} Jan 22 14:07:13 crc kubenswrapper[4743]: I0122 14:07:13.092261 4743 generic.go:334] "Generic (PLEG): container finished" podID="ebc9c4ee-71ef-4cf1-ad65-847b5427901d" containerID="b3800fb0272e27c6acc669e64597a03e3ab738a113fc14a205eb2bfec57222e2" exitCode=0 Jan 22 14:07:13 crc kubenswrapper[4743]: I0122 14:07:13.092311 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ebc9c4ee-71ef-4cf1-ad65-847b5427901d","Type":"ContainerDied","Data":"b3800fb0272e27c6acc669e64597a03e3ab738a113fc14a205eb2bfec57222e2"} Jan 22 14:07:13 crc kubenswrapper[4743]: I0122 14:07:13.092331 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ebc9c4ee-71ef-4cf1-ad65-847b5427901d","Type":"ContainerDied","Data":"89caeccae2ac2e4f025165cd1360ccdc8aa2d1ecf626af576df836e51aa376d6"} Jan 22 14:07:13 crc kubenswrapper[4743]: I0122 14:07:13.092348 4743 scope.go:117] "RemoveContainer" containerID="9408c25d8221d8c16c0a3766e0ae01905374b4a41e53563499e96256349ad2cd" Jan 22 14:07:13 crc kubenswrapper[4743]: I0122 14:07:13.092352 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 22 14:07:13 crc kubenswrapper[4743]: I0122 14:07:13.129272 4743 scope.go:117] "RemoveContainer" containerID="69e6000846416c79e6041c0a9a0c8fecf462a8b4e9ae57443e45199d6cc25dae" Jan 22 14:07:13 crc kubenswrapper[4743]: I0122 14:07:13.135041 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 22 14:07:13 crc kubenswrapper[4743]: I0122 14:07:13.163504 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 22 14:07:13 crc kubenswrapper[4743]: I0122 14:07:13.185201 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 22 14:07:13 crc kubenswrapper[4743]: E0122 14:07:13.185825 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebc9c4ee-71ef-4cf1-ad65-847b5427901d" containerName="ceilometer-notification-agent" Jan 22 14:07:13 crc kubenswrapper[4743]: I0122 14:07:13.185856 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebc9c4ee-71ef-4cf1-ad65-847b5427901d" containerName="ceilometer-notification-agent" Jan 22 14:07:13 crc kubenswrapper[4743]: E0122 14:07:13.185888 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebc9c4ee-71ef-4cf1-ad65-847b5427901d" containerName="proxy-httpd" Jan 22 14:07:13 crc kubenswrapper[4743]: I0122 14:07:13.185901 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebc9c4ee-71ef-4cf1-ad65-847b5427901d" containerName="proxy-httpd" Jan 22 14:07:13 crc kubenswrapper[4743]: E0122 14:07:13.185914 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebc9c4ee-71ef-4cf1-ad65-847b5427901d" containerName="sg-core" Jan 22 14:07:13 crc kubenswrapper[4743]: I0122 14:07:13.185925 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebc9c4ee-71ef-4cf1-ad65-847b5427901d" containerName="sg-core" Jan 22 
14:07:13 crc kubenswrapper[4743]: E0122 14:07:13.185952 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebc9c4ee-71ef-4cf1-ad65-847b5427901d" containerName="ceilometer-central-agent" Jan 22 14:07:13 crc kubenswrapper[4743]: I0122 14:07:13.185964 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebc9c4ee-71ef-4cf1-ad65-847b5427901d" containerName="ceilometer-central-agent" Jan 22 14:07:13 crc kubenswrapper[4743]: I0122 14:07:13.186323 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebc9c4ee-71ef-4cf1-ad65-847b5427901d" containerName="sg-core" Jan 22 14:07:13 crc kubenswrapper[4743]: I0122 14:07:13.186354 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebc9c4ee-71ef-4cf1-ad65-847b5427901d" containerName="ceilometer-central-agent" Jan 22 14:07:13 crc kubenswrapper[4743]: I0122 14:07:13.186382 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebc9c4ee-71ef-4cf1-ad65-847b5427901d" containerName="ceilometer-notification-agent" Jan 22 14:07:13 crc kubenswrapper[4743]: I0122 14:07:13.186414 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebc9c4ee-71ef-4cf1-ad65-847b5427901d" containerName="proxy-httpd" Jan 22 14:07:13 crc kubenswrapper[4743]: I0122 14:07:13.189578 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 22 14:07:13 crc kubenswrapper[4743]: I0122 14:07:13.191932 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 22 14:07:13 crc kubenswrapper[4743]: I0122 14:07:13.192054 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 22 14:07:13 crc kubenswrapper[4743]: I0122 14:07:13.193014 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 22 14:07:13 crc kubenswrapper[4743]: I0122 14:07:13.193737 4743 scope.go:117] "RemoveContainer" containerID="b3800fb0272e27c6acc669e64597a03e3ab738a113fc14a205eb2bfec57222e2" Jan 22 14:07:13 crc kubenswrapper[4743]: I0122 14:07:13.197258 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 22 14:07:13 crc kubenswrapper[4743]: I0122 14:07:13.215948 4743 scope.go:117] "RemoveContainer" containerID="95d7bc50520d1fe3e00eb74b25917b20e1c3fb3219d62fedd3e7bbbca28cf032" Jan 22 14:07:13 crc kubenswrapper[4743]: I0122 14:07:13.237262 4743 scope.go:117] "RemoveContainer" containerID="9408c25d8221d8c16c0a3766e0ae01905374b4a41e53563499e96256349ad2cd" Jan 22 14:07:13 crc kubenswrapper[4743]: E0122 14:07:13.237712 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9408c25d8221d8c16c0a3766e0ae01905374b4a41e53563499e96256349ad2cd\": container with ID starting with 9408c25d8221d8c16c0a3766e0ae01905374b4a41e53563499e96256349ad2cd not found: ID does not exist" containerID="9408c25d8221d8c16c0a3766e0ae01905374b4a41e53563499e96256349ad2cd" Jan 22 14:07:13 crc kubenswrapper[4743]: I0122 14:07:13.237776 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9408c25d8221d8c16c0a3766e0ae01905374b4a41e53563499e96256349ad2cd"} err="failed to get container status \"9408c25d8221d8c16c0a3766e0ae01905374b4a41e53563499e96256349ad2cd\": rpc error: code = NotFound desc = could not find container \"9408c25d8221d8c16c0a3766e0ae01905374b4a41e53563499e96256349ad2cd\": container with ID starting with 
9408c25d8221d8c16c0a3766e0ae01905374b4a41e53563499e96256349ad2cd not found: ID does not exist" Jan 22 14:07:13 crc kubenswrapper[4743]: I0122 14:07:13.237819 4743 scope.go:117] "RemoveContainer" containerID="69e6000846416c79e6041c0a9a0c8fecf462a8b4e9ae57443e45199d6cc25dae" Jan 22 14:07:13 crc kubenswrapper[4743]: E0122 14:07:13.238173 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69e6000846416c79e6041c0a9a0c8fecf462a8b4e9ae57443e45199d6cc25dae\": container with ID starting with 69e6000846416c79e6041c0a9a0c8fecf462a8b4e9ae57443e45199d6cc25dae not found: ID does not exist" containerID="69e6000846416c79e6041c0a9a0c8fecf462a8b4e9ae57443e45199d6cc25dae" Jan 22 14:07:13 crc kubenswrapper[4743]: I0122 14:07:13.238196 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69e6000846416c79e6041c0a9a0c8fecf462a8b4e9ae57443e45199d6cc25dae"} err="failed to get container status \"69e6000846416c79e6041c0a9a0c8fecf462a8b4e9ae57443e45199d6cc25dae\": rpc error: code = NotFound desc = could not find container \"69e6000846416c79e6041c0a9a0c8fecf462a8b4e9ae57443e45199d6cc25dae\": container with ID starting with 69e6000846416c79e6041c0a9a0c8fecf462a8b4e9ae57443e45199d6cc25dae not found: ID does not exist" Jan 22 14:07:13 crc kubenswrapper[4743]: I0122 14:07:13.238212 4743 scope.go:117] "RemoveContainer" containerID="b3800fb0272e27c6acc669e64597a03e3ab738a113fc14a205eb2bfec57222e2" Jan 22 14:07:13 crc kubenswrapper[4743]: E0122 14:07:13.238456 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3800fb0272e27c6acc669e64597a03e3ab738a113fc14a205eb2bfec57222e2\": container with ID starting with b3800fb0272e27c6acc669e64597a03e3ab738a113fc14a205eb2bfec57222e2 not found: ID does not exist" containerID="b3800fb0272e27c6acc669e64597a03e3ab738a113fc14a205eb2bfec57222e2" Jan 22 14:07:13 crc kubenswrapper[4743]: I0122 14:07:13.238475 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3800fb0272e27c6acc669e64597a03e3ab738a113fc14a205eb2bfec57222e2"} err="failed to get container status \"b3800fb0272e27c6acc669e64597a03e3ab738a113fc14a205eb2bfec57222e2\": rpc error: code = NotFound desc = could not find container \"b3800fb0272e27c6acc669e64597a03e3ab738a113fc14a205eb2bfec57222e2\": container with ID starting with b3800fb0272e27c6acc669e64597a03e3ab738a113fc14a205eb2bfec57222e2 not found: ID does not exist" Jan 22 14:07:13 crc kubenswrapper[4743]: I0122 14:07:13.238498 4743 scope.go:117] "RemoveContainer" containerID="95d7bc50520d1fe3e00eb74b25917b20e1c3fb3219d62fedd3e7bbbca28cf032" Jan 22 14:07:13 crc kubenswrapper[4743]: E0122 14:07:13.238765 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95d7bc50520d1fe3e00eb74b25917b20e1c3fb3219d62fedd3e7bbbca28cf032\": container with ID starting with 95d7bc50520d1fe3e00eb74b25917b20e1c3fb3219d62fedd3e7bbbca28cf032 not found: ID does not exist" containerID="95d7bc50520d1fe3e00eb74b25917b20e1c3fb3219d62fedd3e7bbbca28cf032" Jan 22 14:07:13 crc kubenswrapper[4743]: I0122 14:07:13.238902 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95d7bc50520d1fe3e00eb74b25917b20e1c3fb3219d62fedd3e7bbbca28cf032"} err="failed to get container status \"95d7bc50520d1fe3e00eb74b25917b20e1c3fb3219d62fedd3e7bbbca28cf032\": rpc 
error: code = NotFound desc = could not find container \"95d7bc50520d1fe3e00eb74b25917b20e1c3fb3219d62fedd3e7bbbca28cf032\": container with ID starting with 95d7bc50520d1fe3e00eb74b25917b20e1c3fb3219d62fedd3e7bbbca28cf032 not found: ID does not exist" Jan 22 14:07:13 crc kubenswrapper[4743]: I0122 14:07:13.326135 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/46f75016-697b-4cac-bc9e-3e2f5e60da77-log-httpd\") pod \"ceilometer-0\" (UID: \"46f75016-697b-4cac-bc9e-3e2f5e60da77\") " pod="openstack/ceilometer-0" Jan 22 14:07:13 crc kubenswrapper[4743]: I0122 14:07:13.326961 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46f75016-697b-4cac-bc9e-3e2f5e60da77-config-data\") pod \"ceilometer-0\" (UID: \"46f75016-697b-4cac-bc9e-3e2f5e60da77\") " pod="openstack/ceilometer-0" Jan 22 14:07:13 crc kubenswrapper[4743]: I0122 14:07:13.327123 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cktmw\" (UniqueName: \"kubernetes.io/projected/46f75016-697b-4cac-bc9e-3e2f5e60da77-kube-api-access-cktmw\") pod \"ceilometer-0\" (UID: \"46f75016-697b-4cac-bc9e-3e2f5e60da77\") " pod="openstack/ceilometer-0" Jan 22 14:07:13 crc kubenswrapper[4743]: I0122 14:07:13.327331 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/46f75016-697b-4cac-bc9e-3e2f5e60da77-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"46f75016-697b-4cac-bc9e-3e2f5e60da77\") " pod="openstack/ceilometer-0" Jan 22 14:07:13 crc kubenswrapper[4743]: I0122 14:07:13.327437 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46f75016-697b-4cac-bc9e-3e2f5e60da77-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"46f75016-697b-4cac-bc9e-3e2f5e60da77\") " pod="openstack/ceilometer-0" Jan 22 14:07:13 crc kubenswrapper[4743]: I0122 14:07:13.327545 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/46f75016-697b-4cac-bc9e-3e2f5e60da77-run-httpd\") pod \"ceilometer-0\" (UID: \"46f75016-697b-4cac-bc9e-3e2f5e60da77\") " pod="openstack/ceilometer-0" Jan 22 14:07:13 crc kubenswrapper[4743]: I0122 14:07:13.327742 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/46f75016-697b-4cac-bc9e-3e2f5e60da77-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"46f75016-697b-4cac-bc9e-3e2f5e60da77\") " pod="openstack/ceilometer-0" Jan 22 14:07:13 crc kubenswrapper[4743]: I0122 14:07:13.327984 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46f75016-697b-4cac-bc9e-3e2f5e60da77-scripts\") pod \"ceilometer-0\" (UID: \"46f75016-697b-4cac-bc9e-3e2f5e60da77\") " pod="openstack/ceilometer-0" Jan 22 14:07:13 crc kubenswrapper[4743]: I0122 14:07:13.429326 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/46f75016-697b-4cac-bc9e-3e2f5e60da77-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: 
\"46f75016-697b-4cac-bc9e-3e2f5e60da77\") " pod="openstack/ceilometer-0" Jan 22 14:07:13 crc kubenswrapper[4743]: I0122 14:07:13.429387 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46f75016-697b-4cac-bc9e-3e2f5e60da77-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"46f75016-697b-4cac-bc9e-3e2f5e60da77\") " pod="openstack/ceilometer-0" Jan 22 14:07:13 crc kubenswrapper[4743]: I0122 14:07:13.429411 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/46f75016-697b-4cac-bc9e-3e2f5e60da77-run-httpd\") pod \"ceilometer-0\" (UID: \"46f75016-697b-4cac-bc9e-3e2f5e60da77\") " pod="openstack/ceilometer-0" Jan 22 14:07:13 crc kubenswrapper[4743]: I0122 14:07:13.429447 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/46f75016-697b-4cac-bc9e-3e2f5e60da77-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"46f75016-697b-4cac-bc9e-3e2f5e60da77\") " pod="openstack/ceilometer-0" Jan 22 14:07:13 crc kubenswrapper[4743]: I0122 14:07:13.429466 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46f75016-697b-4cac-bc9e-3e2f5e60da77-scripts\") pod \"ceilometer-0\" (UID: \"46f75016-697b-4cac-bc9e-3e2f5e60da77\") " pod="openstack/ceilometer-0" Jan 22 14:07:13 crc kubenswrapper[4743]: I0122 14:07:13.429513 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/46f75016-697b-4cac-bc9e-3e2f5e60da77-log-httpd\") pod \"ceilometer-0\" (UID: \"46f75016-697b-4cac-bc9e-3e2f5e60da77\") " pod="openstack/ceilometer-0" Jan 22 14:07:13 crc kubenswrapper[4743]: I0122 14:07:13.429554 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46f75016-697b-4cac-bc9e-3e2f5e60da77-config-data\") pod \"ceilometer-0\" (UID: \"46f75016-697b-4cac-bc9e-3e2f5e60da77\") " pod="openstack/ceilometer-0" Jan 22 14:07:13 crc kubenswrapper[4743]: I0122 14:07:13.429572 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cktmw\" (UniqueName: \"kubernetes.io/projected/46f75016-697b-4cac-bc9e-3e2f5e60da77-kube-api-access-cktmw\") pod \"ceilometer-0\" (UID: \"46f75016-697b-4cac-bc9e-3e2f5e60da77\") " pod="openstack/ceilometer-0" Jan 22 14:07:13 crc kubenswrapper[4743]: I0122 14:07:13.429886 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/46f75016-697b-4cac-bc9e-3e2f5e60da77-run-httpd\") pod \"ceilometer-0\" (UID: \"46f75016-697b-4cac-bc9e-3e2f5e60da77\") " pod="openstack/ceilometer-0" Jan 22 14:07:13 crc kubenswrapper[4743]: I0122 14:07:13.430157 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/46f75016-697b-4cac-bc9e-3e2f5e60da77-log-httpd\") pod \"ceilometer-0\" (UID: \"46f75016-697b-4cac-bc9e-3e2f5e60da77\") " pod="openstack/ceilometer-0" Jan 22 14:07:13 crc kubenswrapper[4743]: I0122 14:07:13.435308 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46f75016-697b-4cac-bc9e-3e2f5e60da77-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"46f75016-697b-4cac-bc9e-3e2f5e60da77\") " 
pod="openstack/ceilometer-0" Jan 22 14:07:13 crc kubenswrapper[4743]: I0122 14:07:13.436554 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/46f75016-697b-4cac-bc9e-3e2f5e60da77-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"46f75016-697b-4cac-bc9e-3e2f5e60da77\") " pod="openstack/ceilometer-0" Jan 22 14:07:13 crc kubenswrapper[4743]: I0122 14:07:13.438257 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46f75016-697b-4cac-bc9e-3e2f5e60da77-scripts\") pod \"ceilometer-0\" (UID: \"46f75016-697b-4cac-bc9e-3e2f5e60da77\") " pod="openstack/ceilometer-0" Jan 22 14:07:13 crc kubenswrapper[4743]: I0122 14:07:13.438860 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/46f75016-697b-4cac-bc9e-3e2f5e60da77-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"46f75016-697b-4cac-bc9e-3e2f5e60da77\") " pod="openstack/ceilometer-0" Jan 22 14:07:13 crc kubenswrapper[4743]: I0122 14:07:13.440426 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46f75016-697b-4cac-bc9e-3e2f5e60da77-config-data\") pod \"ceilometer-0\" (UID: \"46f75016-697b-4cac-bc9e-3e2f5e60da77\") " pod="openstack/ceilometer-0" Jan 22 14:07:13 crc kubenswrapper[4743]: I0122 14:07:13.452913 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cktmw\" (UniqueName: \"kubernetes.io/projected/46f75016-697b-4cac-bc9e-3e2f5e60da77-kube-api-access-cktmw\") pod \"ceilometer-0\" (UID: \"46f75016-697b-4cac-bc9e-3e2f5e60da77\") " pod="openstack/ceilometer-0" Jan 22 14:07:13 crc kubenswrapper[4743]: I0122 14:07:13.505557 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 22 14:07:13 crc kubenswrapper[4743]: I0122 14:07:13.768243 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="771a7aa1-9d00-4ad7-90be-b3cc3edd39ee" path="/var/lib/kubelet/pods/771a7aa1-9d00-4ad7-90be-b3cc3edd39ee/volumes" Jan 22 14:07:13 crc kubenswrapper[4743]: I0122 14:07:13.769302 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebc9c4ee-71ef-4cf1-ad65-847b5427901d" path="/var/lib/kubelet/pods/ebc9c4ee-71ef-4cf1-ad65-847b5427901d/volumes" Jan 22 14:07:13 crc kubenswrapper[4743]: I0122 14:07:13.935265 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 22 14:07:13 crc kubenswrapper[4743]: I0122 14:07:13.947496 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 22 14:07:14 crc kubenswrapper[4743]: I0122 14:07:14.104414 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"46f75016-697b-4cac-bc9e-3e2f5e60da77","Type":"ContainerStarted","Data":"72eebf32739132d419242dffeb60a558f7575de80f5d8e46b4e16f5866f13e89"} Jan 22 14:07:14 crc kubenswrapper[4743]: I0122 14:07:14.106101 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5b19b071-6f53-4563-b680-ead42caf7b3b","Type":"ContainerStarted","Data":"a3f676561e1a5f03deeaa28fe557ab35ddc3b6532b220b2cba93408a0ace5a4c"} Jan 22 14:07:14 crc kubenswrapper[4743]: I0122 14:07:14.138220 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.138198273 podStartE2EDuration="2.138198273s" podCreationTimestamp="2026-01-22 14:07:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:07:14.129653721 +0000 UTC m=+1270.684696884" watchObservedRunningTime="2026-01-22 14:07:14.138198273 +0000 UTC m=+1270.693241436" Jan 22 14:07:14 crc kubenswrapper[4743]: I0122 14:07:14.348760 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Jan 22 14:07:14 crc kubenswrapper[4743]: I0122 14:07:14.370853 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Jan 22 14:07:15 crc kubenswrapper[4743]: I0122 14:07:15.136784 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Jan 22 14:07:15 crc kubenswrapper[4743]: I0122 14:07:15.296853 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-dd884"] Jan 22 14:07:15 crc kubenswrapper[4743]: I0122 14:07:15.298149 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-dd884" Jan 22 14:07:15 crc kubenswrapper[4743]: I0122 14:07:15.300188 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Jan 22 14:07:15 crc kubenswrapper[4743]: I0122 14:07:15.300884 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Jan 22 14:07:15 crc kubenswrapper[4743]: I0122 14:07:15.313472 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-dd884"] Jan 22 14:07:15 crc kubenswrapper[4743]: I0122 14:07:15.366890 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4f715db-df3f-479c-82b4-0e8bdea14dba-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-dd884\" (UID: \"a4f715db-df3f-479c-82b4-0e8bdea14dba\") " pod="openstack/nova-cell1-cell-mapping-dd884" Jan 22 14:07:15 crc kubenswrapper[4743]: I0122 14:07:15.366964 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4f715db-df3f-479c-82b4-0e8bdea14dba-scripts\") pod \"nova-cell1-cell-mapping-dd884\" (UID: \"a4f715db-df3f-479c-82b4-0e8bdea14dba\") " pod="openstack/nova-cell1-cell-mapping-dd884" Jan 22 14:07:15 crc kubenswrapper[4743]: I0122 14:07:15.367012 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4f715db-df3f-479c-82b4-0e8bdea14dba-config-data\") pod \"nova-cell1-cell-mapping-dd884\" (UID: \"a4f715db-df3f-479c-82b4-0e8bdea14dba\") " pod="openstack/nova-cell1-cell-mapping-dd884" Jan 22 14:07:15 crc kubenswrapper[4743]: I0122 14:07:15.367056 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxd4t\" (UniqueName: \"kubernetes.io/projected/a4f715db-df3f-479c-82b4-0e8bdea14dba-kube-api-access-sxd4t\") pod \"nova-cell1-cell-mapping-dd884\" (UID: \"a4f715db-df3f-479c-82b4-0e8bdea14dba\") " pod="openstack/nova-cell1-cell-mapping-dd884" Jan 22 14:07:15 crc kubenswrapper[4743]: I0122 14:07:15.468837 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4f715db-df3f-479c-82b4-0e8bdea14dba-scripts\") pod \"nova-cell1-cell-mapping-dd884\" (UID: \"a4f715db-df3f-479c-82b4-0e8bdea14dba\") " pod="openstack/nova-cell1-cell-mapping-dd884" Jan 22 14:07:15 crc kubenswrapper[4743]: I0122 14:07:15.468883 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4f715db-df3f-479c-82b4-0e8bdea14dba-config-data\") pod \"nova-cell1-cell-mapping-dd884\" (UID: \"a4f715db-df3f-479c-82b4-0e8bdea14dba\") " pod="openstack/nova-cell1-cell-mapping-dd884" Jan 22 14:07:15 crc kubenswrapper[4743]: I0122 14:07:15.468909 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxd4t\" (UniqueName: \"kubernetes.io/projected/a4f715db-df3f-479c-82b4-0e8bdea14dba-kube-api-access-sxd4t\") pod \"nova-cell1-cell-mapping-dd884\" (UID: \"a4f715db-df3f-479c-82b4-0e8bdea14dba\") " pod="openstack/nova-cell1-cell-mapping-dd884" Jan 22 14:07:15 crc kubenswrapper[4743]: I0122 14:07:15.469057 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a4f715db-df3f-479c-82b4-0e8bdea14dba-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-dd884\" (UID: \"a4f715db-df3f-479c-82b4-0e8bdea14dba\") " pod="openstack/nova-cell1-cell-mapping-dd884" Jan 22 14:07:15 crc kubenswrapper[4743]: I0122 14:07:15.473274 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4f715db-df3f-479c-82b4-0e8bdea14dba-scripts\") pod \"nova-cell1-cell-mapping-dd884\" (UID: \"a4f715db-df3f-479c-82b4-0e8bdea14dba\") " pod="openstack/nova-cell1-cell-mapping-dd884" Jan 22 14:07:15 crc kubenswrapper[4743]: I0122 14:07:15.473502 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4f715db-df3f-479c-82b4-0e8bdea14dba-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-dd884\" (UID: \"a4f715db-df3f-479c-82b4-0e8bdea14dba\") " pod="openstack/nova-cell1-cell-mapping-dd884" Jan 22 14:07:15 crc kubenswrapper[4743]: I0122 14:07:15.477422 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4f715db-df3f-479c-82b4-0e8bdea14dba-config-data\") pod \"nova-cell1-cell-mapping-dd884\" (UID: \"a4f715db-df3f-479c-82b4-0e8bdea14dba\") " pod="openstack/nova-cell1-cell-mapping-dd884" Jan 22 14:07:15 crc kubenswrapper[4743]: I0122 14:07:15.485911 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-89c5cd4d5-k6jlz" Jan 22 14:07:15 crc kubenswrapper[4743]: I0122 14:07:15.490289 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxd4t\" (UniqueName: \"kubernetes.io/projected/a4f715db-df3f-479c-82b4-0e8bdea14dba-kube-api-access-sxd4t\") pod \"nova-cell1-cell-mapping-dd884\" (UID: \"a4f715db-df3f-479c-82b4-0e8bdea14dba\") " pod="openstack/nova-cell1-cell-mapping-dd884" Jan 22 14:07:15 crc kubenswrapper[4743]: I0122 14:07:15.571685 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-5b8wb"] Jan 22 14:07:15 crc kubenswrapper[4743]: I0122 14:07:15.571982 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-757b4f8459-5b8wb" podUID="768af7f0-e632-457f-bcb9-9069ae72ba02" containerName="dnsmasq-dns" containerID="cri-o://7debe843a7d0015e59f09f6e39567f75374acae96b618da3460f4d7f7f2c8eca" gracePeriod=10 Jan 22 14:07:15 crc kubenswrapper[4743]: I0122 14:07:15.626205 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-dd884" Jan 22 14:07:16 crc kubenswrapper[4743]: I0122 14:07:16.130019 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"46f75016-697b-4cac-bc9e-3e2f5e60da77","Type":"ContainerStarted","Data":"7b5725d602f7f0feebc2a33ced0dda406c4238733f2735e3ea2538bb8d2cd6a0"} Jan 22 14:07:16 crc kubenswrapper[4743]: I0122 14:07:16.132453 4743 generic.go:334] "Generic (PLEG): container finished" podID="768af7f0-e632-457f-bcb9-9069ae72ba02" containerID="7debe843a7d0015e59f09f6e39567f75374acae96b618da3460f4d7f7f2c8eca" exitCode=0 Jan 22 14:07:16 crc kubenswrapper[4743]: I0122 14:07:16.133010 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-5b8wb" event={"ID":"768af7f0-e632-457f-bcb9-9069ae72ba02","Type":"ContainerDied","Data":"7debe843a7d0015e59f09f6e39567f75374acae96b618da3460f4d7f7f2c8eca"} Jan 22 14:07:16 crc kubenswrapper[4743]: I0122 14:07:16.152159 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-dd884"] Jan 22 14:07:16 crc kubenswrapper[4743]: I0122 14:07:16.686066 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-5b8wb" Jan 22 14:07:16 crc kubenswrapper[4743]: I0122 14:07:16.821454 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/768af7f0-e632-457f-bcb9-9069ae72ba02-dns-swift-storage-0\") pod \"768af7f0-e632-457f-bcb9-9069ae72ba02\" (UID: \"768af7f0-e632-457f-bcb9-9069ae72ba02\") " Jan 22 14:07:16 crc kubenswrapper[4743]: I0122 14:07:16.821558 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/768af7f0-e632-457f-bcb9-9069ae72ba02-ovsdbserver-nb\") pod \"768af7f0-e632-457f-bcb9-9069ae72ba02\" (UID: \"768af7f0-e632-457f-bcb9-9069ae72ba02\") " Jan 22 14:07:16 crc kubenswrapper[4743]: I0122 14:07:16.821681 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/768af7f0-e632-457f-bcb9-9069ae72ba02-ovsdbserver-sb\") pod \"768af7f0-e632-457f-bcb9-9069ae72ba02\" (UID: \"768af7f0-e632-457f-bcb9-9069ae72ba02\") " Jan 22 14:07:16 crc kubenswrapper[4743]: I0122 14:07:16.821737 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/768af7f0-e632-457f-bcb9-9069ae72ba02-config\") pod \"768af7f0-e632-457f-bcb9-9069ae72ba02\" (UID: \"768af7f0-e632-457f-bcb9-9069ae72ba02\") " Jan 22 14:07:16 crc kubenswrapper[4743]: I0122 14:07:16.821774 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/768af7f0-e632-457f-bcb9-9069ae72ba02-dns-svc\") pod \"768af7f0-e632-457f-bcb9-9069ae72ba02\" (UID: \"768af7f0-e632-457f-bcb9-9069ae72ba02\") " Jan 22 14:07:16 crc kubenswrapper[4743]: I0122 14:07:16.821815 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpjww\" (UniqueName: \"kubernetes.io/projected/768af7f0-e632-457f-bcb9-9069ae72ba02-kube-api-access-fpjww\") pod \"768af7f0-e632-457f-bcb9-9069ae72ba02\" (UID: \"768af7f0-e632-457f-bcb9-9069ae72ba02\") " Jan 22 14:07:16 crc kubenswrapper[4743]: I0122 14:07:16.829984 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/768af7f0-e632-457f-bcb9-9069ae72ba02-kube-api-access-fpjww" (OuterVolumeSpecName: "kube-api-access-fpjww") pod "768af7f0-e632-457f-bcb9-9069ae72ba02" (UID: "768af7f0-e632-457f-bcb9-9069ae72ba02"). InnerVolumeSpecName "kube-api-access-fpjww". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:07:16 crc kubenswrapper[4743]: I0122 14:07:16.888409 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/768af7f0-e632-457f-bcb9-9069ae72ba02-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "768af7f0-e632-457f-bcb9-9069ae72ba02" (UID: "768af7f0-e632-457f-bcb9-9069ae72ba02"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:07:16 crc kubenswrapper[4743]: I0122 14:07:16.895030 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/768af7f0-e632-457f-bcb9-9069ae72ba02-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "768af7f0-e632-457f-bcb9-9069ae72ba02" (UID: "768af7f0-e632-457f-bcb9-9069ae72ba02"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:07:16 crc kubenswrapper[4743]: I0122 14:07:16.901420 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/768af7f0-e632-457f-bcb9-9069ae72ba02-config" (OuterVolumeSpecName: "config") pod "768af7f0-e632-457f-bcb9-9069ae72ba02" (UID: "768af7f0-e632-457f-bcb9-9069ae72ba02"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:07:16 crc kubenswrapper[4743]: I0122 14:07:16.911500 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/768af7f0-e632-457f-bcb9-9069ae72ba02-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "768af7f0-e632-457f-bcb9-9069ae72ba02" (UID: "768af7f0-e632-457f-bcb9-9069ae72ba02"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:07:16 crc kubenswrapper[4743]: I0122 14:07:16.924594 4743 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/768af7f0-e632-457f-bcb9-9069ae72ba02-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 22 14:07:16 crc kubenswrapper[4743]: I0122 14:07:16.924626 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/768af7f0-e632-457f-bcb9-9069ae72ba02-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 22 14:07:16 crc kubenswrapper[4743]: I0122 14:07:16.924636 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/768af7f0-e632-457f-bcb9-9069ae72ba02-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 22 14:07:16 crc kubenswrapper[4743]: I0122 14:07:16.924645 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/768af7f0-e632-457f-bcb9-9069ae72ba02-config\") on node \"crc\" DevicePath \"\"" Jan 22 14:07:16 crc kubenswrapper[4743]: I0122 14:07:16.924653 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fpjww\" (UniqueName: \"kubernetes.io/projected/768af7f0-e632-457f-bcb9-9069ae72ba02-kube-api-access-fpjww\") on node \"crc\" DevicePath \"\"" Jan 22 14:07:16 crc kubenswrapper[4743]: I0122 14:07:16.930312 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/768af7f0-e632-457f-bcb9-9069ae72ba02-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "768af7f0-e632-457f-bcb9-9069ae72ba02" (UID: "768af7f0-e632-457f-bcb9-9069ae72ba02"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:07:17 crc kubenswrapper[4743]: I0122 14:07:17.027698 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/768af7f0-e632-457f-bcb9-9069ae72ba02-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 22 14:07:17 crc kubenswrapper[4743]: I0122 14:07:17.147254 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-dd884" event={"ID":"a4f715db-df3f-479c-82b4-0e8bdea14dba","Type":"ContainerStarted","Data":"7af5309d332d6c839b3ad6b684db43b57e15dd2e7b57e4ac92b7a54aca9c658d"} Jan 22 14:07:17 crc kubenswrapper[4743]: I0122 14:07:17.147306 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-dd884" event={"ID":"a4f715db-df3f-479c-82b4-0e8bdea14dba","Type":"ContainerStarted","Data":"5d4c980be6f667419a54eda50e3f1cd6f2bd92f524e50553d2e8ce914d656555"} Jan 22 14:07:17 crc kubenswrapper[4743]: I0122 14:07:17.151106 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"46f75016-697b-4cac-bc9e-3e2f5e60da77","Type":"ContainerStarted","Data":"b9316c40952e550c6660d00f903c5132141d6a0ea8171b7078a1236c6be92492"} Jan 22 14:07:17 crc kubenswrapper[4743]: I0122 14:07:17.154640 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-5b8wb" event={"ID":"768af7f0-e632-457f-bcb9-9069ae72ba02","Type":"ContainerDied","Data":"7263a4747e0b6a12075a60526a9998df04b5dae363ba867b01091adf000479f3"} Jan 22 14:07:17 crc kubenswrapper[4743]: I0122 14:07:17.154705 4743 scope.go:117] "RemoveContainer" containerID="7debe843a7d0015e59f09f6e39567f75374acae96b618da3460f4d7f7f2c8eca" Jan 22 14:07:17 crc kubenswrapper[4743]: I0122 
14:07:17.154755 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-5b8wb" Jan 22 14:07:17 crc kubenswrapper[4743]: I0122 14:07:17.173027 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-dd884" podStartSLOduration=2.173005473 podStartE2EDuration="2.173005473s" podCreationTimestamp="2026-01-22 14:07:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:07:17.169025835 +0000 UTC m=+1273.724069008" watchObservedRunningTime="2026-01-22 14:07:17.173005473 +0000 UTC m=+1273.728048636" Jan 22 14:07:17 crc kubenswrapper[4743]: I0122 14:07:17.204221 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-5b8wb"] Jan 22 14:07:17 crc kubenswrapper[4743]: I0122 14:07:17.212019 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-5b8wb"] Jan 22 14:07:17 crc kubenswrapper[4743]: I0122 14:07:17.422902 4743 scope.go:117] "RemoveContainer" containerID="4a5a2ec66c1bda18f25ff27a30a7dae8e42739b96181b2ec47508304aaa05b7f" Jan 22 14:07:17 crc kubenswrapper[4743]: I0122 14:07:17.794751 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="768af7f0-e632-457f-bcb9-9069ae72ba02" path="/var/lib/kubelet/pods/768af7f0-e632-457f-bcb9-9069ae72ba02/volumes" Jan 22 14:07:19 crc kubenswrapper[4743]: I0122 14:07:19.189843 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"46f75016-697b-4cac-bc9e-3e2f5e60da77","Type":"ContainerStarted","Data":"e4fca9e270f3fbdfdf9e7eedb945d9e493d55510fb342e3becb9d3381bb9be41"} Jan 22 14:07:20 crc kubenswrapper[4743]: I0122 14:07:20.199775 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"46f75016-697b-4cac-bc9e-3e2f5e60da77","Type":"ContainerStarted","Data":"4e98ec3ea0c464380b3870c84d8506696aea2e53d3d5cb7686f309b986ad5266"} Jan 22 14:07:20 crc kubenswrapper[4743]: I0122 14:07:20.200356 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 22 14:07:20 crc kubenswrapper[4743]: I0122 14:07:20.232155 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.663702681 podStartE2EDuration="7.232126984s" podCreationTimestamp="2026-01-22 14:07:13 +0000 UTC" firstStartedPulling="2026-01-22 14:07:13.947312683 +0000 UTC m=+1270.502355846" lastFinishedPulling="2026-01-22 14:07:19.515736986 +0000 UTC m=+1276.070780149" observedRunningTime="2026-01-22 14:07:20.222270466 +0000 UTC m=+1276.777313629" watchObservedRunningTime="2026-01-22 14:07:20.232126984 +0000 UTC m=+1276.787170157" Jan 22 14:07:20 crc kubenswrapper[4743]: E0122 14:07:20.629493 4743 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84f089c9_1a96_40ce_879d_4220b824f089.slice\": RecentStats: unable to find data in memory cache]" Jan 22 14:07:22 crc kubenswrapper[4743]: I0122 14:07:22.217837 4743 generic.go:334] "Generic (PLEG): container finished" podID="a4f715db-df3f-479c-82b4-0e8bdea14dba" containerID="7af5309d332d6c839b3ad6b684db43b57e15dd2e7b57e4ac92b7a54aca9c658d" exitCode=0 Jan 22 14:07:22 crc kubenswrapper[4743]: I0122 14:07:22.218003 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-cell-mapping-dd884" event={"ID":"a4f715db-df3f-479c-82b4-0e8bdea14dba","Type":"ContainerDied","Data":"7af5309d332d6c839b3ad6b684db43b57e15dd2e7b57e4ac92b7a54aca9c658d"} Jan 22 14:07:22 crc kubenswrapper[4743]: I0122 14:07:22.470267 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 22 14:07:22 crc kubenswrapper[4743]: I0122 14:07:22.470307 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 22 14:07:23 crc kubenswrapper[4743]: I0122 14:07:23.486007 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5b19b071-6f53-4563-b680-ead42caf7b3b" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.205:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 22 14:07:23 crc kubenswrapper[4743]: I0122 14:07:23.486570 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5b19b071-6f53-4563-b680-ead42caf7b3b" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.205:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 22 14:07:23 crc kubenswrapper[4743]: I0122 14:07:23.614240 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-dd884" Jan 22 14:07:23 crc kubenswrapper[4743]: I0122 14:07:23.779555 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxd4t\" (UniqueName: \"kubernetes.io/projected/a4f715db-df3f-479c-82b4-0e8bdea14dba-kube-api-access-sxd4t\") pod \"a4f715db-df3f-479c-82b4-0e8bdea14dba\" (UID: \"a4f715db-df3f-479c-82b4-0e8bdea14dba\") " Jan 22 14:07:23 crc kubenswrapper[4743]: I0122 14:07:23.779654 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4f715db-df3f-479c-82b4-0e8bdea14dba-combined-ca-bundle\") pod \"a4f715db-df3f-479c-82b4-0e8bdea14dba\" (UID: \"a4f715db-df3f-479c-82b4-0e8bdea14dba\") " Jan 22 14:07:23 crc kubenswrapper[4743]: I0122 14:07:23.779778 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4f715db-df3f-479c-82b4-0e8bdea14dba-scripts\") pod \"a4f715db-df3f-479c-82b4-0e8bdea14dba\" (UID: \"a4f715db-df3f-479c-82b4-0e8bdea14dba\") " Jan 22 14:07:23 crc kubenswrapper[4743]: I0122 14:07:23.780056 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4f715db-df3f-479c-82b4-0e8bdea14dba-config-data\") pod \"a4f715db-df3f-479c-82b4-0e8bdea14dba\" (UID: \"a4f715db-df3f-479c-82b4-0e8bdea14dba\") " Jan 22 14:07:23 crc kubenswrapper[4743]: I0122 14:07:23.786438 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4f715db-df3f-479c-82b4-0e8bdea14dba-kube-api-access-sxd4t" (OuterVolumeSpecName: "kube-api-access-sxd4t") pod "a4f715db-df3f-479c-82b4-0e8bdea14dba" (UID: "a4f715db-df3f-479c-82b4-0e8bdea14dba"). InnerVolumeSpecName "kube-api-access-sxd4t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:07:23 crc kubenswrapper[4743]: I0122 14:07:23.788398 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4f715db-df3f-479c-82b4-0e8bdea14dba-scripts" (OuterVolumeSpecName: "scripts") pod "a4f715db-df3f-479c-82b4-0e8bdea14dba" (UID: "a4f715db-df3f-479c-82b4-0e8bdea14dba"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:07:23 crc kubenswrapper[4743]: I0122 14:07:23.812641 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4f715db-df3f-479c-82b4-0e8bdea14dba-config-data" (OuterVolumeSpecName: "config-data") pod "a4f715db-df3f-479c-82b4-0e8bdea14dba" (UID: "a4f715db-df3f-479c-82b4-0e8bdea14dba"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:07:23 crc kubenswrapper[4743]: I0122 14:07:23.815135 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4f715db-df3f-479c-82b4-0e8bdea14dba-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a4f715db-df3f-479c-82b4-0e8bdea14dba" (UID: "a4f715db-df3f-479c-82b4-0e8bdea14dba"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:07:23 crc kubenswrapper[4743]: I0122 14:07:23.883887 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4f715db-df3f-479c-82b4-0e8bdea14dba-config-data\") on node \"crc\" DevicePath \"\"" Jan 22 14:07:23 crc kubenswrapper[4743]: I0122 14:07:23.884115 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxd4t\" (UniqueName: \"kubernetes.io/projected/a4f715db-df3f-479c-82b4-0e8bdea14dba-kube-api-access-sxd4t\") on node \"crc\" DevicePath \"\"" Jan 22 14:07:23 crc kubenswrapper[4743]: I0122 14:07:23.884145 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4f715db-df3f-479c-82b4-0e8bdea14dba-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 14:07:23 crc kubenswrapper[4743]: I0122 14:07:23.884156 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4f715db-df3f-479c-82b4-0e8bdea14dba-scripts\") on node \"crc\" DevicePath \"\"" Jan 22 14:07:24 crc kubenswrapper[4743]: I0122 14:07:24.241986 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-dd884" event={"ID":"a4f715db-df3f-479c-82b4-0e8bdea14dba","Type":"ContainerDied","Data":"5d4c980be6f667419a54eda50e3f1cd6f2bd92f524e50553d2e8ce914d656555"} Jan 22 14:07:24 crc kubenswrapper[4743]: I0122 14:07:24.242025 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d4c980be6f667419a54eda50e3f1cd6f2bd92f524e50553d2e8ce914d656555" Jan 22 14:07:24 crc kubenswrapper[4743]: I0122 14:07:24.242077 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-dd884" Jan 22 14:07:24 crc kubenswrapper[4743]: I0122 14:07:24.464874 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 22 14:07:24 crc kubenswrapper[4743]: I0122 14:07:24.465104 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5b19b071-6f53-4563-b680-ead42caf7b3b" containerName="nova-api-log" containerID="cri-o://fae973dc542ff70efbb81373f2a5a891fd9ec30052f8b55785736520905ed83f" gracePeriod=30 Jan 22 14:07:24 crc kubenswrapper[4743]: I0122 14:07:24.465464 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5b19b071-6f53-4563-b680-ead42caf7b3b" containerName="nova-api-api" containerID="cri-o://a3f676561e1a5f03deeaa28fe557ab35ddc3b6532b220b2cba93408a0ace5a4c" gracePeriod=30 Jan 22 14:07:24 crc kubenswrapper[4743]: I0122 14:07:24.479270 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 22 14:07:24 crc kubenswrapper[4743]: I0122 14:07:24.479458 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="95a3397b-b84b-4519-8543-3aac6cb34f49" containerName="nova-scheduler-scheduler" containerID="cri-o://137f04123729de75737d5e24287b51fde6e757bda9ebd16d779c957ca49fa24c" gracePeriod=30 Jan 22 14:07:24 crc kubenswrapper[4743]: I0122 14:07:24.519767 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 22 14:07:24 crc kubenswrapper[4743]: I0122 14:07:24.520037 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6523260a-d41b-43ec-a358-316d51466edd" containerName="nova-metadata-log" containerID="cri-o://62ae8d536864303c0d6ab8daa42f01db669eb963bd43bd01f0d76fa75b50e1b2" gracePeriod=30 Jan 22 14:07:24 crc kubenswrapper[4743]: I0122 14:07:24.520121 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6523260a-d41b-43ec-a358-316d51466edd" containerName="nova-metadata-metadata" containerID="cri-o://18b3a6676633c2c186136314fec35bea50959c2e066a0b5269034fab138a0c59" gracePeriod=30 Jan 22 14:07:25 crc kubenswrapper[4743]: I0122 14:07:25.254005 4743 generic.go:334] "Generic (PLEG): container finished" podID="6523260a-d41b-43ec-a358-316d51466edd" containerID="62ae8d536864303c0d6ab8daa42f01db669eb963bd43bd01f0d76fa75b50e1b2" exitCode=143 Jan 22 14:07:25 crc kubenswrapper[4743]: I0122 14:07:25.254179 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6523260a-d41b-43ec-a358-316d51466edd","Type":"ContainerDied","Data":"62ae8d536864303c0d6ab8daa42f01db669eb963bd43bd01f0d76fa75b50e1b2"} Jan 22 14:07:25 crc kubenswrapper[4743]: I0122 14:07:25.256046 4743 generic.go:334] "Generic (PLEG): container finished" podID="5b19b071-6f53-4563-b680-ead42caf7b3b" containerID="fae973dc542ff70efbb81373f2a5a891fd9ec30052f8b55785736520905ed83f" exitCode=143 Jan 22 14:07:25 crc kubenswrapper[4743]: I0122 14:07:25.256073 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5b19b071-6f53-4563-b680-ead42caf7b3b","Type":"ContainerDied","Data":"fae973dc542ff70efbb81373f2a5a891fd9ec30052f8b55785736520905ed83f"} Jan 22 14:07:26 crc kubenswrapper[4743]: E0122 14:07:26.197361 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command 
error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="137f04123729de75737d5e24287b51fde6e757bda9ebd16d779c957ca49fa24c" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 22 14:07:26 crc kubenswrapper[4743]: E0122 14:07:26.199969 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="137f04123729de75737d5e24287b51fde6e757bda9ebd16d779c957ca49fa24c" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 22 14:07:26 crc kubenswrapper[4743]: E0122 14:07:26.201220 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="137f04123729de75737d5e24287b51fde6e757bda9ebd16d779c957ca49fa24c" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 22 14:07:26 crc kubenswrapper[4743]: E0122 14:07:26.201258 4743 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="95a3397b-b84b-4519-8543-3aac6cb34f49" containerName="nova-scheduler-scheduler" Jan 22 14:07:27 crc kubenswrapper[4743]: I0122 14:07:27.274784 4743 generic.go:334] "Generic (PLEG): container finished" podID="95a3397b-b84b-4519-8543-3aac6cb34f49" containerID="137f04123729de75737d5e24287b51fde6e757bda9ebd16d779c957ca49fa24c" exitCode=0 Jan 22 14:07:27 crc kubenswrapper[4743]: I0122 14:07:27.274977 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"95a3397b-b84b-4519-8543-3aac6cb34f49","Type":"ContainerDied","Data":"137f04123729de75737d5e24287b51fde6e757bda9ebd16d779c957ca49fa24c"} Jan 22 14:07:27 crc kubenswrapper[4743]: I0122 14:07:27.275146 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"95a3397b-b84b-4519-8543-3aac6cb34f49","Type":"ContainerDied","Data":"41328f8c87c6d361aa2d91ac9220f802aab3cacaba2a1e50c5247b445e8051c7"} Jan 22 14:07:27 crc kubenswrapper[4743]: I0122 14:07:27.275164 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="41328f8c87c6d361aa2d91ac9220f802aab3cacaba2a1e50c5247b445e8051c7" Jan 22 14:07:27 crc kubenswrapper[4743]: I0122 14:07:27.303130 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 22 14:07:27 crc kubenswrapper[4743]: I0122 14:07:27.461282 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxffb\" (UniqueName: \"kubernetes.io/projected/95a3397b-b84b-4519-8543-3aac6cb34f49-kube-api-access-qxffb\") pod \"95a3397b-b84b-4519-8543-3aac6cb34f49\" (UID: \"95a3397b-b84b-4519-8543-3aac6cb34f49\") " Jan 22 14:07:27 crc kubenswrapper[4743]: I0122 14:07:27.461615 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95a3397b-b84b-4519-8543-3aac6cb34f49-config-data\") pod \"95a3397b-b84b-4519-8543-3aac6cb34f49\" (UID: \"95a3397b-b84b-4519-8543-3aac6cb34f49\") " Jan 22 14:07:27 crc kubenswrapper[4743]: I0122 14:07:27.461724 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95a3397b-b84b-4519-8543-3aac6cb34f49-combined-ca-bundle\") pod \"95a3397b-b84b-4519-8543-3aac6cb34f49\" (UID: \"95a3397b-b84b-4519-8543-3aac6cb34f49\") " Jan 22 14:07:27 crc kubenswrapper[4743]: I0122 14:07:27.492982 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95a3397b-b84b-4519-8543-3aac6cb34f49-kube-api-access-qxffb" (OuterVolumeSpecName: "kube-api-access-qxffb") pod "95a3397b-b84b-4519-8543-3aac6cb34f49" (UID: "95a3397b-b84b-4519-8543-3aac6cb34f49"). InnerVolumeSpecName "kube-api-access-qxffb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:07:27 crc kubenswrapper[4743]: I0122 14:07:27.495981 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95a3397b-b84b-4519-8543-3aac6cb34f49-config-data" (OuterVolumeSpecName: "config-data") pod "95a3397b-b84b-4519-8543-3aac6cb34f49" (UID: "95a3397b-b84b-4519-8543-3aac6cb34f49"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:07:27 crc kubenswrapper[4743]: I0122 14:07:27.502497 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95a3397b-b84b-4519-8543-3aac6cb34f49-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "95a3397b-b84b-4519-8543-3aac6cb34f49" (UID: "95a3397b-b84b-4519-8543-3aac6cb34f49"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:07:27 crc kubenswrapper[4743]: I0122 14:07:27.564540 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95a3397b-b84b-4519-8543-3aac6cb34f49-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 14:07:27 crc kubenswrapper[4743]: I0122 14:07:27.564578 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxffb\" (UniqueName: \"kubernetes.io/projected/95a3397b-b84b-4519-8543-3aac6cb34f49-kube-api-access-qxffb\") on node \"crc\" DevicePath \"\"" Jan 22 14:07:27 crc kubenswrapper[4743]: I0122 14:07:27.564589 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95a3397b-b84b-4519-8543-3aac6cb34f49-config-data\") on node \"crc\" DevicePath \"\"" Jan 22 14:07:27 crc kubenswrapper[4743]: I0122 14:07:27.667223 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="6523260a-d41b-43ec-a358-316d51466edd" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.197:8775/\": read tcp 10.217.0.2:52950->10.217.0.197:8775: read: connection reset by peer" Jan 22 14:07:27 crc kubenswrapper[4743]: I0122 14:07:27.667267 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="6523260a-d41b-43ec-a358-316d51466edd" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.197:8775/\": read tcp 10.217.0.2:52966->10.217.0.197:8775: read: connection reset by peer" Jan 22 14:07:28 crc kubenswrapper[4743]: I0122 14:07:28.111302 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 22 14:07:28 crc kubenswrapper[4743]: I0122 14:07:28.277733 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6523260a-d41b-43ec-a358-316d51466edd-nova-metadata-tls-certs\") pod \"6523260a-d41b-43ec-a358-316d51466edd\" (UID: \"6523260a-d41b-43ec-a358-316d51466edd\") " Jan 22 14:07:28 crc kubenswrapper[4743]: I0122 14:07:28.278479 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6523260a-d41b-43ec-a358-316d51466edd-logs\") pod \"6523260a-d41b-43ec-a358-316d51466edd\" (UID: \"6523260a-d41b-43ec-a358-316d51466edd\") " Jan 22 14:07:28 crc kubenswrapper[4743]: I0122 14:07:28.278525 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6523260a-d41b-43ec-a358-316d51466edd-combined-ca-bundle\") pod \"6523260a-d41b-43ec-a358-316d51466edd\" (UID: \"6523260a-d41b-43ec-a358-316d51466edd\") " Jan 22 14:07:28 crc kubenswrapper[4743]: I0122 14:07:28.278641 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzsgs\" (UniqueName: \"kubernetes.io/projected/6523260a-d41b-43ec-a358-316d51466edd-kube-api-access-qzsgs\") pod \"6523260a-d41b-43ec-a358-316d51466edd\" (UID: \"6523260a-d41b-43ec-a358-316d51466edd\") " Jan 22 14:07:28 crc kubenswrapper[4743]: I0122 14:07:28.278712 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6523260a-d41b-43ec-a358-316d51466edd-config-data\") pod \"6523260a-d41b-43ec-a358-316d51466edd\" (UID: 
\"6523260a-d41b-43ec-a358-316d51466edd\") " Jan 22 14:07:28 crc kubenswrapper[4743]: I0122 14:07:28.279017 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6523260a-d41b-43ec-a358-316d51466edd-logs" (OuterVolumeSpecName: "logs") pod "6523260a-d41b-43ec-a358-316d51466edd" (UID: "6523260a-d41b-43ec-a358-316d51466edd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:07:28 crc kubenswrapper[4743]: I0122 14:07:28.279422 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6523260a-d41b-43ec-a358-316d51466edd-logs\") on node \"crc\" DevicePath \"\"" Jan 22 14:07:28 crc kubenswrapper[4743]: I0122 14:07:28.292481 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6523260a-d41b-43ec-a358-316d51466edd-kube-api-access-qzsgs" (OuterVolumeSpecName: "kube-api-access-qzsgs") pod "6523260a-d41b-43ec-a358-316d51466edd" (UID: "6523260a-d41b-43ec-a358-316d51466edd"). InnerVolumeSpecName "kube-api-access-qzsgs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:07:28 crc kubenswrapper[4743]: I0122 14:07:28.295561 4743 generic.go:334] "Generic (PLEG): container finished" podID="6523260a-d41b-43ec-a358-316d51466edd" containerID="18b3a6676633c2c186136314fec35bea50959c2e066a0b5269034fab138a0c59" exitCode=0 Jan 22 14:07:28 crc kubenswrapper[4743]: I0122 14:07:28.295651 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 22 14:07:28 crc kubenswrapper[4743]: I0122 14:07:28.295669 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 22 14:07:28 crc kubenswrapper[4743]: I0122 14:07:28.295722 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6523260a-d41b-43ec-a358-316d51466edd","Type":"ContainerDied","Data":"18b3a6676633c2c186136314fec35bea50959c2e066a0b5269034fab138a0c59"} Jan 22 14:07:28 crc kubenswrapper[4743]: I0122 14:07:28.295759 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6523260a-d41b-43ec-a358-316d51466edd","Type":"ContainerDied","Data":"665e71d5edb78c15d1596cc807737958fd8aea3fd58c96068ef8d8676d4c990c"} Jan 22 14:07:28 crc kubenswrapper[4743]: I0122 14:07:28.295778 4743 scope.go:117] "RemoveContainer" containerID="18b3a6676633c2c186136314fec35bea50959c2e066a0b5269034fab138a0c59" Jan 22 14:07:28 crc kubenswrapper[4743]: I0122 14:07:28.304294 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6523260a-d41b-43ec-a358-316d51466edd-config-data" (OuterVolumeSpecName: "config-data") pod "6523260a-d41b-43ec-a358-316d51466edd" (UID: "6523260a-d41b-43ec-a358-316d51466edd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:07:28 crc kubenswrapper[4743]: I0122 14:07:28.322069 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6523260a-d41b-43ec-a358-316d51466edd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6523260a-d41b-43ec-a358-316d51466edd" (UID: "6523260a-d41b-43ec-a358-316d51466edd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:07:28 crc kubenswrapper[4743]: I0122 14:07:28.336629 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 22 14:07:28 crc kubenswrapper[4743]: I0122 14:07:28.338301 4743 scope.go:117] "RemoveContainer" containerID="62ae8d536864303c0d6ab8daa42f01db669eb963bd43bd01f0d76fa75b50e1b2" Jan 22 14:07:28 crc kubenswrapper[4743]: I0122 14:07:28.343889 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Jan 22 14:07:28 crc kubenswrapper[4743]: I0122 14:07:28.354514 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 22 14:07:28 crc kubenswrapper[4743]: E0122 14:07:28.355054 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="768af7f0-e632-457f-bcb9-9069ae72ba02" containerName="init" Jan 22 14:07:28 crc kubenswrapper[4743]: I0122 14:07:28.355081 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="768af7f0-e632-457f-bcb9-9069ae72ba02" containerName="init" Jan 22 14:07:28 crc kubenswrapper[4743]: E0122 14:07:28.355108 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6523260a-d41b-43ec-a358-316d51466edd" containerName="nova-metadata-metadata" Jan 22 14:07:28 crc kubenswrapper[4743]: I0122 14:07:28.355116 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="6523260a-d41b-43ec-a358-316d51466edd" containerName="nova-metadata-metadata" Jan 22 14:07:28 crc kubenswrapper[4743]: E0122 14:07:28.355130 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95a3397b-b84b-4519-8543-3aac6cb34f49" containerName="nova-scheduler-scheduler" Jan 22 14:07:28 crc kubenswrapper[4743]: I0122 14:07:28.355138 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="95a3397b-b84b-4519-8543-3aac6cb34f49" containerName="nova-scheduler-scheduler" Jan 22 14:07:28 crc kubenswrapper[4743]: E0122 14:07:28.355150 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4f715db-df3f-479c-82b4-0e8bdea14dba" containerName="nova-manage" Jan 22 14:07:28 crc kubenswrapper[4743]: I0122 14:07:28.355159 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4f715db-df3f-479c-82b4-0e8bdea14dba" containerName="nova-manage" Jan 22 14:07:28 crc kubenswrapper[4743]: E0122 14:07:28.355176 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6523260a-d41b-43ec-a358-316d51466edd" containerName="nova-metadata-log" Jan 22 14:07:28 crc kubenswrapper[4743]: I0122 14:07:28.355184 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="6523260a-d41b-43ec-a358-316d51466edd" containerName="nova-metadata-log" Jan 22 14:07:28 crc kubenswrapper[4743]: E0122 14:07:28.355214 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="768af7f0-e632-457f-bcb9-9069ae72ba02" containerName="dnsmasq-dns" Jan 22 14:07:28 crc kubenswrapper[4743]: I0122 14:07:28.355224 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="768af7f0-e632-457f-bcb9-9069ae72ba02" containerName="dnsmasq-dns" Jan 22 14:07:28 crc kubenswrapper[4743]: I0122 14:07:28.355456 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4f715db-df3f-479c-82b4-0e8bdea14dba" containerName="nova-manage" Jan 22 14:07:28 crc kubenswrapper[4743]: I0122 14:07:28.355472 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="768af7f0-e632-457f-bcb9-9069ae72ba02" containerName="dnsmasq-dns" Jan 22 14:07:28 crc kubenswrapper[4743]: I0122 14:07:28.355489 4743 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="95a3397b-b84b-4519-8543-3aac6cb34f49" containerName="nova-scheduler-scheduler" Jan 22 14:07:28 crc kubenswrapper[4743]: I0122 14:07:28.355499 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="6523260a-d41b-43ec-a358-316d51466edd" containerName="nova-metadata-metadata" Jan 22 14:07:28 crc kubenswrapper[4743]: I0122 14:07:28.355514 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="6523260a-d41b-43ec-a358-316d51466edd" containerName="nova-metadata-log" Jan 22 14:07:28 crc kubenswrapper[4743]: I0122 14:07:28.356380 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 22 14:07:28 crc kubenswrapper[4743]: I0122 14:07:28.358864 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 22 14:07:28 crc kubenswrapper[4743]: I0122 14:07:28.367177 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6523260a-d41b-43ec-a358-316d51466edd-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "6523260a-d41b-43ec-a358-316d51466edd" (UID: "6523260a-d41b-43ec-a358-316d51466edd"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:07:28 crc kubenswrapper[4743]: I0122 14:07:28.371831 4743 scope.go:117] "RemoveContainer" containerID="18b3a6676633c2c186136314fec35bea50959c2e066a0b5269034fab138a0c59" Jan 22 14:07:28 crc kubenswrapper[4743]: E0122 14:07:28.372689 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18b3a6676633c2c186136314fec35bea50959c2e066a0b5269034fab138a0c59\": container with ID starting with 18b3a6676633c2c186136314fec35bea50959c2e066a0b5269034fab138a0c59 not found: ID does not exist" containerID="18b3a6676633c2c186136314fec35bea50959c2e066a0b5269034fab138a0c59" Jan 22 14:07:28 crc kubenswrapper[4743]: I0122 14:07:28.372740 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18b3a6676633c2c186136314fec35bea50959c2e066a0b5269034fab138a0c59"} err="failed to get container status \"18b3a6676633c2c186136314fec35bea50959c2e066a0b5269034fab138a0c59\": rpc error: code = NotFound desc = could not find container \"18b3a6676633c2c186136314fec35bea50959c2e066a0b5269034fab138a0c59\": container with ID starting with 18b3a6676633c2c186136314fec35bea50959c2e066a0b5269034fab138a0c59 not found: ID does not exist" Jan 22 14:07:28 crc kubenswrapper[4743]: I0122 14:07:28.372780 4743 scope.go:117] "RemoveContainer" containerID="62ae8d536864303c0d6ab8daa42f01db669eb963bd43bd01f0d76fa75b50e1b2" Jan 22 14:07:28 crc kubenswrapper[4743]: E0122 14:07:28.376178 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62ae8d536864303c0d6ab8daa42f01db669eb963bd43bd01f0d76fa75b50e1b2\": container with ID starting with 62ae8d536864303c0d6ab8daa42f01db669eb963bd43bd01f0d76fa75b50e1b2 not found: ID does not exist" containerID="62ae8d536864303c0d6ab8daa42f01db669eb963bd43bd01f0d76fa75b50e1b2" Jan 22 14:07:28 crc kubenswrapper[4743]: I0122 14:07:28.376224 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62ae8d536864303c0d6ab8daa42f01db669eb963bd43bd01f0d76fa75b50e1b2"} err="failed to get container status 
\"62ae8d536864303c0d6ab8daa42f01db669eb963bd43bd01f0d76fa75b50e1b2\": rpc error: code = NotFound desc = could not find container \"62ae8d536864303c0d6ab8daa42f01db669eb963bd43bd01f0d76fa75b50e1b2\": container with ID starting with 62ae8d536864303c0d6ab8daa42f01db669eb963bd43bd01f0d76fa75b50e1b2 not found: ID does not exist" Jan 22 14:07:28 crc kubenswrapper[4743]: I0122 14:07:28.384824 4743 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6523260a-d41b-43ec-a358-316d51466edd-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 22 14:07:28 crc kubenswrapper[4743]: I0122 14:07:28.384854 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6523260a-d41b-43ec-a358-316d51466edd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 14:07:28 crc kubenswrapper[4743]: I0122 14:07:28.384866 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzsgs\" (UniqueName: \"kubernetes.io/projected/6523260a-d41b-43ec-a358-316d51466edd-kube-api-access-qzsgs\") on node \"crc\" DevicePath \"\"" Jan 22 14:07:28 crc kubenswrapper[4743]: I0122 14:07:28.384879 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6523260a-d41b-43ec-a358-316d51466edd-config-data\") on node \"crc\" DevicePath \"\"" Jan 22 14:07:28 crc kubenswrapper[4743]: I0122 14:07:28.386922 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 22 14:07:28 crc kubenswrapper[4743]: I0122 14:07:28.486767 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghgwp\" (UniqueName: \"kubernetes.io/projected/529c10d9-fb76-4b45-8b08-3d9656bfdcd5-kube-api-access-ghgwp\") pod \"nova-scheduler-0\" (UID: \"529c10d9-fb76-4b45-8b08-3d9656bfdcd5\") " pod="openstack/nova-scheduler-0" Jan 22 14:07:28 crc kubenswrapper[4743]: I0122 14:07:28.486844 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/529c10d9-fb76-4b45-8b08-3d9656bfdcd5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"529c10d9-fb76-4b45-8b08-3d9656bfdcd5\") " pod="openstack/nova-scheduler-0" Jan 22 14:07:28 crc kubenswrapper[4743]: I0122 14:07:28.486897 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/529c10d9-fb76-4b45-8b08-3d9656bfdcd5-config-data\") pod \"nova-scheduler-0\" (UID: \"529c10d9-fb76-4b45-8b08-3d9656bfdcd5\") " pod="openstack/nova-scheduler-0" Jan 22 14:07:28 crc kubenswrapper[4743]: I0122 14:07:28.589043 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghgwp\" (UniqueName: \"kubernetes.io/projected/529c10d9-fb76-4b45-8b08-3d9656bfdcd5-kube-api-access-ghgwp\") pod \"nova-scheduler-0\" (UID: \"529c10d9-fb76-4b45-8b08-3d9656bfdcd5\") " pod="openstack/nova-scheduler-0" Jan 22 14:07:28 crc kubenswrapper[4743]: I0122 14:07:28.589117 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/529c10d9-fb76-4b45-8b08-3d9656bfdcd5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"529c10d9-fb76-4b45-8b08-3d9656bfdcd5\") " pod="openstack/nova-scheduler-0" Jan 22 14:07:28 crc kubenswrapper[4743]: I0122 14:07:28.589192 4743 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/529c10d9-fb76-4b45-8b08-3d9656bfdcd5-config-data\") pod \"nova-scheduler-0\" (UID: \"529c10d9-fb76-4b45-8b08-3d9656bfdcd5\") " pod="openstack/nova-scheduler-0" Jan 22 14:07:28 crc kubenswrapper[4743]: I0122 14:07:28.593886 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/529c10d9-fb76-4b45-8b08-3d9656bfdcd5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"529c10d9-fb76-4b45-8b08-3d9656bfdcd5\") " pod="openstack/nova-scheduler-0" Jan 22 14:07:28 crc kubenswrapper[4743]: I0122 14:07:28.594407 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/529c10d9-fb76-4b45-8b08-3d9656bfdcd5-config-data\") pod \"nova-scheduler-0\" (UID: \"529c10d9-fb76-4b45-8b08-3d9656bfdcd5\") " pod="openstack/nova-scheduler-0" Jan 22 14:07:28 crc kubenswrapper[4743]: I0122 14:07:28.607524 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghgwp\" (UniqueName: \"kubernetes.io/projected/529c10d9-fb76-4b45-8b08-3d9656bfdcd5-kube-api-access-ghgwp\") pod \"nova-scheduler-0\" (UID: \"529c10d9-fb76-4b45-8b08-3d9656bfdcd5\") " pod="openstack/nova-scheduler-0" Jan 22 14:07:28 crc kubenswrapper[4743]: I0122 14:07:28.686550 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 22 14:07:28 crc kubenswrapper[4743]: I0122 14:07:28.708406 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 22 14:07:28 crc kubenswrapper[4743]: I0122 14:07:28.716532 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 22 14:07:28 crc kubenswrapper[4743]: I0122 14:07:28.742360 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 22 14:07:28 crc kubenswrapper[4743]: I0122 14:07:28.744480 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 22 14:07:28 crc kubenswrapper[4743]: I0122 14:07:28.754679 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 22 14:07:28 crc kubenswrapper[4743]: I0122 14:07:28.758103 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jan 22 14:07:28 crc kubenswrapper[4743]: I0122 14:07:28.758425 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 22 14:07:29 crc kubenswrapper[4743]: I0122 14:07:28.895521 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc42f0d6-9224-404d-8584-2c0fec4f3edd-config-data\") pod \"nova-metadata-0\" (UID: \"fc42f0d6-9224-404d-8584-2c0fec4f3edd\") " pod="openstack/nova-metadata-0" Jan 22 14:07:29 crc kubenswrapper[4743]: I0122 14:07:28.895662 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc42f0d6-9224-404d-8584-2c0fec4f3edd-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"fc42f0d6-9224-404d-8584-2c0fec4f3edd\") " pod="openstack/nova-metadata-0" Jan 22 14:07:29 crc kubenswrapper[4743]: I0122 14:07:28.895741 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc42f0d6-9224-404d-8584-2c0fec4f3edd-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fc42f0d6-9224-404d-8584-2c0fec4f3edd\") " pod="openstack/nova-metadata-0" Jan 22 14:07:29 crc kubenswrapper[4743]: I0122 14:07:28.895873 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4vsx\" (UniqueName: \"kubernetes.io/projected/fc42f0d6-9224-404d-8584-2c0fec4f3edd-kube-api-access-j4vsx\") pod \"nova-metadata-0\" (UID: \"fc42f0d6-9224-404d-8584-2c0fec4f3edd\") " pod="openstack/nova-metadata-0" Jan 22 14:07:29 crc kubenswrapper[4743]: I0122 14:07:28.895909 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc42f0d6-9224-404d-8584-2c0fec4f3edd-logs\") pod \"nova-metadata-0\" (UID: \"fc42f0d6-9224-404d-8584-2c0fec4f3edd\") " pod="openstack/nova-metadata-0" Jan 22 14:07:29 crc kubenswrapper[4743]: I0122 14:07:28.998206 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc42f0d6-9224-404d-8584-2c0fec4f3edd-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fc42f0d6-9224-404d-8584-2c0fec4f3edd\") " pod="openstack/nova-metadata-0" Jan 22 14:07:29 crc kubenswrapper[4743]: I0122 14:07:28.998434 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4vsx\" (UniqueName: \"kubernetes.io/projected/fc42f0d6-9224-404d-8584-2c0fec4f3edd-kube-api-access-j4vsx\") pod \"nova-metadata-0\" (UID: \"fc42f0d6-9224-404d-8584-2c0fec4f3edd\") " pod="openstack/nova-metadata-0" Jan 22 14:07:29 crc kubenswrapper[4743]: I0122 14:07:28.998491 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc42f0d6-9224-404d-8584-2c0fec4f3edd-logs\") pod \"nova-metadata-0\" (UID: \"fc42f0d6-9224-404d-8584-2c0fec4f3edd\") " 
pod="openstack/nova-metadata-0" Jan 22 14:07:29 crc kubenswrapper[4743]: I0122 14:07:28.998567 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc42f0d6-9224-404d-8584-2c0fec4f3edd-config-data\") pod \"nova-metadata-0\" (UID: \"fc42f0d6-9224-404d-8584-2c0fec4f3edd\") " pod="openstack/nova-metadata-0" Jan 22 14:07:29 crc kubenswrapper[4743]: I0122 14:07:28.998668 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc42f0d6-9224-404d-8584-2c0fec4f3edd-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"fc42f0d6-9224-404d-8584-2c0fec4f3edd\") " pod="openstack/nova-metadata-0" Jan 22 14:07:29 crc kubenswrapper[4743]: I0122 14:07:28.999257 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc42f0d6-9224-404d-8584-2c0fec4f3edd-logs\") pod \"nova-metadata-0\" (UID: \"fc42f0d6-9224-404d-8584-2c0fec4f3edd\") " pod="openstack/nova-metadata-0" Jan 22 14:07:29 crc kubenswrapper[4743]: I0122 14:07:29.004013 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc42f0d6-9224-404d-8584-2c0fec4f3edd-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fc42f0d6-9224-404d-8584-2c0fec4f3edd\") " pod="openstack/nova-metadata-0" Jan 22 14:07:29 crc kubenswrapper[4743]: I0122 14:07:29.004752 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc42f0d6-9224-404d-8584-2c0fec4f3edd-config-data\") pod \"nova-metadata-0\" (UID: \"fc42f0d6-9224-404d-8584-2c0fec4f3edd\") " pod="openstack/nova-metadata-0" Jan 22 14:07:29 crc kubenswrapper[4743]: I0122 14:07:29.006312 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc42f0d6-9224-404d-8584-2c0fec4f3edd-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"fc42f0d6-9224-404d-8584-2c0fec4f3edd\") " pod="openstack/nova-metadata-0" Jan 22 14:07:29 crc kubenswrapper[4743]: I0122 14:07:29.022717 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4vsx\" (UniqueName: \"kubernetes.io/projected/fc42f0d6-9224-404d-8584-2c0fec4f3edd-kube-api-access-j4vsx\") pod \"nova-metadata-0\" (UID: \"fc42f0d6-9224-404d-8584-2c0fec4f3edd\") " pod="openstack/nova-metadata-0" Jan 22 14:07:29 crc kubenswrapper[4743]: I0122 14:07:29.113148 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 22 14:07:29 crc kubenswrapper[4743]: I0122 14:07:29.334365 4743 generic.go:334] "Generic (PLEG): container finished" podID="5b19b071-6f53-4563-b680-ead42caf7b3b" containerID="a3f676561e1a5f03deeaa28fe557ab35ddc3b6532b220b2cba93408a0ace5a4c" exitCode=0 Jan 22 14:07:29 crc kubenswrapper[4743]: I0122 14:07:29.334457 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5b19b071-6f53-4563-b680-ead42caf7b3b","Type":"ContainerDied","Data":"a3f676561e1a5f03deeaa28fe557ab35ddc3b6532b220b2cba93408a0ace5a4c"} Jan 22 14:07:29 crc kubenswrapper[4743]: I0122 14:07:29.671999 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 22 14:07:29 crc kubenswrapper[4743]: I0122 14:07:29.744797 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b19b071-6f53-4563-b680-ead42caf7b3b-internal-tls-certs\") pod \"5b19b071-6f53-4563-b680-ead42caf7b3b\" (UID: \"5b19b071-6f53-4563-b680-ead42caf7b3b\") " Jan 22 14:07:29 crc kubenswrapper[4743]: I0122 14:07:29.744962 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b19b071-6f53-4563-b680-ead42caf7b3b-public-tls-certs\") pod \"5b19b071-6f53-4563-b680-ead42caf7b3b\" (UID: \"5b19b071-6f53-4563-b680-ead42caf7b3b\") " Jan 22 14:07:29 crc kubenswrapper[4743]: W0122 14:07:29.764821 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod529c10d9_fb76_4b45_8b08_3d9656bfdcd5.slice/crio-1e904d978294c7e2b2c07ca5b10cf32644693f0c87331a62f616c86e1fb93c57 WatchSource:0}: Error finding container 1e904d978294c7e2b2c07ca5b10cf32644693f0c87331a62f616c86e1fb93c57: Status 404 returned error can't find the container with id 1e904d978294c7e2b2c07ca5b10cf32644693f0c87331a62f616c86e1fb93c57 Jan 22 14:07:29 crc kubenswrapper[4743]: W0122 14:07:29.768769 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc42f0d6_9224_404d_8584_2c0fec4f3edd.slice/crio-98097eb6dabe63beafae537f292a72205efa78bb5a6b183eb5221e55863b45ec WatchSource:0}: Error finding container 98097eb6dabe63beafae537f292a72205efa78bb5a6b183eb5221e55863b45ec: Status 404 returned error can't find the container with id 98097eb6dabe63beafae537f292a72205efa78bb5a6b183eb5221e55863b45ec Jan 22 14:07:29 crc kubenswrapper[4743]: I0122 14:07:29.771208 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6523260a-d41b-43ec-a358-316d51466edd" path="/var/lib/kubelet/pods/6523260a-d41b-43ec-a358-316d51466edd/volumes" Jan 22 14:07:29 crc kubenswrapper[4743]: I0122 14:07:29.771767 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95a3397b-b84b-4519-8543-3aac6cb34f49" path="/var/lib/kubelet/pods/95a3397b-b84b-4519-8543-3aac6cb34f49/volumes" Jan 22 14:07:29 crc kubenswrapper[4743]: I0122 14:07:29.802262 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b19b071-6f53-4563-b680-ead42caf7b3b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "5b19b071-6f53-4563-b680-ead42caf7b3b" (UID: "5b19b071-6f53-4563-b680-ead42caf7b3b"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:07:29 crc kubenswrapper[4743]: I0122 14:07:29.807175 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b19b071-6f53-4563-b680-ead42caf7b3b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "5b19b071-6f53-4563-b680-ead42caf7b3b" (UID: "5b19b071-6f53-4563-b680-ead42caf7b3b"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:07:29 crc kubenswrapper[4743]: I0122 14:07:29.847650 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hx9th\" (UniqueName: \"kubernetes.io/projected/5b19b071-6f53-4563-b680-ead42caf7b3b-kube-api-access-hx9th\") pod \"5b19b071-6f53-4563-b680-ead42caf7b3b\" (UID: \"5b19b071-6f53-4563-b680-ead42caf7b3b\") " Jan 22 14:07:29 crc kubenswrapper[4743]: I0122 14:07:29.847757 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b19b071-6f53-4563-b680-ead42caf7b3b-logs\") pod \"5b19b071-6f53-4563-b680-ead42caf7b3b\" (UID: \"5b19b071-6f53-4563-b680-ead42caf7b3b\") " Jan 22 14:07:29 crc kubenswrapper[4743]: I0122 14:07:29.847816 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b19b071-6f53-4563-b680-ead42caf7b3b-combined-ca-bundle\") pod \"5b19b071-6f53-4563-b680-ead42caf7b3b\" (UID: \"5b19b071-6f53-4563-b680-ead42caf7b3b\") " Jan 22 14:07:29 crc kubenswrapper[4743]: I0122 14:07:29.847871 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b19b071-6f53-4563-b680-ead42caf7b3b-config-data\") pod \"5b19b071-6f53-4563-b680-ead42caf7b3b\" (UID: \"5b19b071-6f53-4563-b680-ead42caf7b3b\") " Jan 22 14:07:29 crc kubenswrapper[4743]: I0122 14:07:29.848296 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b19b071-6f53-4563-b680-ead42caf7b3b-logs" (OuterVolumeSpecName: "logs") pod "5b19b071-6f53-4563-b680-ead42caf7b3b" (UID: "5b19b071-6f53-4563-b680-ead42caf7b3b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:07:29 crc kubenswrapper[4743]: I0122 14:07:29.851058 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b19b071-6f53-4563-b680-ead42caf7b3b-logs\") on node \"crc\" DevicePath \"\"" Jan 22 14:07:29 crc kubenswrapper[4743]: I0122 14:07:29.851095 4743 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b19b071-6f53-4563-b680-ead42caf7b3b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 22 14:07:29 crc kubenswrapper[4743]: I0122 14:07:29.851110 4743 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b19b071-6f53-4563-b680-ead42caf7b3b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 22 14:07:29 crc kubenswrapper[4743]: I0122 14:07:29.851151 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b19b071-6f53-4563-b680-ead42caf7b3b-kube-api-access-hx9th" (OuterVolumeSpecName: "kube-api-access-hx9th") pod "5b19b071-6f53-4563-b680-ead42caf7b3b" (UID: "5b19b071-6f53-4563-b680-ead42caf7b3b"). InnerVolumeSpecName "kube-api-access-hx9th". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:07:29 crc kubenswrapper[4743]: I0122 14:07:29.861157 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 22 14:07:29 crc kubenswrapper[4743]: I0122 14:07:29.861207 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 22 14:07:29 crc kubenswrapper[4743]: I0122 14:07:29.873254 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b19b071-6f53-4563-b680-ead42caf7b3b-config-data" (OuterVolumeSpecName: "config-data") pod "5b19b071-6f53-4563-b680-ead42caf7b3b" (UID: "5b19b071-6f53-4563-b680-ead42caf7b3b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:07:29 crc kubenswrapper[4743]: I0122 14:07:29.875412 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b19b071-6f53-4563-b680-ead42caf7b3b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5b19b071-6f53-4563-b680-ead42caf7b3b" (UID: "5b19b071-6f53-4563-b680-ead42caf7b3b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:07:29 crc kubenswrapper[4743]: I0122 14:07:29.952673 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hx9th\" (UniqueName: \"kubernetes.io/projected/5b19b071-6f53-4563-b680-ead42caf7b3b-kube-api-access-hx9th\") on node \"crc\" DevicePath \"\"" Jan 22 14:07:29 crc kubenswrapper[4743]: I0122 14:07:29.952706 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b19b071-6f53-4563-b680-ead42caf7b3b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 14:07:29 crc kubenswrapper[4743]: I0122 14:07:29.952715 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b19b071-6f53-4563-b680-ead42caf7b3b-config-data\") on node \"crc\" DevicePath \"\"" Jan 22 14:07:30 crc kubenswrapper[4743]: I0122 14:07:30.049323 4743 patch_prober.go:28] interesting pod/machine-config-daemon-hqgk7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 14:07:30 crc kubenswrapper[4743]: I0122 14:07:30.049377 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 14:07:30 crc kubenswrapper[4743]: I0122 14:07:30.049419 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" Jan 22 14:07:30 crc kubenswrapper[4743]: I0122 14:07:30.050122 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c01bf0abae2b92d5822357a1785b503f9bc33cb24f77d7df7d49f837030ef253"} pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 22 14:07:30 crc kubenswrapper[4743]: I0122 14:07:30.050169 4743 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerName="machine-config-daemon" containerID="cri-o://c01bf0abae2b92d5822357a1785b503f9bc33cb24f77d7df7d49f837030ef253" gracePeriod=600 Jan 22 14:07:30 crc kubenswrapper[4743]: I0122 14:07:30.345375 4743 generic.go:334] "Generic (PLEG): container finished" podID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerID="c01bf0abae2b92d5822357a1785b503f9bc33cb24f77d7df7d49f837030ef253" exitCode=0 Jan 22 14:07:30 crc kubenswrapper[4743]: I0122 14:07:30.345677 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" event={"ID":"3aeba9ba-3a5a-4885-8540-d295aadb311b","Type":"ContainerDied","Data":"c01bf0abae2b92d5822357a1785b503f9bc33cb24f77d7df7d49f837030ef253"} Jan 22 14:07:30 crc kubenswrapper[4743]: I0122 14:07:30.345815 4743 scope.go:117] "RemoveContainer" containerID="2da3d4972818f6459ed6dbf589006b8dd9ab9ee647f4c241b04d0ac146476324" Jan 22 14:07:30 crc kubenswrapper[4743]: I0122 14:07:30.347818 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fc42f0d6-9224-404d-8584-2c0fec4f3edd","Type":"ContainerStarted","Data":"2f3883d137e5771c20912ac7c314a797a0f3289e6be62d67ac5136ed3cce4e14"} Jan 22 14:07:30 crc kubenswrapper[4743]: I0122 14:07:30.347849 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fc42f0d6-9224-404d-8584-2c0fec4f3edd","Type":"ContainerStarted","Data":"98097eb6dabe63beafae537f292a72205efa78bb5a6b183eb5221e55863b45ec"} Jan 22 14:07:30 crc kubenswrapper[4743]: I0122 14:07:30.349366 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5b19b071-6f53-4563-b680-ead42caf7b3b","Type":"ContainerDied","Data":"24c17d6c029f76f1f4d8cc23ce29fc27169a2f21d79558b2e3a92d70ba77b90a"} Jan 22 14:07:30 crc kubenswrapper[4743]: I0122 14:07:30.349440 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 22 14:07:30 crc kubenswrapper[4743]: I0122 14:07:30.352745 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"529c10d9-fb76-4b45-8b08-3d9656bfdcd5","Type":"ContainerStarted","Data":"04fdaf831621030778ad76654b4e30b89349890e8dd141d93d8e92af883409d2"} Jan 22 14:07:30 crc kubenswrapper[4743]: I0122 14:07:30.352867 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"529c10d9-fb76-4b45-8b08-3d9656bfdcd5","Type":"ContainerStarted","Data":"1e904d978294c7e2b2c07ca5b10cf32644693f0c87331a62f616c86e1fb93c57"} Jan 22 14:07:30 crc kubenswrapper[4743]: I0122 14:07:30.376083 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.3760613409999998 podStartE2EDuration="2.376061341s" podCreationTimestamp="2026-01-22 14:07:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:07:30.370348886 +0000 UTC m=+1286.925392069" watchObservedRunningTime="2026-01-22 14:07:30.376061341 +0000 UTC m=+1286.931104514" Jan 22 14:07:30 crc kubenswrapper[4743]: I0122 14:07:30.386830 4743 scope.go:117] "RemoveContainer" containerID="a3f676561e1a5f03deeaa28fe557ab35ddc3b6532b220b2cba93408a0ace5a4c" Jan 22 14:07:30 crc kubenswrapper[4743]: I0122 14:07:30.394682 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 22 14:07:30 crc kubenswrapper[4743]: I0122 14:07:30.408780 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 22 14:07:30 crc kubenswrapper[4743]: I0122 14:07:30.421012 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 22 14:07:30 crc kubenswrapper[4743]: E0122 14:07:30.421467 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b19b071-6f53-4563-b680-ead42caf7b3b" containerName="nova-api-log" Jan 22 14:07:30 crc kubenswrapper[4743]: I0122 14:07:30.421483 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b19b071-6f53-4563-b680-ead42caf7b3b" containerName="nova-api-log" Jan 22 14:07:30 crc kubenswrapper[4743]: E0122 14:07:30.421514 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b19b071-6f53-4563-b680-ead42caf7b3b" containerName="nova-api-api" Jan 22 14:07:30 crc kubenswrapper[4743]: I0122 14:07:30.421521 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b19b071-6f53-4563-b680-ead42caf7b3b" containerName="nova-api-api" Jan 22 14:07:30 crc kubenswrapper[4743]: I0122 14:07:30.421748 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b19b071-6f53-4563-b680-ead42caf7b3b" containerName="nova-api-log" Jan 22 14:07:30 crc kubenswrapper[4743]: I0122 14:07:30.421757 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b19b071-6f53-4563-b680-ead42caf7b3b" containerName="nova-api-api" Jan 22 14:07:30 crc kubenswrapper[4743]: I0122 14:07:30.422877 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 22 14:07:30 crc kubenswrapper[4743]: I0122 14:07:30.424981 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Jan 22 14:07:30 crc kubenswrapper[4743]: I0122 14:07:30.425417 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Jan 22 14:07:30 crc kubenswrapper[4743]: I0122 14:07:30.425575 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 22 14:07:30 crc kubenswrapper[4743]: I0122 14:07:30.432236 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 22 14:07:30 crc kubenswrapper[4743]: I0122 14:07:30.445717 4743 scope.go:117] "RemoveContainer" containerID="fae973dc542ff70efbb81373f2a5a891fd9ec30052f8b55785736520905ed83f" Jan 22 14:07:30 crc kubenswrapper[4743]: I0122 14:07:30.564617 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e08ea55-209f-4956-b9cb-c261280252ad-config-data\") pod \"nova-api-0\" (UID: \"5e08ea55-209f-4956-b9cb-c261280252ad\") " pod="openstack/nova-api-0" Jan 22 14:07:30 crc kubenswrapper[4743]: I0122 14:07:30.564711 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e08ea55-209f-4956-b9cb-c261280252ad-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5e08ea55-209f-4956-b9cb-c261280252ad\") " pod="openstack/nova-api-0" Jan 22 14:07:30 crc kubenswrapper[4743]: I0122 14:07:30.564761 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e08ea55-209f-4956-b9cb-c261280252ad-public-tls-certs\") pod \"nova-api-0\" (UID: \"5e08ea55-209f-4956-b9cb-c261280252ad\") " pod="openstack/nova-api-0" Jan 22 14:07:30 crc kubenswrapper[4743]: I0122 14:07:30.565066 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbxlc\" (UniqueName: \"kubernetes.io/projected/5e08ea55-209f-4956-b9cb-c261280252ad-kube-api-access-tbxlc\") pod \"nova-api-0\" (UID: \"5e08ea55-209f-4956-b9cb-c261280252ad\") " pod="openstack/nova-api-0" Jan 22 14:07:30 crc kubenswrapper[4743]: I0122 14:07:30.565196 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e08ea55-209f-4956-b9cb-c261280252ad-logs\") pod \"nova-api-0\" (UID: \"5e08ea55-209f-4956-b9cb-c261280252ad\") " pod="openstack/nova-api-0" Jan 22 14:07:30 crc kubenswrapper[4743]: I0122 14:07:30.565233 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e08ea55-209f-4956-b9cb-c261280252ad-internal-tls-certs\") pod \"nova-api-0\" (UID: \"5e08ea55-209f-4956-b9cb-c261280252ad\") " pod="openstack/nova-api-0" Jan 22 14:07:30 crc kubenswrapper[4743]: I0122 14:07:30.666494 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e08ea55-209f-4956-b9cb-c261280252ad-internal-tls-certs\") pod \"nova-api-0\" (UID: \"5e08ea55-209f-4956-b9cb-c261280252ad\") " pod="openstack/nova-api-0" Jan 22 14:07:30 crc kubenswrapper[4743]: I0122 14:07:30.666543 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e08ea55-209f-4956-b9cb-c261280252ad-config-data\") pod \"nova-api-0\" (UID: \"5e08ea55-209f-4956-b9cb-c261280252ad\") " pod="openstack/nova-api-0" Jan 22 14:07:30 crc kubenswrapper[4743]: I0122 14:07:30.666629 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e08ea55-209f-4956-b9cb-c261280252ad-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5e08ea55-209f-4956-b9cb-c261280252ad\") " pod="openstack/nova-api-0" Jan 22 14:07:30 crc kubenswrapper[4743]: I0122 14:07:30.666679 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e08ea55-209f-4956-b9cb-c261280252ad-public-tls-certs\") pod \"nova-api-0\" (UID: \"5e08ea55-209f-4956-b9cb-c261280252ad\") " pod="openstack/nova-api-0" Jan 22 14:07:30 crc kubenswrapper[4743]: I0122 14:07:30.666704 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbxlc\" (UniqueName: \"kubernetes.io/projected/5e08ea55-209f-4956-b9cb-c261280252ad-kube-api-access-tbxlc\") pod \"nova-api-0\" (UID: \"5e08ea55-209f-4956-b9cb-c261280252ad\") " pod="openstack/nova-api-0" Jan 22 14:07:30 crc kubenswrapper[4743]: I0122 14:07:30.666764 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e08ea55-209f-4956-b9cb-c261280252ad-logs\") pod \"nova-api-0\" (UID: \"5e08ea55-209f-4956-b9cb-c261280252ad\") " pod="openstack/nova-api-0" Jan 22 14:07:30 crc kubenswrapper[4743]: I0122 14:07:30.667154 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e08ea55-209f-4956-b9cb-c261280252ad-logs\") pod \"nova-api-0\" (UID: \"5e08ea55-209f-4956-b9cb-c261280252ad\") " pod="openstack/nova-api-0" Jan 22 14:07:30 crc kubenswrapper[4743]: I0122 14:07:30.671976 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e08ea55-209f-4956-b9cb-c261280252ad-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5e08ea55-209f-4956-b9cb-c261280252ad\") " pod="openstack/nova-api-0" Jan 22 14:07:30 crc kubenswrapper[4743]: I0122 14:07:30.674260 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e08ea55-209f-4956-b9cb-c261280252ad-internal-tls-certs\") pod \"nova-api-0\" (UID: \"5e08ea55-209f-4956-b9cb-c261280252ad\") " pod="openstack/nova-api-0" Jan 22 14:07:30 crc kubenswrapper[4743]: I0122 14:07:30.675132 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e08ea55-209f-4956-b9cb-c261280252ad-config-data\") pod \"nova-api-0\" (UID: \"5e08ea55-209f-4956-b9cb-c261280252ad\") " pod="openstack/nova-api-0" Jan 22 14:07:30 crc kubenswrapper[4743]: I0122 14:07:30.681829 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e08ea55-209f-4956-b9cb-c261280252ad-public-tls-certs\") pod \"nova-api-0\" (UID: \"5e08ea55-209f-4956-b9cb-c261280252ad\") " pod="openstack/nova-api-0" Jan 22 14:07:30 crc kubenswrapper[4743]: I0122 14:07:30.683976 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbxlc\" (UniqueName: 
\"kubernetes.io/projected/5e08ea55-209f-4956-b9cb-c261280252ad-kube-api-access-tbxlc\") pod \"nova-api-0\" (UID: \"5e08ea55-209f-4956-b9cb-c261280252ad\") " pod="openstack/nova-api-0" Jan 22 14:07:30 crc kubenswrapper[4743]: I0122 14:07:30.744640 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 22 14:07:30 crc kubenswrapper[4743]: E0122 14:07:30.863483 4743 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84f089c9_1a96_40ce_879d_4220b824f089.slice\": RecentStats: unable to find data in memory cache]" Jan 22 14:07:31 crc kubenswrapper[4743]: I0122 14:07:31.176703 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 22 14:07:31 crc kubenswrapper[4743]: I0122 14:07:31.361106 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fc42f0d6-9224-404d-8584-2c0fec4f3edd","Type":"ContainerStarted","Data":"62ed775a9ce2429eac357dbac206b12ede1bcba836f15e23c4fb434fa9a74cdb"} Jan 22 14:07:31 crc kubenswrapper[4743]: I0122 14:07:31.363679 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" event={"ID":"3aeba9ba-3a5a-4885-8540-d295aadb311b","Type":"ContainerStarted","Data":"1e51a1e4567f044d1ba62e15ca7baaab6948208e9d675c95afad01e5e442ca14"} Jan 22 14:07:31 crc kubenswrapper[4743]: I0122 14:07:31.366043 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5e08ea55-209f-4956-b9cb-c261280252ad","Type":"ContainerStarted","Data":"ca2fb926b9afb7999a298cdf0d79e56f210c64a704c114caab2acd7cb7e378c1"} Jan 22 14:07:31 crc kubenswrapper[4743]: I0122 14:07:31.366105 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5e08ea55-209f-4956-b9cb-c261280252ad","Type":"ContainerStarted","Data":"de716a3e7d33064f0b99acf6725aee9e947f0a9ef47cf842b8290f0b14121d0f"} Jan 22 14:07:31 crc kubenswrapper[4743]: I0122 14:07:31.388999 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.388978073 podStartE2EDuration="3.388978073s" podCreationTimestamp="2026-01-22 14:07:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:07:31.378409825 +0000 UTC m=+1287.933452988" watchObservedRunningTime="2026-01-22 14:07:31.388978073 +0000 UTC m=+1287.944021236" Jan 22 14:07:31 crc kubenswrapper[4743]: I0122 14:07:31.757037 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b19b071-6f53-4563-b680-ead42caf7b3b" path="/var/lib/kubelet/pods/5b19b071-6f53-4563-b680-ead42caf7b3b/volumes" Jan 22 14:07:32 crc kubenswrapper[4743]: I0122 14:07:32.234707 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gpbsv"] Jan 22 14:07:32 crc kubenswrapper[4743]: I0122 14:07:32.237278 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gpbsv" Jan 22 14:07:32 crc kubenswrapper[4743]: I0122 14:07:32.249174 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gpbsv"] Jan 22 14:07:32 crc kubenswrapper[4743]: I0122 14:07:32.383566 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5e08ea55-209f-4956-b9cb-c261280252ad","Type":"ContainerStarted","Data":"f8291f980069025e9548209b23f11d6619d5efa77d6eeadb9e0528153c6aa61a"} Jan 22 14:07:32 crc kubenswrapper[4743]: I0122 14:07:32.412339 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfkmt\" (UniqueName: \"kubernetes.io/projected/60368914-6922-43a0-8a71-384eabf32a78-kube-api-access-pfkmt\") pod \"redhat-operators-gpbsv\" (UID: \"60368914-6922-43a0-8a71-384eabf32a78\") " pod="openshift-marketplace/redhat-operators-gpbsv" Jan 22 14:07:32 crc kubenswrapper[4743]: I0122 14:07:32.412499 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60368914-6922-43a0-8a71-384eabf32a78-catalog-content\") pod \"redhat-operators-gpbsv\" (UID: \"60368914-6922-43a0-8a71-384eabf32a78\") " pod="openshift-marketplace/redhat-operators-gpbsv" Jan 22 14:07:32 crc kubenswrapper[4743]: I0122 14:07:32.412682 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60368914-6922-43a0-8a71-384eabf32a78-utilities\") pod \"redhat-operators-gpbsv\" (UID: \"60368914-6922-43a0-8a71-384eabf32a78\") " pod="openshift-marketplace/redhat-operators-gpbsv" Jan 22 14:07:32 crc kubenswrapper[4743]: I0122 14:07:32.418855 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.418487937 podStartE2EDuration="2.418487937s" podCreationTimestamp="2026-01-22 14:07:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:07:32.404882387 +0000 UTC m=+1288.959925560" watchObservedRunningTime="2026-01-22 14:07:32.418487937 +0000 UTC m=+1288.973531100" Jan 22 14:07:32 crc kubenswrapper[4743]: I0122 14:07:32.514919 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfkmt\" (UniqueName: \"kubernetes.io/projected/60368914-6922-43a0-8a71-384eabf32a78-kube-api-access-pfkmt\") pod \"redhat-operators-gpbsv\" (UID: \"60368914-6922-43a0-8a71-384eabf32a78\") " pod="openshift-marketplace/redhat-operators-gpbsv" Jan 22 14:07:32 crc kubenswrapper[4743]: I0122 14:07:32.515577 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60368914-6922-43a0-8a71-384eabf32a78-catalog-content\") pod \"redhat-operators-gpbsv\" (UID: \"60368914-6922-43a0-8a71-384eabf32a78\") " pod="openshift-marketplace/redhat-operators-gpbsv" Jan 22 14:07:32 crc kubenswrapper[4743]: I0122 14:07:32.516431 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60368914-6922-43a0-8a71-384eabf32a78-utilities\") pod \"redhat-operators-gpbsv\" (UID: \"60368914-6922-43a0-8a71-384eabf32a78\") " pod="openshift-marketplace/redhat-operators-gpbsv" Jan 22 14:07:32 crc kubenswrapper[4743]: I0122 
14:07:32.516738 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60368914-6922-43a0-8a71-384eabf32a78-utilities\") pod \"redhat-operators-gpbsv\" (UID: \"60368914-6922-43a0-8a71-384eabf32a78\") " pod="openshift-marketplace/redhat-operators-gpbsv" Jan 22 14:07:32 crc kubenswrapper[4743]: I0122 14:07:32.517567 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60368914-6922-43a0-8a71-384eabf32a78-catalog-content\") pod \"redhat-operators-gpbsv\" (UID: \"60368914-6922-43a0-8a71-384eabf32a78\") " pod="openshift-marketplace/redhat-operators-gpbsv" Jan 22 14:07:32 crc kubenswrapper[4743]: I0122 14:07:32.534880 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfkmt\" (UniqueName: \"kubernetes.io/projected/60368914-6922-43a0-8a71-384eabf32a78-kube-api-access-pfkmt\") pod \"redhat-operators-gpbsv\" (UID: \"60368914-6922-43a0-8a71-384eabf32a78\") " pod="openshift-marketplace/redhat-operators-gpbsv" Jan 22 14:07:32 crc kubenswrapper[4743]: I0122 14:07:32.574882 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gpbsv" Jan 22 14:07:33 crc kubenswrapper[4743]: I0122 14:07:33.074685 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gpbsv"] Jan 22 14:07:33 crc kubenswrapper[4743]: I0122 14:07:33.394748 4743 generic.go:334] "Generic (PLEG): container finished" podID="60368914-6922-43a0-8a71-384eabf32a78" containerID="a036facc8fe26a746d16d529789f6dcdd9ae2bc9436e747874fa7e76a92abef1" exitCode=0 Jan 22 14:07:33 crc kubenswrapper[4743]: I0122 14:07:33.394893 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gpbsv" event={"ID":"60368914-6922-43a0-8a71-384eabf32a78","Type":"ContainerDied","Data":"a036facc8fe26a746d16d529789f6dcdd9ae2bc9436e747874fa7e76a92abef1"} Jan 22 14:07:33 crc kubenswrapper[4743]: I0122 14:07:33.394941 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gpbsv" event={"ID":"60368914-6922-43a0-8a71-384eabf32a78","Type":"ContainerStarted","Data":"926bb3597743b5a27ebdb2664b2aebbd818f7dddc46e66a10c510192d4e51108"} Jan 22 14:07:33 crc kubenswrapper[4743]: I0122 14:07:33.687735 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 22 14:07:34 crc kubenswrapper[4743]: I0122 14:07:34.113769 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 22 14:07:34 crc kubenswrapper[4743]: I0122 14:07:34.114037 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 22 14:07:35 crc kubenswrapper[4743]: I0122 14:07:35.413721 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gpbsv" event={"ID":"60368914-6922-43a0-8a71-384eabf32a78","Type":"ContainerStarted","Data":"415f75fb23f1fde57d70eadda8a60ba729f07e4d6b264077d6d3652ebdced18b"} Jan 22 14:07:37 crc kubenswrapper[4743]: I0122 14:07:37.434530 4743 generic.go:334] "Generic (PLEG): container finished" podID="60368914-6922-43a0-8a71-384eabf32a78" containerID="415f75fb23f1fde57d70eadda8a60ba729f07e4d6b264077d6d3652ebdced18b" exitCode=0 Jan 22 14:07:37 crc kubenswrapper[4743]: I0122 14:07:37.434623 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-gpbsv" event={"ID":"60368914-6922-43a0-8a71-384eabf32a78","Type":"ContainerDied","Data":"415f75fb23f1fde57d70eadda8a60ba729f07e4d6b264077d6d3652ebdced18b"} Jan 22 14:07:38 crc kubenswrapper[4743]: I0122 14:07:38.687607 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 22 14:07:38 crc kubenswrapper[4743]: I0122 14:07:38.715160 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 22 14:07:39 crc kubenswrapper[4743]: I0122 14:07:39.114130 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 22 14:07:39 crc kubenswrapper[4743]: I0122 14:07:39.115914 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 22 14:07:39 crc kubenswrapper[4743]: I0122 14:07:39.463707 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gpbsv" event={"ID":"60368914-6922-43a0-8a71-384eabf32a78","Type":"ContainerStarted","Data":"bdaade2e500533975899843609cee1462ed1af6868c251c3013efb5aa68e699b"} Jan 22 14:07:39 crc kubenswrapper[4743]: I0122 14:07:39.527646 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gpbsv" podStartSLOduration=2.303965185 podStartE2EDuration="7.527611572s" podCreationTimestamp="2026-01-22 14:07:32 +0000 UTC" firstStartedPulling="2026-01-22 14:07:33.396382007 +0000 UTC m=+1289.951425170" lastFinishedPulling="2026-01-22 14:07:38.620028394 +0000 UTC m=+1295.175071557" observedRunningTime="2026-01-22 14:07:39.494514212 +0000 UTC m=+1296.049557415" watchObservedRunningTime="2026-01-22 14:07:39.527611572 +0000 UTC m=+1296.082654755" Jan 22 14:07:39 crc kubenswrapper[4743]: I0122 14:07:39.549357 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 22 14:07:40 crc kubenswrapper[4743]: I0122 14:07:40.131065 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="fc42f0d6-9224-404d-8584-2c0fec4f3edd" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.209:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 22 14:07:40 crc kubenswrapper[4743]: I0122 14:07:40.131099 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="fc42f0d6-9224-404d-8584-2c0fec4f3edd" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.209:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 22 14:07:40 crc kubenswrapper[4743]: I0122 14:07:40.745388 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 22 14:07:40 crc kubenswrapper[4743]: I0122 14:07:40.745459 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 22 14:07:41 crc kubenswrapper[4743]: E0122 14:07:41.065661 4743 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84f089c9_1a96_40ce_879d_4220b824f089.slice\": RecentStats: unable to find data in memory cache]" Jan 22 14:07:41 crc kubenswrapper[4743]: I0122 14:07:41.757993 4743 prober.go:107] "Probe failed" probeType="Startup" 
pod="openstack/nova-api-0" podUID="5e08ea55-209f-4956-b9cb-c261280252ad" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.210:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 22 14:07:41 crc kubenswrapper[4743]: I0122 14:07:41.757997 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5e08ea55-209f-4956-b9cb-c261280252ad" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.210:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 22 14:07:42 crc kubenswrapper[4743]: I0122 14:07:42.575364 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gpbsv" Jan 22 14:07:42 crc kubenswrapper[4743]: I0122 14:07:42.575422 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gpbsv" Jan 22 14:07:43 crc kubenswrapper[4743]: I0122 14:07:43.521033 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Jan 22 14:07:43 crc kubenswrapper[4743]: I0122 14:07:43.624247 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gpbsv" podUID="60368914-6922-43a0-8a71-384eabf32a78" containerName="registry-server" probeResult="failure" output=< Jan 22 14:07:43 crc kubenswrapper[4743]: timeout: failed to connect service ":50051" within 1s Jan 22 14:07:43 crc kubenswrapper[4743]: > Jan 22 14:07:49 crc kubenswrapper[4743]: I0122 14:07:49.119710 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 22 14:07:49 crc kubenswrapper[4743]: I0122 14:07:49.120433 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 22 14:07:49 crc kubenswrapper[4743]: I0122 14:07:49.128554 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 22 14:07:49 crc kubenswrapper[4743]: I0122 14:07:49.130009 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 22 14:07:50 crc kubenswrapper[4743]: I0122 14:07:50.752236 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 22 14:07:50 crc kubenswrapper[4743]: I0122 14:07:50.753661 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 22 14:07:50 crc kubenswrapper[4743]: I0122 14:07:50.755751 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 22 14:07:50 crc kubenswrapper[4743]: I0122 14:07:50.766847 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 22 14:07:51 crc kubenswrapper[4743]: E0122 14:07:51.289172 4743 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84f089c9_1a96_40ce_879d_4220b824f089.slice\": RecentStats: unable to find data in memory cache]" Jan 22 14:07:51 crc kubenswrapper[4743]: I0122 14:07:51.588136 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 22 14:07:51 crc kubenswrapper[4743]: I0122 14:07:51.600181 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" 
Jan 22 14:07:52 crc kubenswrapper[4743]: I0122 14:07:52.647320 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gpbsv" Jan 22 14:07:52 crc kubenswrapper[4743]: I0122 14:07:52.702845 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gpbsv" Jan 22 14:07:52 crc kubenswrapper[4743]: I0122 14:07:52.893593 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gpbsv"] Jan 22 14:07:54 crc kubenswrapper[4743]: I0122 14:07:54.614137 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gpbsv" podUID="60368914-6922-43a0-8a71-384eabf32a78" containerName="registry-server" containerID="cri-o://bdaade2e500533975899843609cee1462ed1af6868c251c3013efb5aa68e699b" gracePeriod=2 Jan 22 14:07:55 crc kubenswrapper[4743]: I0122 14:07:55.059429 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gpbsv" Jan 22 14:07:55 crc kubenswrapper[4743]: I0122 14:07:55.140820 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfkmt\" (UniqueName: \"kubernetes.io/projected/60368914-6922-43a0-8a71-384eabf32a78-kube-api-access-pfkmt\") pod \"60368914-6922-43a0-8a71-384eabf32a78\" (UID: \"60368914-6922-43a0-8a71-384eabf32a78\") " Jan 22 14:07:55 crc kubenswrapper[4743]: I0122 14:07:55.140971 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60368914-6922-43a0-8a71-384eabf32a78-catalog-content\") pod \"60368914-6922-43a0-8a71-384eabf32a78\" (UID: \"60368914-6922-43a0-8a71-384eabf32a78\") " Jan 22 14:07:55 crc kubenswrapper[4743]: I0122 14:07:55.140998 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60368914-6922-43a0-8a71-384eabf32a78-utilities\") pod \"60368914-6922-43a0-8a71-384eabf32a78\" (UID: \"60368914-6922-43a0-8a71-384eabf32a78\") " Jan 22 14:07:55 crc kubenswrapper[4743]: I0122 14:07:55.142131 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60368914-6922-43a0-8a71-384eabf32a78-utilities" (OuterVolumeSpecName: "utilities") pod "60368914-6922-43a0-8a71-384eabf32a78" (UID: "60368914-6922-43a0-8a71-384eabf32a78"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:07:55 crc kubenswrapper[4743]: I0122 14:07:55.149121 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60368914-6922-43a0-8a71-384eabf32a78-kube-api-access-pfkmt" (OuterVolumeSpecName: "kube-api-access-pfkmt") pod "60368914-6922-43a0-8a71-384eabf32a78" (UID: "60368914-6922-43a0-8a71-384eabf32a78"). InnerVolumeSpecName "kube-api-access-pfkmt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:07:55 crc kubenswrapper[4743]: I0122 14:07:55.243854 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfkmt\" (UniqueName: \"kubernetes.io/projected/60368914-6922-43a0-8a71-384eabf32a78-kube-api-access-pfkmt\") on node \"crc\" DevicePath \"\"" Jan 22 14:07:55 crc kubenswrapper[4743]: I0122 14:07:55.243907 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60368914-6922-43a0-8a71-384eabf32a78-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 14:07:55 crc kubenswrapper[4743]: I0122 14:07:55.254101 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60368914-6922-43a0-8a71-384eabf32a78-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "60368914-6922-43a0-8a71-384eabf32a78" (UID: "60368914-6922-43a0-8a71-384eabf32a78"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:07:55 crc kubenswrapper[4743]: I0122 14:07:55.346117 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60368914-6922-43a0-8a71-384eabf32a78-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 14:07:55 crc kubenswrapper[4743]: I0122 14:07:55.623367 4743 generic.go:334] "Generic (PLEG): container finished" podID="60368914-6922-43a0-8a71-384eabf32a78" containerID="bdaade2e500533975899843609cee1462ed1af6868c251c3013efb5aa68e699b" exitCode=0 Jan 22 14:07:55 crc kubenswrapper[4743]: I0122 14:07:55.623413 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gpbsv" event={"ID":"60368914-6922-43a0-8a71-384eabf32a78","Type":"ContainerDied","Data":"bdaade2e500533975899843609cee1462ed1af6868c251c3013efb5aa68e699b"} Jan 22 14:07:55 crc kubenswrapper[4743]: I0122 14:07:55.623419 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gpbsv" Jan 22 14:07:55 crc kubenswrapper[4743]: I0122 14:07:55.623443 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gpbsv" event={"ID":"60368914-6922-43a0-8a71-384eabf32a78","Type":"ContainerDied","Data":"926bb3597743b5a27ebdb2664b2aebbd818f7dddc46e66a10c510192d4e51108"} Jan 22 14:07:55 crc kubenswrapper[4743]: I0122 14:07:55.623462 4743 scope.go:117] "RemoveContainer" containerID="bdaade2e500533975899843609cee1462ed1af6868c251c3013efb5aa68e699b" Jan 22 14:07:55 crc kubenswrapper[4743]: I0122 14:07:55.660255 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gpbsv"] Jan 22 14:07:55 crc kubenswrapper[4743]: I0122 14:07:55.667362 4743 scope.go:117] "RemoveContainer" containerID="415f75fb23f1fde57d70eadda8a60ba729f07e4d6b264077d6d3652ebdced18b" Jan 22 14:07:55 crc kubenswrapper[4743]: I0122 14:07:55.669529 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gpbsv"] Jan 22 14:07:55 crc kubenswrapper[4743]: I0122 14:07:55.690235 4743 scope.go:117] "RemoveContainer" containerID="a036facc8fe26a746d16d529789f6dcdd9ae2bc9436e747874fa7e76a92abef1" Jan 22 14:07:55 crc kubenswrapper[4743]: I0122 14:07:55.729307 4743 scope.go:117] "RemoveContainer" containerID="bdaade2e500533975899843609cee1462ed1af6868c251c3013efb5aa68e699b" Jan 22 14:07:55 crc kubenswrapper[4743]: E0122 14:07:55.730177 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdaade2e500533975899843609cee1462ed1af6868c251c3013efb5aa68e699b\": container with ID starting with bdaade2e500533975899843609cee1462ed1af6868c251c3013efb5aa68e699b not found: ID does not exist" containerID="bdaade2e500533975899843609cee1462ed1af6868c251c3013efb5aa68e699b" Jan 22 14:07:55 crc kubenswrapper[4743]: I0122 14:07:55.730229 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdaade2e500533975899843609cee1462ed1af6868c251c3013efb5aa68e699b"} err="failed to get container status \"bdaade2e500533975899843609cee1462ed1af6868c251c3013efb5aa68e699b\": rpc error: code = NotFound desc = could not find container \"bdaade2e500533975899843609cee1462ed1af6868c251c3013efb5aa68e699b\": container with ID starting with bdaade2e500533975899843609cee1462ed1af6868c251c3013efb5aa68e699b not found: ID does not exist" Jan 22 14:07:55 crc kubenswrapper[4743]: I0122 14:07:55.730256 4743 scope.go:117] "RemoveContainer" containerID="415f75fb23f1fde57d70eadda8a60ba729f07e4d6b264077d6d3652ebdced18b" Jan 22 14:07:55 crc kubenswrapper[4743]: E0122 14:07:55.730631 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"415f75fb23f1fde57d70eadda8a60ba729f07e4d6b264077d6d3652ebdced18b\": container with ID starting with 415f75fb23f1fde57d70eadda8a60ba729f07e4d6b264077d6d3652ebdced18b not found: ID does not exist" containerID="415f75fb23f1fde57d70eadda8a60ba729f07e4d6b264077d6d3652ebdced18b" Jan 22 14:07:55 crc kubenswrapper[4743]: I0122 14:07:55.730665 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"415f75fb23f1fde57d70eadda8a60ba729f07e4d6b264077d6d3652ebdced18b"} err="failed to get container status \"415f75fb23f1fde57d70eadda8a60ba729f07e4d6b264077d6d3652ebdced18b\": rpc error: code = NotFound desc = could not find container 
\"415f75fb23f1fde57d70eadda8a60ba729f07e4d6b264077d6d3652ebdced18b\": container with ID starting with 415f75fb23f1fde57d70eadda8a60ba729f07e4d6b264077d6d3652ebdced18b not found: ID does not exist" Jan 22 14:07:55 crc kubenswrapper[4743]: I0122 14:07:55.730687 4743 scope.go:117] "RemoveContainer" containerID="a036facc8fe26a746d16d529789f6dcdd9ae2bc9436e747874fa7e76a92abef1" Jan 22 14:07:55 crc kubenswrapper[4743]: E0122 14:07:55.731008 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a036facc8fe26a746d16d529789f6dcdd9ae2bc9436e747874fa7e76a92abef1\": container with ID starting with a036facc8fe26a746d16d529789f6dcdd9ae2bc9436e747874fa7e76a92abef1 not found: ID does not exist" containerID="a036facc8fe26a746d16d529789f6dcdd9ae2bc9436e747874fa7e76a92abef1" Jan 22 14:07:55 crc kubenswrapper[4743]: I0122 14:07:55.731041 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a036facc8fe26a746d16d529789f6dcdd9ae2bc9436e747874fa7e76a92abef1"} err="failed to get container status \"a036facc8fe26a746d16d529789f6dcdd9ae2bc9436e747874fa7e76a92abef1\": rpc error: code = NotFound desc = could not find container \"a036facc8fe26a746d16d529789f6dcdd9ae2bc9436e747874fa7e76a92abef1\": container with ID starting with a036facc8fe26a746d16d529789f6dcdd9ae2bc9436e747874fa7e76a92abef1 not found: ID does not exist" Jan 22 14:07:55 crc kubenswrapper[4743]: I0122 14:07:55.779478 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60368914-6922-43a0-8a71-384eabf32a78" path="/var/lib/kubelet/pods/60368914-6922-43a0-8a71-384eabf32a78/volumes" Jan 22 14:08:01 crc kubenswrapper[4743]: I0122 14:08:01.112275 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 22 14:08:01 crc kubenswrapper[4743]: E0122 14:08:01.529826 4743 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84f089c9_1a96_40ce_879d_4220b824f089.slice\": RecentStats: unable to find data in memory cache]" Jan 22 14:08:02 crc kubenswrapper[4743]: I0122 14:08:02.080250 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 22 14:08:05 crc kubenswrapper[4743]: I0122 14:08:05.157239 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="7926697b-86b3-4f82-97e1-3c0d7ae9f867" containerName="rabbitmq" containerID="cri-o://88ad177103856f694614a6b9e5f6bae1c7f5ed12538013f4ace91ef50e5f1850" gracePeriod=604796 Jan 22 14:08:05 crc kubenswrapper[4743]: I0122 14:08:05.842230 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="a474b98d-9569-40f4-a3d2-f4017988678b" containerName="rabbitmq" containerID="cri-o://76b9e7346567553a710a187cdaad499c9f2b1a58c5fba084587f9b361cba7e40" gracePeriod=604797 Jan 22 14:08:08 crc kubenswrapper[4743]: I0122 14:08:08.892210 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="7926697b-86b3-4f82-97e1-3c0d7ae9f867" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.98:5671: connect: connection refused" Jan 22 14:08:08 crc kubenswrapper[4743]: I0122 14:08:08.962997 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" 
podUID="a474b98d-9569-40f4-a3d2-f4017988678b" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.99:5671: connect: connection refused" Jan 22 14:08:11 crc kubenswrapper[4743]: I0122 14:08:11.691056 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 22 14:08:11 crc kubenswrapper[4743]: I0122 14:08:11.796662 4743 generic.go:334] "Generic (PLEG): container finished" podID="7926697b-86b3-4f82-97e1-3c0d7ae9f867" containerID="88ad177103856f694614a6b9e5f6bae1c7f5ed12538013f4ace91ef50e5f1850" exitCode=0 Jan 22 14:08:11 crc kubenswrapper[4743]: I0122 14:08:11.796703 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7926697b-86b3-4f82-97e1-3c0d7ae9f867","Type":"ContainerDied","Data":"88ad177103856f694614a6b9e5f6bae1c7f5ed12538013f4ace91ef50e5f1850"} Jan 22 14:08:11 crc kubenswrapper[4743]: I0122 14:08:11.796729 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7926697b-86b3-4f82-97e1-3c0d7ae9f867","Type":"ContainerDied","Data":"1e2a93d64730a9a5f66c5f9204aa7992a43d89db9b3cbc128490070dd3487ec5"} Jan 22 14:08:11 crc kubenswrapper[4743]: I0122 14:08:11.796747 4743 scope.go:117] "RemoveContainer" containerID="88ad177103856f694614a6b9e5f6bae1c7f5ed12538013f4ace91ef50e5f1850" Jan 22 14:08:11 crc kubenswrapper[4743]: I0122 14:08:11.797231 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 22 14:08:11 crc kubenswrapper[4743]: I0122 14:08:11.837001 4743 scope.go:117] "RemoveContainer" containerID="1394518f22ac074a6608a46a959c477744a1809c57329177f4b268b4e3de2be8" Jan 22 14:08:11 crc kubenswrapper[4743]: I0122 14:08:11.864883 4743 scope.go:117] "RemoveContainer" containerID="88ad177103856f694614a6b9e5f6bae1c7f5ed12538013f4ace91ef50e5f1850" Jan 22 14:08:11 crc kubenswrapper[4743]: E0122 14:08:11.866200 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88ad177103856f694614a6b9e5f6bae1c7f5ed12538013f4ace91ef50e5f1850\": container with ID starting with 88ad177103856f694614a6b9e5f6bae1c7f5ed12538013f4ace91ef50e5f1850 not found: ID does not exist" containerID="88ad177103856f694614a6b9e5f6bae1c7f5ed12538013f4ace91ef50e5f1850" Jan 22 14:08:11 crc kubenswrapper[4743]: I0122 14:08:11.866235 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88ad177103856f694614a6b9e5f6bae1c7f5ed12538013f4ace91ef50e5f1850"} err="failed to get container status \"88ad177103856f694614a6b9e5f6bae1c7f5ed12538013f4ace91ef50e5f1850\": rpc error: code = NotFound desc = could not find container \"88ad177103856f694614a6b9e5f6bae1c7f5ed12538013f4ace91ef50e5f1850\": container with ID starting with 88ad177103856f694614a6b9e5f6bae1c7f5ed12538013f4ace91ef50e5f1850 not found: ID does not exist" Jan 22 14:08:11 crc kubenswrapper[4743]: I0122 14:08:11.866254 4743 scope.go:117] "RemoveContainer" containerID="1394518f22ac074a6608a46a959c477744a1809c57329177f4b268b4e3de2be8" Jan 22 14:08:11 crc kubenswrapper[4743]: E0122 14:08:11.868625 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1394518f22ac074a6608a46a959c477744a1809c57329177f4b268b4e3de2be8\": container with ID starting with 1394518f22ac074a6608a46a959c477744a1809c57329177f4b268b4e3de2be8 not found: ID does not exist" 
containerID="1394518f22ac074a6608a46a959c477744a1809c57329177f4b268b4e3de2be8" Jan 22 14:08:11 crc kubenswrapper[4743]: I0122 14:08:11.868654 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1394518f22ac074a6608a46a959c477744a1809c57329177f4b268b4e3de2be8"} err="failed to get container status \"1394518f22ac074a6608a46a959c477744a1809c57329177f4b268b4e3de2be8\": rpc error: code = NotFound desc = could not find container \"1394518f22ac074a6608a46a959c477744a1809c57329177f4b268b4e3de2be8\": container with ID starting with 1394518f22ac074a6608a46a959c477744a1809c57329177f4b268b4e3de2be8 not found: ID does not exist" Jan 22 14:08:11 crc kubenswrapper[4743]: I0122 14:08:11.877258 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7926697b-86b3-4f82-97e1-3c0d7ae9f867-rabbitmq-erlang-cookie\") pod \"7926697b-86b3-4f82-97e1-3c0d7ae9f867\" (UID: \"7926697b-86b3-4f82-97e1-3c0d7ae9f867\") " Jan 22 14:08:11 crc kubenswrapper[4743]: I0122 14:08:11.877317 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7926697b-86b3-4f82-97e1-3c0d7ae9f867-plugins-conf\") pod \"7926697b-86b3-4f82-97e1-3c0d7ae9f867\" (UID: \"7926697b-86b3-4f82-97e1-3c0d7ae9f867\") " Jan 22 14:08:11 crc kubenswrapper[4743]: I0122 14:08:11.877344 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7926697b-86b3-4f82-97e1-3c0d7ae9f867-rabbitmq-confd\") pod \"7926697b-86b3-4f82-97e1-3c0d7ae9f867\" (UID: \"7926697b-86b3-4f82-97e1-3c0d7ae9f867\") " Jan 22 14:08:11 crc kubenswrapper[4743]: I0122 14:08:11.877364 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7926697b-86b3-4f82-97e1-3c0d7ae9f867-erlang-cookie-secret\") pod \"7926697b-86b3-4f82-97e1-3c0d7ae9f867\" (UID: \"7926697b-86b3-4f82-97e1-3c0d7ae9f867\") " Jan 22 14:08:11 crc kubenswrapper[4743]: I0122 14:08:11.877413 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"7926697b-86b3-4f82-97e1-3c0d7ae9f867\" (UID: \"7926697b-86b3-4f82-97e1-3c0d7ae9f867\") " Jan 22 14:08:11 crc kubenswrapper[4743]: I0122 14:08:11.877457 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r88fv\" (UniqueName: \"kubernetes.io/projected/7926697b-86b3-4f82-97e1-3c0d7ae9f867-kube-api-access-r88fv\") pod \"7926697b-86b3-4f82-97e1-3c0d7ae9f867\" (UID: \"7926697b-86b3-4f82-97e1-3c0d7ae9f867\") " Jan 22 14:08:11 crc kubenswrapper[4743]: I0122 14:08:11.877490 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7926697b-86b3-4f82-97e1-3c0d7ae9f867-rabbitmq-plugins\") pod \"7926697b-86b3-4f82-97e1-3c0d7ae9f867\" (UID: \"7926697b-86b3-4f82-97e1-3c0d7ae9f867\") " Jan 22 14:08:11 crc kubenswrapper[4743]: I0122 14:08:11.877556 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7926697b-86b3-4f82-97e1-3c0d7ae9f867-config-data\") pod \"7926697b-86b3-4f82-97e1-3c0d7ae9f867\" (UID: \"7926697b-86b3-4f82-97e1-3c0d7ae9f867\") " Jan 22 14:08:11 crc kubenswrapper[4743]: I0122 
14:08:11.877586 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7926697b-86b3-4f82-97e1-3c0d7ae9f867-server-conf\") pod \"7926697b-86b3-4f82-97e1-3c0d7ae9f867\" (UID: \"7926697b-86b3-4f82-97e1-3c0d7ae9f867\") " Jan 22 14:08:11 crc kubenswrapper[4743]: I0122 14:08:11.877615 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7926697b-86b3-4f82-97e1-3c0d7ae9f867-pod-info\") pod \"7926697b-86b3-4f82-97e1-3c0d7ae9f867\" (UID: \"7926697b-86b3-4f82-97e1-3c0d7ae9f867\") " Jan 22 14:08:11 crc kubenswrapper[4743]: I0122 14:08:11.877635 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7926697b-86b3-4f82-97e1-3c0d7ae9f867-rabbitmq-tls\") pod \"7926697b-86b3-4f82-97e1-3c0d7ae9f867\" (UID: \"7926697b-86b3-4f82-97e1-3c0d7ae9f867\") " Jan 22 14:08:11 crc kubenswrapper[4743]: I0122 14:08:11.878441 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7926697b-86b3-4f82-97e1-3c0d7ae9f867-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "7926697b-86b3-4f82-97e1-3c0d7ae9f867" (UID: "7926697b-86b3-4f82-97e1-3c0d7ae9f867"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:08:11 crc kubenswrapper[4743]: I0122 14:08:11.878717 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7926697b-86b3-4f82-97e1-3c0d7ae9f867-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "7926697b-86b3-4f82-97e1-3c0d7ae9f867" (UID: "7926697b-86b3-4f82-97e1-3c0d7ae9f867"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:08:11 crc kubenswrapper[4743]: I0122 14:08:11.880072 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7926697b-86b3-4f82-97e1-3c0d7ae9f867-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "7926697b-86b3-4f82-97e1-3c0d7ae9f867" (UID: "7926697b-86b3-4f82-97e1-3c0d7ae9f867"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:08:11 crc kubenswrapper[4743]: I0122 14:08:11.885956 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/7926697b-86b3-4f82-97e1-3c0d7ae9f867-pod-info" (OuterVolumeSpecName: "pod-info") pod "7926697b-86b3-4f82-97e1-3c0d7ae9f867" (UID: "7926697b-86b3-4f82-97e1-3c0d7ae9f867"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 22 14:08:11 crc kubenswrapper[4743]: I0122 14:08:11.885955 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "persistence") pod "7926697b-86b3-4f82-97e1-3c0d7ae9f867" (UID: "7926697b-86b3-4f82-97e1-3c0d7ae9f867"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 22 14:08:11 crc kubenswrapper[4743]: I0122 14:08:11.887272 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7926697b-86b3-4f82-97e1-3c0d7ae9f867-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "7926697b-86b3-4f82-97e1-3c0d7ae9f867" (UID: "7926697b-86b3-4f82-97e1-3c0d7ae9f867"). 
InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:08:11 crc kubenswrapper[4743]: I0122 14:08:11.888004 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7926697b-86b3-4f82-97e1-3c0d7ae9f867-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "7926697b-86b3-4f82-97e1-3c0d7ae9f867" (UID: "7926697b-86b3-4f82-97e1-3c0d7ae9f867"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:08:11 crc kubenswrapper[4743]: I0122 14:08:11.890279 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7926697b-86b3-4f82-97e1-3c0d7ae9f867-kube-api-access-r88fv" (OuterVolumeSpecName: "kube-api-access-r88fv") pod "7926697b-86b3-4f82-97e1-3c0d7ae9f867" (UID: "7926697b-86b3-4f82-97e1-3c0d7ae9f867"). InnerVolumeSpecName "kube-api-access-r88fv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:08:11 crc kubenswrapper[4743]: I0122 14:08:11.915392 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7926697b-86b3-4f82-97e1-3c0d7ae9f867-config-data" (OuterVolumeSpecName: "config-data") pod "7926697b-86b3-4f82-97e1-3c0d7ae9f867" (UID: "7926697b-86b3-4f82-97e1-3c0d7ae9f867"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:08:11 crc kubenswrapper[4743]: I0122 14:08:11.949385 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7926697b-86b3-4f82-97e1-3c0d7ae9f867-server-conf" (OuterVolumeSpecName: "server-conf") pod "7926697b-86b3-4f82-97e1-3c0d7ae9f867" (UID: "7926697b-86b3-4f82-97e1-3c0d7ae9f867"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:08:11 crc kubenswrapper[4743]: I0122 14:08:11.980333 4743 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7926697b-86b3-4f82-97e1-3c0d7ae9f867-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:11 crc kubenswrapper[4743]: I0122 14:08:11.980367 4743 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7926697b-86b3-4f82-97e1-3c0d7ae9f867-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:11 crc kubenswrapper[4743]: I0122 14:08:11.980379 4743 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7926697b-86b3-4f82-97e1-3c0d7ae9f867-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:11 crc kubenswrapper[4743]: I0122 14:08:11.980400 4743 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Jan 22 14:08:11 crc kubenswrapper[4743]: I0122 14:08:11.980410 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r88fv\" (UniqueName: \"kubernetes.io/projected/7926697b-86b3-4f82-97e1-3c0d7ae9f867-kube-api-access-r88fv\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:11 crc kubenswrapper[4743]: I0122 14:08:11.980422 4743 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7926697b-86b3-4f82-97e1-3c0d7ae9f867-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:11 crc kubenswrapper[4743]: I0122 14:08:11.980429 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7926697b-86b3-4f82-97e1-3c0d7ae9f867-config-data\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:11 crc kubenswrapper[4743]: I0122 14:08:11.980438 4743 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7926697b-86b3-4f82-97e1-3c0d7ae9f867-server-conf\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:11 crc kubenswrapper[4743]: I0122 14:08:11.980446 4743 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7926697b-86b3-4f82-97e1-3c0d7ae9f867-pod-info\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:11 crc kubenswrapper[4743]: I0122 14:08:11.980454 4743 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7926697b-86b3-4f82-97e1-3c0d7ae9f867-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.003094 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7926697b-86b3-4f82-97e1-3c0d7ae9f867-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "7926697b-86b3-4f82-97e1-3c0d7ae9f867" (UID: "7926697b-86b3-4f82-97e1-3c0d7ae9f867"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.006681 4743 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.082485 4743 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.082519 4743 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7926697b-86b3-4f82-97e1-3c0d7ae9f867-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.158204 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.185165 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.208725 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Jan 22 14:08:12 crc kubenswrapper[4743]: E0122 14:08:12.209412 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7926697b-86b3-4f82-97e1-3c0d7ae9f867" containerName="rabbitmq" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.209437 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="7926697b-86b3-4f82-97e1-3c0d7ae9f867" containerName="rabbitmq" Jan 22 14:08:12 crc kubenswrapper[4743]: E0122 14:08:12.209457 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7926697b-86b3-4f82-97e1-3c0d7ae9f867" containerName="setup-container" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.209466 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="7926697b-86b3-4f82-97e1-3c0d7ae9f867" containerName="setup-container" Jan 22 14:08:12 crc kubenswrapper[4743]: E0122 14:08:12.209476 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60368914-6922-43a0-8a71-384eabf32a78" containerName="extract-content" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.209484 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="60368914-6922-43a0-8a71-384eabf32a78" containerName="extract-content" Jan 22 14:08:12 crc kubenswrapper[4743]: E0122 14:08:12.209501 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60368914-6922-43a0-8a71-384eabf32a78" containerName="extract-utilities" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.209509 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="60368914-6922-43a0-8a71-384eabf32a78" containerName="extract-utilities" Jan 22 14:08:12 crc kubenswrapper[4743]: E0122 14:08:12.209527 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60368914-6922-43a0-8a71-384eabf32a78" containerName="registry-server" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.209536 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="60368914-6922-43a0-8a71-384eabf32a78" containerName="registry-server" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.209811 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="60368914-6922-43a0-8a71-384eabf32a78" containerName="registry-server" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.209836 4743 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="7926697b-86b3-4f82-97e1-3c0d7ae9f867" containerName="rabbitmq" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.211888 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.214288 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.214471 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-g6sn2" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.214687 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.214925 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.215167 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.215358 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.216165 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.226499 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.313268 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.389416 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/600136f3-db1d-49a2-92a8-0c03aaadc963-config-data\") pod \"rabbitmq-server-0\" (UID: \"600136f3-db1d-49a2-92a8-0c03aaadc963\") " pod="openstack/rabbitmq-server-0" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.389480 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"600136f3-db1d-49a2-92a8-0c03aaadc963\") " pod="openstack/rabbitmq-server-0" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.389638 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/600136f3-db1d-49a2-92a8-0c03aaadc963-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"600136f3-db1d-49a2-92a8-0c03aaadc963\") " pod="openstack/rabbitmq-server-0" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.389690 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/600136f3-db1d-49a2-92a8-0c03aaadc963-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"600136f3-db1d-49a2-92a8-0c03aaadc963\") " pod="openstack/rabbitmq-server-0" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.389719 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/600136f3-db1d-49a2-92a8-0c03aaadc963-server-conf\") pod \"rabbitmq-server-0\" (UID: \"600136f3-db1d-49a2-92a8-0c03aaadc963\") " pod="openstack/rabbitmq-server-0" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.389941 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/600136f3-db1d-49a2-92a8-0c03aaadc963-pod-info\") pod \"rabbitmq-server-0\" (UID: \"600136f3-db1d-49a2-92a8-0c03aaadc963\") " pod="openstack/rabbitmq-server-0" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.389986 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/600136f3-db1d-49a2-92a8-0c03aaadc963-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"600136f3-db1d-49a2-92a8-0c03aaadc963\") " pod="openstack/rabbitmq-server-0" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.390017 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/600136f3-db1d-49a2-92a8-0c03aaadc963-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"600136f3-db1d-49a2-92a8-0c03aaadc963\") " pod="openstack/rabbitmq-server-0" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.390035 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/600136f3-db1d-49a2-92a8-0c03aaadc963-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"600136f3-db1d-49a2-92a8-0c03aaadc963\") " pod="openstack/rabbitmq-server-0" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.390160 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/600136f3-db1d-49a2-92a8-0c03aaadc963-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"600136f3-db1d-49a2-92a8-0c03aaadc963\") " pod="openstack/rabbitmq-server-0" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.390184 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rp5nv\" (UniqueName: \"kubernetes.io/projected/600136f3-db1d-49a2-92a8-0c03aaadc963-kube-api-access-rp5nv\") pod \"rabbitmq-server-0\" (UID: \"600136f3-db1d-49a2-92a8-0c03aaadc963\") " pod="openstack/rabbitmq-server-0" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.491260 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a474b98d-9569-40f4-a3d2-f4017988678b-rabbitmq-erlang-cookie\") pod \"a474b98d-9569-40f4-a3d2-f4017988678b\" (UID: \"a474b98d-9569-40f4-a3d2-f4017988678b\") " Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.491354 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a474b98d-9569-40f4-a3d2-f4017988678b-plugins-conf\") pod \"a474b98d-9569-40f4-a3d2-f4017988678b\" (UID: \"a474b98d-9569-40f4-a3d2-f4017988678b\") " Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.491389 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4ncr\" (UniqueName: \"kubernetes.io/projected/a474b98d-9569-40f4-a3d2-f4017988678b-kube-api-access-j4ncr\") pod \"a474b98d-9569-40f4-a3d2-f4017988678b\" (UID: 
\"a474b98d-9569-40f4-a3d2-f4017988678b\") " Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.491444 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a474b98d-9569-40f4-a3d2-f4017988678b-server-conf\") pod \"a474b98d-9569-40f4-a3d2-f4017988678b\" (UID: \"a474b98d-9569-40f4-a3d2-f4017988678b\") " Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.491470 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a474b98d-9569-40f4-a3d2-f4017988678b-rabbitmq-plugins\") pod \"a474b98d-9569-40f4-a3d2-f4017988678b\" (UID: \"a474b98d-9569-40f4-a3d2-f4017988678b\") " Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.491492 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a474b98d-9569-40f4-a3d2-f4017988678b-rabbitmq-tls\") pod \"a474b98d-9569-40f4-a3d2-f4017988678b\" (UID: \"a474b98d-9569-40f4-a3d2-f4017988678b\") " Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.491545 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a474b98d-9569-40f4-a3d2-f4017988678b-rabbitmq-confd\") pod \"a474b98d-9569-40f4-a3d2-f4017988678b\" (UID: \"a474b98d-9569-40f4-a3d2-f4017988678b\") " Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.491601 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a474b98d-9569-40f4-a3d2-f4017988678b-erlang-cookie-secret\") pod \"a474b98d-9569-40f4-a3d2-f4017988678b\" (UID: \"a474b98d-9569-40f4-a3d2-f4017988678b\") " Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.491641 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a474b98d-9569-40f4-a3d2-f4017988678b-config-data\") pod \"a474b98d-9569-40f4-a3d2-f4017988678b\" (UID: \"a474b98d-9569-40f4-a3d2-f4017988678b\") " Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.491673 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"a474b98d-9569-40f4-a3d2-f4017988678b\" (UID: \"a474b98d-9569-40f4-a3d2-f4017988678b\") " Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.491752 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a474b98d-9569-40f4-a3d2-f4017988678b-pod-info\") pod \"a474b98d-9569-40f4-a3d2-f4017988678b\" (UID: \"a474b98d-9569-40f4-a3d2-f4017988678b\") " Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.491994 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/600136f3-db1d-49a2-92a8-0c03aaadc963-config-data\") pod \"rabbitmq-server-0\" (UID: \"600136f3-db1d-49a2-92a8-0c03aaadc963\") " pod="openstack/rabbitmq-server-0" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.492025 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"600136f3-db1d-49a2-92a8-0c03aaadc963\") " pod="openstack/rabbitmq-server-0" Jan 22 14:08:12 
crc kubenswrapper[4743]: I0122 14:08:12.492053 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/600136f3-db1d-49a2-92a8-0c03aaadc963-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"600136f3-db1d-49a2-92a8-0c03aaadc963\") " pod="openstack/rabbitmq-server-0" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.492070 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/600136f3-db1d-49a2-92a8-0c03aaadc963-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"600136f3-db1d-49a2-92a8-0c03aaadc963\") " pod="openstack/rabbitmq-server-0" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.492086 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/600136f3-db1d-49a2-92a8-0c03aaadc963-server-conf\") pod \"rabbitmq-server-0\" (UID: \"600136f3-db1d-49a2-92a8-0c03aaadc963\") " pod="openstack/rabbitmq-server-0" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.492135 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/600136f3-db1d-49a2-92a8-0c03aaadc963-pod-info\") pod \"rabbitmq-server-0\" (UID: \"600136f3-db1d-49a2-92a8-0c03aaadc963\") " pod="openstack/rabbitmq-server-0" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.492154 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/600136f3-db1d-49a2-92a8-0c03aaadc963-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"600136f3-db1d-49a2-92a8-0c03aaadc963\") " pod="openstack/rabbitmq-server-0" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.492180 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/600136f3-db1d-49a2-92a8-0c03aaadc963-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"600136f3-db1d-49a2-92a8-0c03aaadc963\") " pod="openstack/rabbitmq-server-0" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.492198 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/600136f3-db1d-49a2-92a8-0c03aaadc963-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"600136f3-db1d-49a2-92a8-0c03aaadc963\") " pod="openstack/rabbitmq-server-0" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.492220 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/600136f3-db1d-49a2-92a8-0c03aaadc963-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"600136f3-db1d-49a2-92a8-0c03aaadc963\") " pod="openstack/rabbitmq-server-0" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.492235 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rp5nv\" (UniqueName: \"kubernetes.io/projected/600136f3-db1d-49a2-92a8-0c03aaadc963-kube-api-access-rp5nv\") pod \"rabbitmq-server-0\" (UID: \"600136f3-db1d-49a2-92a8-0c03aaadc963\") " pod="openstack/rabbitmq-server-0" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.493022 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod 
\"rabbitmq-server-0\" (UID: \"600136f3-db1d-49a2-92a8-0c03aaadc963\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/rabbitmq-server-0" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.493539 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a474b98d-9569-40f4-a3d2-f4017988678b-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "a474b98d-9569-40f4-a3d2-f4017988678b" (UID: "a474b98d-9569-40f4-a3d2-f4017988678b"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.494618 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/600136f3-db1d-49a2-92a8-0c03aaadc963-server-conf\") pod \"rabbitmq-server-0\" (UID: \"600136f3-db1d-49a2-92a8-0c03aaadc963\") " pod="openstack/rabbitmq-server-0" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.497940 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/a474b98d-9569-40f4-a3d2-f4017988678b-pod-info" (OuterVolumeSpecName: "pod-info") pod "a474b98d-9569-40f4-a3d2-f4017988678b" (UID: "a474b98d-9569-40f4-a3d2-f4017988678b"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.500877 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a474b98d-9569-40f4-a3d2-f4017988678b-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "a474b98d-9569-40f4-a3d2-f4017988678b" (UID: "a474b98d-9569-40f4-a3d2-f4017988678b"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.501507 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/600136f3-db1d-49a2-92a8-0c03aaadc963-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"600136f3-db1d-49a2-92a8-0c03aaadc963\") " pod="openstack/rabbitmq-server-0" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.501603 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a474b98d-9569-40f4-a3d2-f4017988678b-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "a474b98d-9569-40f4-a3d2-f4017988678b" (UID: "a474b98d-9569-40f4-a3d2-f4017988678b"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.504266 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/600136f3-db1d-49a2-92a8-0c03aaadc963-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"600136f3-db1d-49a2-92a8-0c03aaadc963\") " pod="openstack/rabbitmq-server-0" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.505680 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/600136f3-db1d-49a2-92a8-0c03aaadc963-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"600136f3-db1d-49a2-92a8-0c03aaadc963\") " pod="openstack/rabbitmq-server-0" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.506289 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a474b98d-9569-40f4-a3d2-f4017988678b-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "a474b98d-9569-40f4-a3d2-f4017988678b" (UID: "a474b98d-9569-40f4-a3d2-f4017988678b"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.510122 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a474b98d-9569-40f4-a3d2-f4017988678b-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "a474b98d-9569-40f4-a3d2-f4017988678b" (UID: "a474b98d-9569-40f4-a3d2-f4017988678b"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.511272 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/600136f3-db1d-49a2-92a8-0c03aaadc963-config-data\") pod \"rabbitmq-server-0\" (UID: \"600136f3-db1d-49a2-92a8-0c03aaadc963\") " pod="openstack/rabbitmq-server-0" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.513653 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/600136f3-db1d-49a2-92a8-0c03aaadc963-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"600136f3-db1d-49a2-92a8-0c03aaadc963\") " pod="openstack/rabbitmq-server-0" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.515415 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/600136f3-db1d-49a2-92a8-0c03aaadc963-pod-info\") pod \"rabbitmq-server-0\" (UID: \"600136f3-db1d-49a2-92a8-0c03aaadc963\") " pod="openstack/rabbitmq-server-0" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.516311 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/600136f3-db1d-49a2-92a8-0c03aaadc963-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"600136f3-db1d-49a2-92a8-0c03aaadc963\") " pod="openstack/rabbitmq-server-0" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.516522 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "persistence") pod "a474b98d-9569-40f4-a3d2-f4017988678b" (UID: "a474b98d-9569-40f4-a3d2-f4017988678b"). InnerVolumeSpecName "local-storage12-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.517917 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/600136f3-db1d-49a2-92a8-0c03aaadc963-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"600136f3-db1d-49a2-92a8-0c03aaadc963\") " pod="openstack/rabbitmq-server-0" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.518589 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a474b98d-9569-40f4-a3d2-f4017988678b-kube-api-access-j4ncr" (OuterVolumeSpecName: "kube-api-access-j4ncr") pod "a474b98d-9569-40f4-a3d2-f4017988678b" (UID: "a474b98d-9569-40f4-a3d2-f4017988678b"). InnerVolumeSpecName "kube-api-access-j4ncr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.520624 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rp5nv\" (UniqueName: \"kubernetes.io/projected/600136f3-db1d-49a2-92a8-0c03aaadc963-kube-api-access-rp5nv\") pod \"rabbitmq-server-0\" (UID: \"600136f3-db1d-49a2-92a8-0c03aaadc963\") " pod="openstack/rabbitmq-server-0" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.537719 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a474b98d-9569-40f4-a3d2-f4017988678b-config-data" (OuterVolumeSpecName: "config-data") pod "a474b98d-9569-40f4-a3d2-f4017988678b" (UID: "a474b98d-9569-40f4-a3d2-f4017988678b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.542352 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"600136f3-db1d-49a2-92a8-0c03aaadc963\") " pod="openstack/rabbitmq-server-0" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.577077 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a474b98d-9569-40f4-a3d2-f4017988678b-server-conf" (OuterVolumeSpecName: "server-conf") pod "a474b98d-9569-40f4-a3d2-f4017988678b" (UID: "a474b98d-9569-40f4-a3d2-f4017988678b"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.593724 4743 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a474b98d-9569-40f4-a3d2-f4017988678b-pod-info\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.593759 4743 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a474b98d-9569-40f4-a3d2-f4017988678b-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.593769 4743 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a474b98d-9569-40f4-a3d2-f4017988678b-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.593777 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4ncr\" (UniqueName: \"kubernetes.io/projected/a474b98d-9569-40f4-a3d2-f4017988678b-kube-api-access-j4ncr\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.593798 4743 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a474b98d-9569-40f4-a3d2-f4017988678b-server-conf\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.593809 4743 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a474b98d-9569-40f4-a3d2-f4017988678b-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.593817 4743 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a474b98d-9569-40f4-a3d2-f4017988678b-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.593824 4743 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a474b98d-9569-40f4-a3d2-f4017988678b-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.593833 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a474b98d-9569-40f4-a3d2-f4017988678b-config-data\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.593860 4743 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.614854 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a474b98d-9569-40f4-a3d2-f4017988678b-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "a474b98d-9569-40f4-a3d2-f4017988678b" (UID: "a474b98d-9569-40f4-a3d2-f4017988678b"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.621455 4743 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.695551 4743 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a474b98d-9569-40f4-a3d2-f4017988678b-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.695584 4743 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.810297 4743 generic.go:334] "Generic (PLEG): container finished" podID="a474b98d-9569-40f4-a3d2-f4017988678b" containerID="76b9e7346567553a710a187cdaad499c9f2b1a58c5fba084587f9b361cba7e40" exitCode=0 Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.810365 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.810403 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a474b98d-9569-40f4-a3d2-f4017988678b","Type":"ContainerDied","Data":"76b9e7346567553a710a187cdaad499c9f2b1a58c5fba084587f9b361cba7e40"} Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.810437 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a474b98d-9569-40f4-a3d2-f4017988678b","Type":"ContainerDied","Data":"f867ff5ebaceb1b356ef718724fea1a3e2ccd7875f3c6678acac9a500fe600d3"} Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.810473 4743 scope.go:117] "RemoveContainer" containerID="76b9e7346567553a710a187cdaad499c9f2b1a58c5fba084587f9b361cba7e40" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.835277 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.839022 4743 scope.go:117] "RemoveContainer" containerID="25dd80b2b3b0b22c35f8b49855a475925da682d5848b168ae1fcc4b0bdc3e10b" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.863174 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.872539 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.890910 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 22 14:08:12 crc kubenswrapper[4743]: E0122 14:08:12.891324 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a474b98d-9569-40f4-a3d2-f4017988678b" containerName="rabbitmq" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.891335 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="a474b98d-9569-40f4-a3d2-f4017988678b" containerName="rabbitmq" Jan 22 14:08:12 crc kubenswrapper[4743]: E0122 14:08:12.891369 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a474b98d-9569-40f4-a3d2-f4017988678b" containerName="setup-container" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.891374 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="a474b98d-9569-40f4-a3d2-f4017988678b" containerName="setup-container" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.891537 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="a474b98d-9569-40f4-a3d2-f4017988678b" containerName="rabbitmq" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.892493 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.894284 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-cjr2m" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.895667 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.895823 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.896002 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.896143 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.896266 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.896425 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.903782 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.916159 4743 scope.go:117] "RemoveContainer" containerID="76b9e7346567553a710a187cdaad499c9f2b1a58c5fba084587f9b361cba7e40" Jan 22 14:08:12 crc kubenswrapper[4743]: E0122 14:08:12.917312 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76b9e7346567553a710a187cdaad499c9f2b1a58c5fba084587f9b361cba7e40\": container with ID starting with 76b9e7346567553a710a187cdaad499c9f2b1a58c5fba084587f9b361cba7e40 not found: ID does not exist" containerID="76b9e7346567553a710a187cdaad499c9f2b1a58c5fba084587f9b361cba7e40" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.917365 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76b9e7346567553a710a187cdaad499c9f2b1a58c5fba084587f9b361cba7e40"} err="failed to get container status \"76b9e7346567553a710a187cdaad499c9f2b1a58c5fba084587f9b361cba7e40\": rpc error: code = NotFound desc = could not find container \"76b9e7346567553a710a187cdaad499c9f2b1a58c5fba084587f9b361cba7e40\": container with ID starting with 76b9e7346567553a710a187cdaad499c9f2b1a58c5fba084587f9b361cba7e40 not found: ID does not exist" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.917396 4743 scope.go:117] "RemoveContainer" containerID="25dd80b2b3b0b22c35f8b49855a475925da682d5848b168ae1fcc4b0bdc3e10b" Jan 22 14:08:12 crc kubenswrapper[4743]: E0122 14:08:12.921430 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25dd80b2b3b0b22c35f8b49855a475925da682d5848b168ae1fcc4b0bdc3e10b\": container with ID starting with 25dd80b2b3b0b22c35f8b49855a475925da682d5848b168ae1fcc4b0bdc3e10b not found: ID does not exist" containerID="25dd80b2b3b0b22c35f8b49855a475925da682d5848b168ae1fcc4b0bdc3e10b" Jan 22 14:08:12 crc kubenswrapper[4743]: I0122 14:08:12.921484 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25dd80b2b3b0b22c35f8b49855a475925da682d5848b168ae1fcc4b0bdc3e10b"} 
err="failed to get container status \"25dd80b2b3b0b22c35f8b49855a475925da682d5848b168ae1fcc4b0bdc3e10b\": rpc error: code = NotFound desc = could not find container \"25dd80b2b3b0b22c35f8b49855a475925da682d5848b168ae1fcc4b0bdc3e10b\": container with ID starting with 25dd80b2b3b0b22c35f8b49855a475925da682d5848b168ae1fcc4b0bdc3e10b not found: ID does not exist" Jan 22 14:08:13 crc kubenswrapper[4743]: I0122 14:08:13.002924 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/42446198-84f4-4bee-b50c-1bb5dad2e380-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"42446198-84f4-4bee-b50c-1bb5dad2e380\") " pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:08:13 crc kubenswrapper[4743]: I0122 14:08:13.003306 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/42446198-84f4-4bee-b50c-1bb5dad2e380-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"42446198-84f4-4bee-b50c-1bb5dad2e380\") " pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:08:13 crc kubenswrapper[4743]: I0122 14:08:13.003343 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/42446198-84f4-4bee-b50c-1bb5dad2e380-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"42446198-84f4-4bee-b50c-1bb5dad2e380\") " pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:08:13 crc kubenswrapper[4743]: I0122 14:08:13.003369 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/42446198-84f4-4bee-b50c-1bb5dad2e380-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"42446198-84f4-4bee-b50c-1bb5dad2e380\") " pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:08:13 crc kubenswrapper[4743]: I0122 14:08:13.003410 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/42446198-84f4-4bee-b50c-1bb5dad2e380-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"42446198-84f4-4bee-b50c-1bb5dad2e380\") " pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:08:13 crc kubenswrapper[4743]: I0122 14:08:13.003437 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/42446198-84f4-4bee-b50c-1bb5dad2e380-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"42446198-84f4-4bee-b50c-1bb5dad2e380\") " pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:08:13 crc kubenswrapper[4743]: I0122 14:08:13.003538 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/42446198-84f4-4bee-b50c-1bb5dad2e380-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"42446198-84f4-4bee-b50c-1bb5dad2e380\") " pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:08:13 crc kubenswrapper[4743]: I0122 14:08:13.004156 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/42446198-84f4-4bee-b50c-1bb5dad2e380-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"42446198-84f4-4bee-b50c-1bb5dad2e380\") " pod="openstack/rabbitmq-cell1-server-0" 
Jan 22 14:08:13 crc kubenswrapper[4743]: I0122 14:08:13.004292 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/42446198-84f4-4bee-b50c-1bb5dad2e380-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"42446198-84f4-4bee-b50c-1bb5dad2e380\") " pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:08:13 crc kubenswrapper[4743]: I0122 14:08:13.004378 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"42446198-84f4-4bee-b50c-1bb5dad2e380\") " pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:08:13 crc kubenswrapper[4743]: I0122 14:08:13.004425 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrqwc\" (UniqueName: \"kubernetes.io/projected/42446198-84f4-4bee-b50c-1bb5dad2e380-kube-api-access-mrqwc\") pod \"rabbitmq-cell1-server-0\" (UID: \"42446198-84f4-4bee-b50c-1bb5dad2e380\") " pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:08:13 crc kubenswrapper[4743]: I0122 14:08:13.106014 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/42446198-84f4-4bee-b50c-1bb5dad2e380-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"42446198-84f4-4bee-b50c-1bb5dad2e380\") " pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:08:13 crc kubenswrapper[4743]: I0122 14:08:13.106050 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/42446198-84f4-4bee-b50c-1bb5dad2e380-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"42446198-84f4-4bee-b50c-1bb5dad2e380\") " pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:08:13 crc kubenswrapper[4743]: I0122 14:08:13.106070 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/42446198-84f4-4bee-b50c-1bb5dad2e380-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"42446198-84f4-4bee-b50c-1bb5dad2e380\") " pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:08:13 crc kubenswrapper[4743]: I0122 14:08:13.106135 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/42446198-84f4-4bee-b50c-1bb5dad2e380-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"42446198-84f4-4bee-b50c-1bb5dad2e380\") " pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:08:13 crc kubenswrapper[4743]: I0122 14:08:13.106170 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/42446198-84f4-4bee-b50c-1bb5dad2e380-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"42446198-84f4-4bee-b50c-1bb5dad2e380\") " pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:08:13 crc kubenswrapper[4743]: I0122 14:08:13.106201 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"42446198-84f4-4bee-b50c-1bb5dad2e380\") " pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:08:13 crc kubenswrapper[4743]: I0122 14:08:13.106218 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-mrqwc\" (UniqueName: \"kubernetes.io/projected/42446198-84f4-4bee-b50c-1bb5dad2e380-kube-api-access-mrqwc\") pod \"rabbitmq-cell1-server-0\" (UID: \"42446198-84f4-4bee-b50c-1bb5dad2e380\") " pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:08:13 crc kubenswrapper[4743]: I0122 14:08:13.106258 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/42446198-84f4-4bee-b50c-1bb5dad2e380-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"42446198-84f4-4bee-b50c-1bb5dad2e380\") " pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:08:13 crc kubenswrapper[4743]: I0122 14:08:13.106276 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/42446198-84f4-4bee-b50c-1bb5dad2e380-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"42446198-84f4-4bee-b50c-1bb5dad2e380\") " pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:08:13 crc kubenswrapper[4743]: I0122 14:08:13.106298 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/42446198-84f4-4bee-b50c-1bb5dad2e380-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"42446198-84f4-4bee-b50c-1bb5dad2e380\") " pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:08:13 crc kubenswrapper[4743]: I0122 14:08:13.106313 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/42446198-84f4-4bee-b50c-1bb5dad2e380-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"42446198-84f4-4bee-b50c-1bb5dad2e380\") " pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:08:13 crc kubenswrapper[4743]: I0122 14:08:13.107179 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/42446198-84f4-4bee-b50c-1bb5dad2e380-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"42446198-84f4-4bee-b50c-1bb5dad2e380\") " pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:08:13 crc kubenswrapper[4743]: I0122 14:08:13.107762 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"42446198-84f4-4bee-b50c-1bb5dad2e380\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:08:13 crc kubenswrapper[4743]: I0122 14:08:13.110452 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/42446198-84f4-4bee-b50c-1bb5dad2e380-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"42446198-84f4-4bee-b50c-1bb5dad2e380\") " pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:08:13 crc kubenswrapper[4743]: I0122 14:08:13.110922 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/42446198-84f4-4bee-b50c-1bb5dad2e380-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"42446198-84f4-4bee-b50c-1bb5dad2e380\") " pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:08:13 crc kubenswrapper[4743]: I0122 14:08:13.110971 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/42446198-84f4-4bee-b50c-1bb5dad2e380-server-conf\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"42446198-84f4-4bee-b50c-1bb5dad2e380\") " pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:08:13 crc kubenswrapper[4743]: I0122 14:08:13.112241 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/42446198-84f4-4bee-b50c-1bb5dad2e380-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"42446198-84f4-4bee-b50c-1bb5dad2e380\") " pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:08:13 crc kubenswrapper[4743]: I0122 14:08:13.113374 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/42446198-84f4-4bee-b50c-1bb5dad2e380-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"42446198-84f4-4bee-b50c-1bb5dad2e380\") " pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:08:13 crc kubenswrapper[4743]: I0122 14:08:13.113645 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/42446198-84f4-4bee-b50c-1bb5dad2e380-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"42446198-84f4-4bee-b50c-1bb5dad2e380\") " pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:08:13 crc kubenswrapper[4743]: I0122 14:08:13.114825 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/42446198-84f4-4bee-b50c-1bb5dad2e380-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"42446198-84f4-4bee-b50c-1bb5dad2e380\") " pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:08:13 crc kubenswrapper[4743]: I0122 14:08:13.117629 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/42446198-84f4-4bee-b50c-1bb5dad2e380-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"42446198-84f4-4bee-b50c-1bb5dad2e380\") " pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:08:13 crc kubenswrapper[4743]: I0122 14:08:13.126896 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrqwc\" (UniqueName: \"kubernetes.io/projected/42446198-84f4-4bee-b50c-1bb5dad2e380-kube-api-access-mrqwc\") pod \"rabbitmq-cell1-server-0\" (UID: \"42446198-84f4-4bee-b50c-1bb5dad2e380\") " pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:08:13 crc kubenswrapper[4743]: I0122 14:08:13.137347 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"42446198-84f4-4bee-b50c-1bb5dad2e380\") " pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:08:13 crc kubenswrapper[4743]: I0122 14:08:13.248774 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:08:13 crc kubenswrapper[4743]: I0122 14:08:13.391444 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 22 14:08:13 crc kubenswrapper[4743]: I0122 14:08:13.706766 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 22 14:08:13 crc kubenswrapper[4743]: W0122 14:08:13.716327 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42446198_84f4_4bee_b50c_1bb5dad2e380.slice/crio-7d5506d89b46eed052362dd9b47b7a7fe121798d762a51578e5996c0b94ad618 WatchSource:0}: Error finding container 7d5506d89b46eed052362dd9b47b7a7fe121798d762a51578e5996c0b94ad618: Status 404 returned error can't find the container with id 7d5506d89b46eed052362dd9b47b7a7fe121798d762a51578e5996c0b94ad618 Jan 22 14:08:13 crc kubenswrapper[4743]: I0122 14:08:13.759212 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7926697b-86b3-4f82-97e1-3c0d7ae9f867" path="/var/lib/kubelet/pods/7926697b-86b3-4f82-97e1-3c0d7ae9f867/volumes" Jan 22 14:08:13 crc kubenswrapper[4743]: I0122 14:08:13.760473 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a474b98d-9569-40f4-a3d2-f4017988678b" path="/var/lib/kubelet/pods/a474b98d-9569-40f4-a3d2-f4017988678b/volumes" Jan 22 14:08:13 crc kubenswrapper[4743]: I0122 14:08:13.816146 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-s9nzc"] Jan 22 14:08:13 crc kubenswrapper[4743]: I0122 14:08:13.853129 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-s9nzc" Jan 22 14:08:13 crc kubenswrapper[4743]: I0122 14:08:13.854905 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Jan 22 14:08:13 crc kubenswrapper[4743]: I0122 14:08:13.856442 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-s9nzc"] Jan 22 14:08:13 crc kubenswrapper[4743]: I0122 14:08:13.878437 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"600136f3-db1d-49a2-92a8-0c03aaadc963","Type":"ContainerStarted","Data":"b95b29cb858c84486133bc03ce253d11ffe0ba1ac20a250242214a155eaba215"} Jan 22 14:08:13 crc kubenswrapper[4743]: I0122 14:08:13.880364 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"42446198-84f4-4bee-b50c-1bb5dad2e380","Type":"ContainerStarted","Data":"7d5506d89b46eed052362dd9b47b7a7fe121798d762a51578e5996c0b94ad618"} Jan 22 14:08:13 crc kubenswrapper[4743]: I0122 14:08:13.939357 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g79sd\" (UniqueName: \"kubernetes.io/projected/bc73898a-48ff-4ae9-b3e0-1da31d3f4d43-kube-api-access-g79sd\") pod \"dnsmasq-dns-79bd4cc8c9-s9nzc\" (UID: \"bc73898a-48ff-4ae9-b3e0-1da31d3f4d43\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-s9nzc" Jan 22 14:08:13 crc kubenswrapper[4743]: I0122 14:08:13.939575 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/bc73898a-48ff-4ae9-b3e0-1da31d3f4d43-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-s9nzc\" (UID: \"bc73898a-48ff-4ae9-b3e0-1da31d3f4d43\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-s9nzc" Jan 22 
14:08:13 crc kubenswrapper[4743]: I0122 14:08:13.939635 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bc73898a-48ff-4ae9-b3e0-1da31d3f4d43-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-s9nzc\" (UID: \"bc73898a-48ff-4ae9-b3e0-1da31d3f4d43\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-s9nzc" Jan 22 14:08:13 crc kubenswrapper[4743]: I0122 14:08:13.939702 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc73898a-48ff-4ae9-b3e0-1da31d3f4d43-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-s9nzc\" (UID: \"bc73898a-48ff-4ae9-b3e0-1da31d3f4d43\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-s9nzc" Jan 22 14:08:13 crc kubenswrapper[4743]: I0122 14:08:13.939996 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bc73898a-48ff-4ae9-b3e0-1da31d3f4d43-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-s9nzc\" (UID: \"bc73898a-48ff-4ae9-b3e0-1da31d3f4d43\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-s9nzc" Jan 22 14:08:13 crc kubenswrapper[4743]: I0122 14:08:13.940205 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc73898a-48ff-4ae9-b3e0-1da31d3f4d43-config\") pod \"dnsmasq-dns-79bd4cc8c9-s9nzc\" (UID: \"bc73898a-48ff-4ae9-b3e0-1da31d3f4d43\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-s9nzc" Jan 22 14:08:13 crc kubenswrapper[4743]: I0122 14:08:13.940254 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bc73898a-48ff-4ae9-b3e0-1da31d3f4d43-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-s9nzc\" (UID: \"bc73898a-48ff-4ae9-b3e0-1da31d3f4d43\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-s9nzc" Jan 22 14:08:14 crc kubenswrapper[4743]: I0122 14:08:14.041752 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bc73898a-48ff-4ae9-b3e0-1da31d3f4d43-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-s9nzc\" (UID: \"bc73898a-48ff-4ae9-b3e0-1da31d3f4d43\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-s9nzc" Jan 22 14:08:14 crc kubenswrapper[4743]: I0122 14:08:14.041866 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc73898a-48ff-4ae9-b3e0-1da31d3f4d43-config\") pod \"dnsmasq-dns-79bd4cc8c9-s9nzc\" (UID: \"bc73898a-48ff-4ae9-b3e0-1da31d3f4d43\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-s9nzc" Jan 22 14:08:14 crc kubenswrapper[4743]: I0122 14:08:14.041887 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bc73898a-48ff-4ae9-b3e0-1da31d3f4d43-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-s9nzc\" (UID: \"bc73898a-48ff-4ae9-b3e0-1da31d3f4d43\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-s9nzc" Jan 22 14:08:14 crc kubenswrapper[4743]: I0122 14:08:14.041940 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g79sd\" (UniqueName: \"kubernetes.io/projected/bc73898a-48ff-4ae9-b3e0-1da31d3f4d43-kube-api-access-g79sd\") pod \"dnsmasq-dns-79bd4cc8c9-s9nzc\" (UID: \"bc73898a-48ff-4ae9-b3e0-1da31d3f4d43\") " 
pod="openstack/dnsmasq-dns-79bd4cc8c9-s9nzc" Jan 22 14:08:14 crc kubenswrapper[4743]: I0122 14:08:14.041980 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/bc73898a-48ff-4ae9-b3e0-1da31d3f4d43-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-s9nzc\" (UID: \"bc73898a-48ff-4ae9-b3e0-1da31d3f4d43\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-s9nzc" Jan 22 14:08:14 crc kubenswrapper[4743]: I0122 14:08:14.042001 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bc73898a-48ff-4ae9-b3e0-1da31d3f4d43-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-s9nzc\" (UID: \"bc73898a-48ff-4ae9-b3e0-1da31d3f4d43\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-s9nzc" Jan 22 14:08:14 crc kubenswrapper[4743]: I0122 14:08:14.042025 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc73898a-48ff-4ae9-b3e0-1da31d3f4d43-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-s9nzc\" (UID: \"bc73898a-48ff-4ae9-b3e0-1da31d3f4d43\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-s9nzc" Jan 22 14:08:14 crc kubenswrapper[4743]: I0122 14:08:14.042679 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc73898a-48ff-4ae9-b3e0-1da31d3f4d43-config\") pod \"dnsmasq-dns-79bd4cc8c9-s9nzc\" (UID: \"bc73898a-48ff-4ae9-b3e0-1da31d3f4d43\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-s9nzc" Jan 22 14:08:14 crc kubenswrapper[4743]: I0122 14:08:14.042760 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc73898a-48ff-4ae9-b3e0-1da31d3f4d43-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-s9nzc\" (UID: \"bc73898a-48ff-4ae9-b3e0-1da31d3f4d43\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-s9nzc" Jan 22 14:08:14 crc kubenswrapper[4743]: I0122 14:08:14.042692 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bc73898a-48ff-4ae9-b3e0-1da31d3f4d43-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-s9nzc\" (UID: \"bc73898a-48ff-4ae9-b3e0-1da31d3f4d43\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-s9nzc" Jan 22 14:08:14 crc kubenswrapper[4743]: I0122 14:08:14.043316 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bc73898a-48ff-4ae9-b3e0-1da31d3f4d43-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-s9nzc\" (UID: \"bc73898a-48ff-4ae9-b3e0-1da31d3f4d43\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-s9nzc" Jan 22 14:08:14 crc kubenswrapper[4743]: I0122 14:08:14.043469 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/bc73898a-48ff-4ae9-b3e0-1da31d3f4d43-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-s9nzc\" (UID: \"bc73898a-48ff-4ae9-b3e0-1da31d3f4d43\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-s9nzc" Jan 22 14:08:14 crc kubenswrapper[4743]: I0122 14:08:14.043526 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bc73898a-48ff-4ae9-b3e0-1da31d3f4d43-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-s9nzc\" (UID: \"bc73898a-48ff-4ae9-b3e0-1da31d3f4d43\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-s9nzc" Jan 22 14:08:14 crc kubenswrapper[4743]: I0122 14:08:14.057704 4743 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g79sd\" (UniqueName: \"kubernetes.io/projected/bc73898a-48ff-4ae9-b3e0-1da31d3f4d43-kube-api-access-g79sd\") pod \"dnsmasq-dns-79bd4cc8c9-s9nzc\" (UID: \"bc73898a-48ff-4ae9-b3e0-1da31d3f4d43\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-s9nzc" Jan 22 14:08:14 crc kubenswrapper[4743]: I0122 14:08:14.185957 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-s9nzc" Jan 22 14:08:14 crc kubenswrapper[4743]: W0122 14:08:14.633463 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc73898a_48ff_4ae9_b3e0_1da31d3f4d43.slice/crio-1223f47f3a2cbce082568d3c88b3d4b78d2d301c929a74e4be8459370dfd12c0 WatchSource:0}: Error finding container 1223f47f3a2cbce082568d3c88b3d4b78d2d301c929a74e4be8459370dfd12c0: Status 404 returned error can't find the container with id 1223f47f3a2cbce082568d3c88b3d4b78d2d301c929a74e4be8459370dfd12c0 Jan 22 14:08:14 crc kubenswrapper[4743]: I0122 14:08:14.644712 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-s9nzc"] Jan 22 14:08:14 crc kubenswrapper[4743]: I0122 14:08:14.889100 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-s9nzc" event={"ID":"bc73898a-48ff-4ae9-b3e0-1da31d3f4d43","Type":"ContainerStarted","Data":"1223f47f3a2cbce082568d3c88b3d4b78d2d301c929a74e4be8459370dfd12c0"} Jan 22 14:08:14 crc kubenswrapper[4743]: I0122 14:08:14.890538 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"600136f3-db1d-49a2-92a8-0c03aaadc963","Type":"ContainerStarted","Data":"1e31ef8dace5342fbe5f980421a8d262243ec55d65fb6ca43cdd98f949863304"} Jan 22 14:08:15 crc kubenswrapper[4743]: I0122 14:08:15.900456 4743 generic.go:334] "Generic (PLEG): container finished" podID="bc73898a-48ff-4ae9-b3e0-1da31d3f4d43" containerID="adeb8e674037d29315685a01d9f5a64217734299e718a49419bd1776d1c6aac2" exitCode=0 Jan 22 14:08:15 crc kubenswrapper[4743]: I0122 14:08:15.901015 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-s9nzc" event={"ID":"bc73898a-48ff-4ae9-b3e0-1da31d3f4d43","Type":"ContainerDied","Data":"adeb8e674037d29315685a01d9f5a64217734299e718a49419bd1776d1c6aac2"} Jan 22 14:08:15 crc kubenswrapper[4743]: I0122 14:08:15.902622 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"42446198-84f4-4bee-b50c-1bb5dad2e380","Type":"ContainerStarted","Data":"0e7b784158100c1bdc2afc25e5202a1e8402680b65010d54232bd23e745e1362"} Jan 22 14:08:16 crc kubenswrapper[4743]: I0122 14:08:16.915644 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-s9nzc" event={"ID":"bc73898a-48ff-4ae9-b3e0-1da31d3f4d43","Type":"ContainerStarted","Data":"40b1275b0d330ce9a55e7a552ae5e75caf3a38588c9dd5c3b41c53e01b703220"} Jan 22 14:08:16 crc kubenswrapper[4743]: I0122 14:08:16.916080 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-79bd4cc8c9-s9nzc" Jan 22 14:08:16 crc kubenswrapper[4743]: I0122 14:08:16.937691 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-79bd4cc8c9-s9nzc" podStartSLOduration=3.937673928 podStartE2EDuration="3.937673928s" podCreationTimestamp="2026-01-22 14:08:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:08:16.932623131 +0000 UTC m=+1333.487666304" watchObservedRunningTime="2026-01-22 14:08:16.937673928 +0000 UTC m=+1333.492717091" Jan 22 14:08:24 crc kubenswrapper[4743]: I0122 14:08:24.187679 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-79bd4cc8c9-s9nzc" Jan 22 14:08:24 crc kubenswrapper[4743]: I0122 14:08:24.235889 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-k6jlz"] Jan 22 14:08:24 crc kubenswrapper[4743]: I0122 14:08:24.236123 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-89c5cd4d5-k6jlz" podUID="e5de46ef-12da-4f4c-b19a-4b713069a048" containerName="dnsmasq-dns" containerID="cri-o://c2543fa63f31119243a81e984c51aa071546d2e5f3a03db67d15ec30a80eadc9" gracePeriod=10 Jan 22 14:08:24 crc kubenswrapper[4743]: I0122 14:08:24.452850 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55478c4467-nbpk8"] Jan 22 14:08:24 crc kubenswrapper[4743]: I0122 14:08:24.454559 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55478c4467-nbpk8" Jan 22 14:08:24 crc kubenswrapper[4743]: I0122 14:08:24.478725 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55478c4467-nbpk8"] Jan 22 14:08:24 crc kubenswrapper[4743]: I0122 14:08:24.587290 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c52cf8e4-1ecd-4882-b076-bacb37f3569e-ovsdbserver-sb\") pod \"dnsmasq-dns-55478c4467-nbpk8\" (UID: \"c52cf8e4-1ecd-4882-b076-bacb37f3569e\") " pod="openstack/dnsmasq-dns-55478c4467-nbpk8" Jan 22 14:08:24 crc kubenswrapper[4743]: I0122 14:08:24.587609 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c52cf8e4-1ecd-4882-b076-bacb37f3569e-dns-swift-storage-0\") pod \"dnsmasq-dns-55478c4467-nbpk8\" (UID: \"c52cf8e4-1ecd-4882-b076-bacb37f3569e\") " pod="openstack/dnsmasq-dns-55478c4467-nbpk8" Jan 22 14:08:24 crc kubenswrapper[4743]: I0122 14:08:24.587627 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c52cf8e4-1ecd-4882-b076-bacb37f3569e-openstack-edpm-ipam\") pod \"dnsmasq-dns-55478c4467-nbpk8\" (UID: \"c52cf8e4-1ecd-4882-b076-bacb37f3569e\") " pod="openstack/dnsmasq-dns-55478c4467-nbpk8" Jan 22 14:08:24 crc kubenswrapper[4743]: I0122 14:08:24.587660 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c52cf8e4-1ecd-4882-b076-bacb37f3569e-ovsdbserver-nb\") pod \"dnsmasq-dns-55478c4467-nbpk8\" (UID: \"c52cf8e4-1ecd-4882-b076-bacb37f3569e\") " pod="openstack/dnsmasq-dns-55478c4467-nbpk8" Jan 22 14:08:24 crc kubenswrapper[4743]: I0122 14:08:24.587701 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c52cf8e4-1ecd-4882-b076-bacb37f3569e-config\") pod \"dnsmasq-dns-55478c4467-nbpk8\" (UID: \"c52cf8e4-1ecd-4882-b076-bacb37f3569e\") " pod="openstack/dnsmasq-dns-55478c4467-nbpk8" Jan 22 14:08:24 crc kubenswrapper[4743]: I0122 14:08:24.587724 4743 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knsh6\" (UniqueName: \"kubernetes.io/projected/c52cf8e4-1ecd-4882-b076-bacb37f3569e-kube-api-access-knsh6\") pod \"dnsmasq-dns-55478c4467-nbpk8\" (UID: \"c52cf8e4-1ecd-4882-b076-bacb37f3569e\") " pod="openstack/dnsmasq-dns-55478c4467-nbpk8" Jan 22 14:08:24 crc kubenswrapper[4743]: I0122 14:08:24.587740 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c52cf8e4-1ecd-4882-b076-bacb37f3569e-dns-svc\") pod \"dnsmasq-dns-55478c4467-nbpk8\" (UID: \"c52cf8e4-1ecd-4882-b076-bacb37f3569e\") " pod="openstack/dnsmasq-dns-55478c4467-nbpk8" Jan 22 14:08:24 crc kubenswrapper[4743]: I0122 14:08:24.689620 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c52cf8e4-1ecd-4882-b076-bacb37f3569e-config\") pod \"dnsmasq-dns-55478c4467-nbpk8\" (UID: \"c52cf8e4-1ecd-4882-b076-bacb37f3569e\") " pod="openstack/dnsmasq-dns-55478c4467-nbpk8" Jan 22 14:08:24 crc kubenswrapper[4743]: I0122 14:08:24.689892 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knsh6\" (UniqueName: \"kubernetes.io/projected/c52cf8e4-1ecd-4882-b076-bacb37f3569e-kube-api-access-knsh6\") pod \"dnsmasq-dns-55478c4467-nbpk8\" (UID: \"c52cf8e4-1ecd-4882-b076-bacb37f3569e\") " pod="openstack/dnsmasq-dns-55478c4467-nbpk8" Jan 22 14:08:24 crc kubenswrapper[4743]: I0122 14:08:24.689938 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c52cf8e4-1ecd-4882-b076-bacb37f3569e-dns-svc\") pod \"dnsmasq-dns-55478c4467-nbpk8\" (UID: \"c52cf8e4-1ecd-4882-b076-bacb37f3569e\") " pod="openstack/dnsmasq-dns-55478c4467-nbpk8" Jan 22 14:08:24 crc kubenswrapper[4743]: I0122 14:08:24.689985 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c52cf8e4-1ecd-4882-b076-bacb37f3569e-ovsdbserver-sb\") pod \"dnsmasq-dns-55478c4467-nbpk8\" (UID: \"c52cf8e4-1ecd-4882-b076-bacb37f3569e\") " pod="openstack/dnsmasq-dns-55478c4467-nbpk8" Jan 22 14:08:24 crc kubenswrapper[4743]: I0122 14:08:24.691751 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c52cf8e4-1ecd-4882-b076-bacb37f3569e-dns-svc\") pod \"dnsmasq-dns-55478c4467-nbpk8\" (UID: \"c52cf8e4-1ecd-4882-b076-bacb37f3569e\") " pod="openstack/dnsmasq-dns-55478c4467-nbpk8" Jan 22 14:08:24 crc kubenswrapper[4743]: I0122 14:08:24.691894 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c52cf8e4-1ecd-4882-b076-bacb37f3569e-dns-swift-storage-0\") pod \"dnsmasq-dns-55478c4467-nbpk8\" (UID: \"c52cf8e4-1ecd-4882-b076-bacb37f3569e\") " pod="openstack/dnsmasq-dns-55478c4467-nbpk8" Jan 22 14:08:24 crc kubenswrapper[4743]: I0122 14:08:24.691922 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c52cf8e4-1ecd-4882-b076-bacb37f3569e-openstack-edpm-ipam\") pod \"dnsmasq-dns-55478c4467-nbpk8\" (UID: \"c52cf8e4-1ecd-4882-b076-bacb37f3569e\") " pod="openstack/dnsmasq-dns-55478c4467-nbpk8" Jan 22 14:08:24 crc kubenswrapper[4743]: I0122 14:08:24.691995 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c52cf8e4-1ecd-4882-b076-bacb37f3569e-ovsdbserver-nb\") pod \"dnsmasq-dns-55478c4467-nbpk8\" (UID: \"c52cf8e4-1ecd-4882-b076-bacb37f3569e\") " pod="openstack/dnsmasq-dns-55478c4467-nbpk8" Jan 22 14:08:24 crc kubenswrapper[4743]: I0122 14:08:24.692779 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c52cf8e4-1ecd-4882-b076-bacb37f3569e-ovsdbserver-nb\") pod \"dnsmasq-dns-55478c4467-nbpk8\" (UID: \"c52cf8e4-1ecd-4882-b076-bacb37f3569e\") " pod="openstack/dnsmasq-dns-55478c4467-nbpk8" Jan 22 14:08:24 crc kubenswrapper[4743]: I0122 14:08:24.693260 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c52cf8e4-1ecd-4882-b076-bacb37f3569e-ovsdbserver-sb\") pod \"dnsmasq-dns-55478c4467-nbpk8\" (UID: \"c52cf8e4-1ecd-4882-b076-bacb37f3569e\") " pod="openstack/dnsmasq-dns-55478c4467-nbpk8" Jan 22 14:08:24 crc kubenswrapper[4743]: I0122 14:08:24.694029 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c52cf8e4-1ecd-4882-b076-bacb37f3569e-openstack-edpm-ipam\") pod \"dnsmasq-dns-55478c4467-nbpk8\" (UID: \"c52cf8e4-1ecd-4882-b076-bacb37f3569e\") " pod="openstack/dnsmasq-dns-55478c4467-nbpk8" Jan 22 14:08:24 crc kubenswrapper[4743]: I0122 14:08:24.694249 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c52cf8e4-1ecd-4882-b076-bacb37f3569e-dns-swift-storage-0\") pod \"dnsmasq-dns-55478c4467-nbpk8\" (UID: \"c52cf8e4-1ecd-4882-b076-bacb37f3569e\") " pod="openstack/dnsmasq-dns-55478c4467-nbpk8" Jan 22 14:08:24 crc kubenswrapper[4743]: I0122 14:08:24.694535 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c52cf8e4-1ecd-4882-b076-bacb37f3569e-config\") pod \"dnsmasq-dns-55478c4467-nbpk8\" (UID: \"c52cf8e4-1ecd-4882-b076-bacb37f3569e\") " pod="openstack/dnsmasq-dns-55478c4467-nbpk8" Jan 22 14:08:24 crc kubenswrapper[4743]: I0122 14:08:24.724571 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knsh6\" (UniqueName: \"kubernetes.io/projected/c52cf8e4-1ecd-4882-b076-bacb37f3569e-kube-api-access-knsh6\") pod \"dnsmasq-dns-55478c4467-nbpk8\" (UID: \"c52cf8e4-1ecd-4882-b076-bacb37f3569e\") " pod="openstack/dnsmasq-dns-55478c4467-nbpk8" Jan 22 14:08:24 crc kubenswrapper[4743]: I0122 14:08:24.809681 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55478c4467-nbpk8" Jan 22 14:08:24 crc kubenswrapper[4743]: I0122 14:08:24.955733 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-k6jlz" Jan 22 14:08:24 crc kubenswrapper[4743]: I0122 14:08:24.990564 4743 generic.go:334] "Generic (PLEG): container finished" podID="e5de46ef-12da-4f4c-b19a-4b713069a048" containerID="c2543fa63f31119243a81e984c51aa071546d2e5f3a03db67d15ec30a80eadc9" exitCode=0 Jan 22 14:08:24 crc kubenswrapper[4743]: I0122 14:08:24.990615 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-k6jlz" event={"ID":"e5de46ef-12da-4f4c-b19a-4b713069a048","Type":"ContainerDied","Data":"c2543fa63f31119243a81e984c51aa071546d2e5f3a03db67d15ec30a80eadc9"} Jan 22 14:08:24 crc kubenswrapper[4743]: I0122 14:08:24.990647 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-k6jlz" event={"ID":"e5de46ef-12da-4f4c-b19a-4b713069a048","Type":"ContainerDied","Data":"073519c788e686dc087db85d4644ce670a7dfae3bb8fb4c12448d866cf367d79"} Jan 22 14:08:24 crc kubenswrapper[4743]: I0122 14:08:24.990672 4743 scope.go:117] "RemoveContainer" containerID="c2543fa63f31119243a81e984c51aa071546d2e5f3a03db67d15ec30a80eadc9" Jan 22 14:08:24 crc kubenswrapper[4743]: I0122 14:08:24.990868 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-k6jlz" Jan 22 14:08:25 crc kubenswrapper[4743]: I0122 14:08:25.013332 4743 scope.go:117] "RemoveContainer" containerID="8ddbd5a03184b45455d0741be7f5f1609cc82cd39f28e1a6197b659ab1dfe019" Jan 22 14:08:25 crc kubenswrapper[4743]: I0122 14:08:25.039931 4743 scope.go:117] "RemoveContainer" containerID="c2543fa63f31119243a81e984c51aa071546d2e5f3a03db67d15ec30a80eadc9" Jan 22 14:08:25 crc kubenswrapper[4743]: E0122 14:08:25.040432 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2543fa63f31119243a81e984c51aa071546d2e5f3a03db67d15ec30a80eadc9\": container with ID starting with c2543fa63f31119243a81e984c51aa071546d2e5f3a03db67d15ec30a80eadc9 not found: ID does not exist" containerID="c2543fa63f31119243a81e984c51aa071546d2e5f3a03db67d15ec30a80eadc9" Jan 22 14:08:25 crc kubenswrapper[4743]: I0122 14:08:25.040463 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2543fa63f31119243a81e984c51aa071546d2e5f3a03db67d15ec30a80eadc9"} err="failed to get container status \"c2543fa63f31119243a81e984c51aa071546d2e5f3a03db67d15ec30a80eadc9\": rpc error: code = NotFound desc = could not find container \"c2543fa63f31119243a81e984c51aa071546d2e5f3a03db67d15ec30a80eadc9\": container with ID starting with c2543fa63f31119243a81e984c51aa071546d2e5f3a03db67d15ec30a80eadc9 not found: ID does not exist" Jan 22 14:08:25 crc kubenswrapper[4743]: I0122 14:08:25.040483 4743 scope.go:117] "RemoveContainer" containerID="8ddbd5a03184b45455d0741be7f5f1609cc82cd39f28e1a6197b659ab1dfe019" Jan 22 14:08:25 crc kubenswrapper[4743]: E0122 14:08:25.040844 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ddbd5a03184b45455d0741be7f5f1609cc82cd39f28e1a6197b659ab1dfe019\": container with ID starting with 8ddbd5a03184b45455d0741be7f5f1609cc82cd39f28e1a6197b659ab1dfe019 not found: ID does not exist" containerID="8ddbd5a03184b45455d0741be7f5f1609cc82cd39f28e1a6197b659ab1dfe019" Jan 22 14:08:25 crc kubenswrapper[4743]: I0122 14:08:25.040876 4743 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8ddbd5a03184b45455d0741be7f5f1609cc82cd39f28e1a6197b659ab1dfe019"} err="failed to get container status \"8ddbd5a03184b45455d0741be7f5f1609cc82cd39f28e1a6197b659ab1dfe019\": rpc error: code = NotFound desc = could not find container \"8ddbd5a03184b45455d0741be7f5f1609cc82cd39f28e1a6197b659ab1dfe019\": container with ID starting with 8ddbd5a03184b45455d0741be7f5f1609cc82cd39f28e1a6197b659ab1dfe019 not found: ID does not exist" Jan 22 14:08:25 crc kubenswrapper[4743]: I0122 14:08:25.106384 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxgzs\" (UniqueName: \"kubernetes.io/projected/e5de46ef-12da-4f4c-b19a-4b713069a048-kube-api-access-zxgzs\") pod \"e5de46ef-12da-4f4c-b19a-4b713069a048\" (UID: \"e5de46ef-12da-4f4c-b19a-4b713069a048\") " Jan 22 14:08:25 crc kubenswrapper[4743]: I0122 14:08:25.106452 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e5de46ef-12da-4f4c-b19a-4b713069a048-ovsdbserver-sb\") pod \"e5de46ef-12da-4f4c-b19a-4b713069a048\" (UID: \"e5de46ef-12da-4f4c-b19a-4b713069a048\") " Jan 22 14:08:25 crc kubenswrapper[4743]: I0122 14:08:25.106498 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e5de46ef-12da-4f4c-b19a-4b713069a048-dns-swift-storage-0\") pod \"e5de46ef-12da-4f4c-b19a-4b713069a048\" (UID: \"e5de46ef-12da-4f4c-b19a-4b713069a048\") " Jan 22 14:08:25 crc kubenswrapper[4743]: I0122 14:08:25.106526 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5de46ef-12da-4f4c-b19a-4b713069a048-config\") pod \"e5de46ef-12da-4f4c-b19a-4b713069a048\" (UID: \"e5de46ef-12da-4f4c-b19a-4b713069a048\") " Jan 22 14:08:25 crc kubenswrapper[4743]: I0122 14:08:25.106586 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e5de46ef-12da-4f4c-b19a-4b713069a048-ovsdbserver-nb\") pod \"e5de46ef-12da-4f4c-b19a-4b713069a048\" (UID: \"e5de46ef-12da-4f4c-b19a-4b713069a048\") " Jan 22 14:08:25 crc kubenswrapper[4743]: I0122 14:08:25.106645 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5de46ef-12da-4f4c-b19a-4b713069a048-dns-svc\") pod \"e5de46ef-12da-4f4c-b19a-4b713069a048\" (UID: \"e5de46ef-12da-4f4c-b19a-4b713069a048\") " Jan 22 14:08:25 crc kubenswrapper[4743]: I0122 14:08:25.110890 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5de46ef-12da-4f4c-b19a-4b713069a048-kube-api-access-zxgzs" (OuterVolumeSpecName: "kube-api-access-zxgzs") pod "e5de46ef-12da-4f4c-b19a-4b713069a048" (UID: "e5de46ef-12da-4f4c-b19a-4b713069a048"). InnerVolumeSpecName "kube-api-access-zxgzs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:08:25 crc kubenswrapper[4743]: I0122 14:08:25.153520 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5de46ef-12da-4f4c-b19a-4b713069a048-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e5de46ef-12da-4f4c-b19a-4b713069a048" (UID: "e5de46ef-12da-4f4c-b19a-4b713069a048"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:08:25 crc kubenswrapper[4743]: I0122 14:08:25.157045 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5de46ef-12da-4f4c-b19a-4b713069a048-config" (OuterVolumeSpecName: "config") pod "e5de46ef-12da-4f4c-b19a-4b713069a048" (UID: "e5de46ef-12da-4f4c-b19a-4b713069a048"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:08:25 crc kubenswrapper[4743]: I0122 14:08:25.159062 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5de46ef-12da-4f4c-b19a-4b713069a048-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e5de46ef-12da-4f4c-b19a-4b713069a048" (UID: "e5de46ef-12da-4f4c-b19a-4b713069a048"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:08:25 crc kubenswrapper[4743]: I0122 14:08:25.159887 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5de46ef-12da-4f4c-b19a-4b713069a048-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e5de46ef-12da-4f4c-b19a-4b713069a048" (UID: "e5de46ef-12da-4f4c-b19a-4b713069a048"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:08:25 crc kubenswrapper[4743]: I0122 14:08:25.174619 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5de46ef-12da-4f4c-b19a-4b713069a048-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e5de46ef-12da-4f4c-b19a-4b713069a048" (UID: "e5de46ef-12da-4f4c-b19a-4b713069a048"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:08:25 crc kubenswrapper[4743]: I0122 14:08:25.210063 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e5de46ef-12da-4f4c-b19a-4b713069a048-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:25 crc kubenswrapper[4743]: I0122 14:08:25.210925 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5de46ef-12da-4f4c-b19a-4b713069a048-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:25 crc kubenswrapper[4743]: I0122 14:08:25.210945 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zxgzs\" (UniqueName: \"kubernetes.io/projected/e5de46ef-12da-4f4c-b19a-4b713069a048-kube-api-access-zxgzs\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:25 crc kubenswrapper[4743]: I0122 14:08:25.210954 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e5de46ef-12da-4f4c-b19a-4b713069a048-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:25 crc kubenswrapper[4743]: I0122 14:08:25.210963 4743 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e5de46ef-12da-4f4c-b19a-4b713069a048-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:25 crc kubenswrapper[4743]: I0122 14:08:25.210973 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5de46ef-12da-4f4c-b19a-4b713069a048-config\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:25 crc kubenswrapper[4743]: I0122 14:08:25.281664 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-55478c4467-nbpk8"] Jan 22 14:08:25 crc kubenswrapper[4743]: I0122 14:08:25.339675 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-k6jlz"] Jan 22 14:08:25 crc kubenswrapper[4743]: I0122 14:08:25.356558 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-k6jlz"] Jan 22 14:08:25 crc kubenswrapper[4743]: I0122 14:08:25.759551 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5de46ef-12da-4f4c-b19a-4b713069a048" path="/var/lib/kubelet/pods/e5de46ef-12da-4f4c-b19a-4b713069a048/volumes" Jan 22 14:08:26 crc kubenswrapper[4743]: I0122 14:08:26.007916 4743 generic.go:334] "Generic (PLEG): container finished" podID="c52cf8e4-1ecd-4882-b076-bacb37f3569e" containerID="70a02810ec76ba0548ac0fd692f3aa0dd18d90b51c485f768ec2b528fcb87bef" exitCode=0 Jan 22 14:08:26 crc kubenswrapper[4743]: I0122 14:08:26.007959 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55478c4467-nbpk8" event={"ID":"c52cf8e4-1ecd-4882-b076-bacb37f3569e","Type":"ContainerDied","Data":"70a02810ec76ba0548ac0fd692f3aa0dd18d90b51c485f768ec2b528fcb87bef"} Jan 22 14:08:26 crc kubenswrapper[4743]: I0122 14:08:26.008021 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55478c4467-nbpk8" event={"ID":"c52cf8e4-1ecd-4882-b076-bacb37f3569e","Type":"ContainerStarted","Data":"e361cd1d4f713150e2d1f636e2710fda837e7aec03fbfa4e1de02ec373c09fe4"} Jan 22 14:08:27 crc kubenswrapper[4743]: I0122 14:08:27.017463 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55478c4467-nbpk8" event={"ID":"c52cf8e4-1ecd-4882-b076-bacb37f3569e","Type":"ContainerStarted","Data":"6382bb8c5afed0257caa72dde6a20bfefa523ae8ed0d046b18eec2b77fd35393"} Jan 22 14:08:27 crc kubenswrapper[4743]: I0122 14:08:27.017963 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55478c4467-nbpk8" Jan 22 14:08:27 crc kubenswrapper[4743]: I0122 14:08:27.042458 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55478c4467-nbpk8" podStartSLOduration=3.042433579 podStartE2EDuration="3.042433579s" podCreationTimestamp="2026-01-22 14:08:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:08:27.03436739 +0000 UTC m=+1343.589410553" watchObservedRunningTime="2026-01-22 14:08:27.042433579 +0000 UTC m=+1343.597476742" Jan 22 14:08:34 crc kubenswrapper[4743]: I0122 14:08:34.811279 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55478c4467-nbpk8" Jan 22 14:08:34 crc kubenswrapper[4743]: I0122 14:08:34.876635 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-s9nzc"] Jan 22 14:08:34 crc kubenswrapper[4743]: I0122 14:08:34.876933 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-79bd4cc8c9-s9nzc" podUID="bc73898a-48ff-4ae9-b3e0-1da31d3f4d43" containerName="dnsmasq-dns" containerID="cri-o://40b1275b0d330ce9a55e7a552ae5e75caf3a38588c9dd5c3b41c53e01b703220" gracePeriod=10 Jan 22 14:08:35 crc kubenswrapper[4743]: I0122 14:08:35.093398 4743 generic.go:334] "Generic (PLEG): container finished" podID="bc73898a-48ff-4ae9-b3e0-1da31d3f4d43" containerID="40b1275b0d330ce9a55e7a552ae5e75caf3a38588c9dd5c3b41c53e01b703220" exitCode=0 Jan 22 14:08:35 crc 
kubenswrapper[4743]: I0122 14:08:35.093440 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-s9nzc" event={"ID":"bc73898a-48ff-4ae9-b3e0-1da31d3f4d43","Type":"ContainerDied","Data":"40b1275b0d330ce9a55e7a552ae5e75caf3a38588c9dd5c3b41c53e01b703220"} Jan 22 14:08:35 crc kubenswrapper[4743]: I0122 14:08:35.326948 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-s9nzc" Jan 22 14:08:35 crc kubenswrapper[4743]: I0122 14:08:35.408080 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bc73898a-48ff-4ae9-b3e0-1da31d3f4d43-ovsdbserver-nb\") pod \"bc73898a-48ff-4ae9-b3e0-1da31d3f4d43\" (UID: \"bc73898a-48ff-4ae9-b3e0-1da31d3f4d43\") " Jan 22 14:08:35 crc kubenswrapper[4743]: I0122 14:08:35.408150 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/bc73898a-48ff-4ae9-b3e0-1da31d3f4d43-openstack-edpm-ipam\") pod \"bc73898a-48ff-4ae9-b3e0-1da31d3f4d43\" (UID: \"bc73898a-48ff-4ae9-b3e0-1da31d3f4d43\") " Jan 22 14:08:35 crc kubenswrapper[4743]: I0122 14:08:35.408203 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bc73898a-48ff-4ae9-b3e0-1da31d3f4d43-ovsdbserver-sb\") pod \"bc73898a-48ff-4ae9-b3e0-1da31d3f4d43\" (UID: \"bc73898a-48ff-4ae9-b3e0-1da31d3f4d43\") " Jan 22 14:08:35 crc kubenswrapper[4743]: I0122 14:08:35.408242 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc73898a-48ff-4ae9-b3e0-1da31d3f4d43-config\") pod \"bc73898a-48ff-4ae9-b3e0-1da31d3f4d43\" (UID: \"bc73898a-48ff-4ae9-b3e0-1da31d3f4d43\") " Jan 22 14:08:35 crc kubenswrapper[4743]: I0122 14:08:35.408265 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc73898a-48ff-4ae9-b3e0-1da31d3f4d43-dns-svc\") pod \"bc73898a-48ff-4ae9-b3e0-1da31d3f4d43\" (UID: \"bc73898a-48ff-4ae9-b3e0-1da31d3f4d43\") " Jan 22 14:08:35 crc kubenswrapper[4743]: I0122 14:08:35.408334 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g79sd\" (UniqueName: \"kubernetes.io/projected/bc73898a-48ff-4ae9-b3e0-1da31d3f4d43-kube-api-access-g79sd\") pod \"bc73898a-48ff-4ae9-b3e0-1da31d3f4d43\" (UID: \"bc73898a-48ff-4ae9-b3e0-1da31d3f4d43\") " Jan 22 14:08:35 crc kubenswrapper[4743]: I0122 14:08:35.408514 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bc73898a-48ff-4ae9-b3e0-1da31d3f4d43-dns-swift-storage-0\") pod \"bc73898a-48ff-4ae9-b3e0-1da31d3f4d43\" (UID: \"bc73898a-48ff-4ae9-b3e0-1da31d3f4d43\") " Jan 22 14:08:35 crc kubenswrapper[4743]: I0122 14:08:35.416845 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc73898a-48ff-4ae9-b3e0-1da31d3f4d43-kube-api-access-g79sd" (OuterVolumeSpecName: "kube-api-access-g79sd") pod "bc73898a-48ff-4ae9-b3e0-1da31d3f4d43" (UID: "bc73898a-48ff-4ae9-b3e0-1da31d3f4d43"). InnerVolumeSpecName "kube-api-access-g79sd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:08:35 crc kubenswrapper[4743]: I0122 14:08:35.464641 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc73898a-48ff-4ae9-b3e0-1da31d3f4d43-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bc73898a-48ff-4ae9-b3e0-1da31d3f4d43" (UID: "bc73898a-48ff-4ae9-b3e0-1da31d3f4d43"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:08:35 crc kubenswrapper[4743]: I0122 14:08:35.467732 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc73898a-48ff-4ae9-b3e0-1da31d3f4d43-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bc73898a-48ff-4ae9-b3e0-1da31d3f4d43" (UID: "bc73898a-48ff-4ae9-b3e0-1da31d3f4d43"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:08:35 crc kubenswrapper[4743]: I0122 14:08:35.469000 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc73898a-48ff-4ae9-b3e0-1da31d3f4d43-config" (OuterVolumeSpecName: "config") pod "bc73898a-48ff-4ae9-b3e0-1da31d3f4d43" (UID: "bc73898a-48ff-4ae9-b3e0-1da31d3f4d43"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:08:35 crc kubenswrapper[4743]: I0122 14:08:35.472344 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc73898a-48ff-4ae9-b3e0-1da31d3f4d43-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "bc73898a-48ff-4ae9-b3e0-1da31d3f4d43" (UID: "bc73898a-48ff-4ae9-b3e0-1da31d3f4d43"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:08:35 crc kubenswrapper[4743]: I0122 14:08:35.474279 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc73898a-48ff-4ae9-b3e0-1da31d3f4d43-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bc73898a-48ff-4ae9-b3e0-1da31d3f4d43" (UID: "bc73898a-48ff-4ae9-b3e0-1da31d3f4d43"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:08:35 crc kubenswrapper[4743]: I0122 14:08:35.476308 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc73898a-48ff-4ae9-b3e0-1da31d3f4d43-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "bc73898a-48ff-4ae9-b3e0-1da31d3f4d43" (UID: "bc73898a-48ff-4ae9-b3e0-1da31d3f4d43"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:08:35 crc kubenswrapper[4743]: I0122 14:08:35.528390 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bc73898a-48ff-4ae9-b3e0-1da31d3f4d43-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:35 crc kubenswrapper[4743]: I0122 14:08:35.528422 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc73898a-48ff-4ae9-b3e0-1da31d3f4d43-config\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:35 crc kubenswrapper[4743]: I0122 14:08:35.528433 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc73898a-48ff-4ae9-b3e0-1da31d3f4d43-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:35 crc kubenswrapper[4743]: I0122 14:08:35.528443 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g79sd\" (UniqueName: \"kubernetes.io/projected/bc73898a-48ff-4ae9-b3e0-1da31d3f4d43-kube-api-access-g79sd\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:35 crc kubenswrapper[4743]: I0122 14:08:35.528453 4743 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bc73898a-48ff-4ae9-b3e0-1da31d3f4d43-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:35 crc kubenswrapper[4743]: I0122 14:08:35.528461 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bc73898a-48ff-4ae9-b3e0-1da31d3f4d43-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:35 crc kubenswrapper[4743]: I0122 14:08:35.528470 4743 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/bc73898a-48ff-4ae9-b3e0-1da31d3f4d43-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:36 crc kubenswrapper[4743]: I0122 14:08:36.104508 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-s9nzc" event={"ID":"bc73898a-48ff-4ae9-b3e0-1da31d3f4d43","Type":"ContainerDied","Data":"1223f47f3a2cbce082568d3c88b3d4b78d2d301c929a74e4be8459370dfd12c0"} Jan 22 14:08:36 crc kubenswrapper[4743]: I0122 14:08:36.104636 4743 scope.go:117] "RemoveContainer" containerID="40b1275b0d330ce9a55e7a552ae5e75caf3a38588c9dd5c3b41c53e01b703220" Jan 22 14:08:36 crc kubenswrapper[4743]: I0122 14:08:36.104807 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-s9nzc" Jan 22 14:08:36 crc kubenswrapper[4743]: I0122 14:08:36.133303 4743 scope.go:117] "RemoveContainer" containerID="adeb8e674037d29315685a01d9f5a64217734299e718a49419bd1776d1c6aac2" Jan 22 14:08:36 crc kubenswrapper[4743]: I0122 14:08:36.147213 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-s9nzc"] Jan 22 14:08:36 crc kubenswrapper[4743]: I0122 14:08:36.155260 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-s9nzc"] Jan 22 14:08:37 crc kubenswrapper[4743]: I0122 14:08:37.760095 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc73898a-48ff-4ae9-b3e0-1da31d3f4d43" path="/var/lib/kubelet/pods/bc73898a-48ff-4ae9-b3e0-1da31d3f4d43/volumes" Jan 22 14:08:43 crc kubenswrapper[4743]: I0122 14:08:43.640599 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8v7v2"] Jan 22 14:08:43 crc kubenswrapper[4743]: E0122 14:08:43.641259 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc73898a-48ff-4ae9-b3e0-1da31d3f4d43" containerName="init" Jan 22 14:08:43 crc kubenswrapper[4743]: I0122 14:08:43.641272 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc73898a-48ff-4ae9-b3e0-1da31d3f4d43" containerName="init" Jan 22 14:08:43 crc kubenswrapper[4743]: E0122 14:08:43.641288 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5de46ef-12da-4f4c-b19a-4b713069a048" containerName="init" Jan 22 14:08:43 crc kubenswrapper[4743]: I0122 14:08:43.641293 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5de46ef-12da-4f4c-b19a-4b713069a048" containerName="init" Jan 22 14:08:43 crc kubenswrapper[4743]: E0122 14:08:43.641322 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5de46ef-12da-4f4c-b19a-4b713069a048" containerName="dnsmasq-dns" Jan 22 14:08:43 crc kubenswrapper[4743]: I0122 14:08:43.641328 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5de46ef-12da-4f4c-b19a-4b713069a048" containerName="dnsmasq-dns" Jan 22 14:08:43 crc kubenswrapper[4743]: E0122 14:08:43.641345 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc73898a-48ff-4ae9-b3e0-1da31d3f4d43" containerName="dnsmasq-dns" Jan 22 14:08:43 crc kubenswrapper[4743]: I0122 14:08:43.641351 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc73898a-48ff-4ae9-b3e0-1da31d3f4d43" containerName="dnsmasq-dns" Jan 22 14:08:43 crc kubenswrapper[4743]: I0122 14:08:43.641497 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5de46ef-12da-4f4c-b19a-4b713069a048" containerName="dnsmasq-dns" Jan 22 14:08:43 crc kubenswrapper[4743]: I0122 14:08:43.641523 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc73898a-48ff-4ae9-b3e0-1da31d3f4d43" containerName="dnsmasq-dns" Jan 22 14:08:43 crc kubenswrapper[4743]: I0122 14:08:43.642145 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8v7v2" Jan 22 14:08:43 crc kubenswrapper[4743]: I0122 14:08:43.644279 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 22 14:08:43 crc kubenswrapper[4743]: I0122 14:08:43.645189 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 22 14:08:43 crc kubenswrapper[4743]: I0122 14:08:43.645446 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8mcmm" Jan 22 14:08:43 crc kubenswrapper[4743]: I0122 14:08:43.645847 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 22 14:08:43 crc kubenswrapper[4743]: I0122 14:08:43.659818 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8v7v2"] Jan 22 14:08:43 crc kubenswrapper[4743]: I0122 14:08:43.787727 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss8f5\" (UniqueName: \"kubernetes.io/projected/305fa257-7d41-4a05-ae4e-1b945894aa09-kube-api-access-ss8f5\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8v7v2\" (UID: \"305fa257-7d41-4a05-ae4e-1b945894aa09\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8v7v2" Jan 22 14:08:43 crc kubenswrapper[4743]: I0122 14:08:43.788185 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/305fa257-7d41-4a05-ae4e-1b945894aa09-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8v7v2\" (UID: \"305fa257-7d41-4a05-ae4e-1b945894aa09\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8v7v2" Jan 22 14:08:43 crc kubenswrapper[4743]: I0122 14:08:43.788233 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/305fa257-7d41-4a05-ae4e-1b945894aa09-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8v7v2\" (UID: \"305fa257-7d41-4a05-ae4e-1b945894aa09\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8v7v2" Jan 22 14:08:43 crc kubenswrapper[4743]: I0122 14:08:43.789341 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/305fa257-7d41-4a05-ae4e-1b945894aa09-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8v7v2\" (UID: \"305fa257-7d41-4a05-ae4e-1b945894aa09\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8v7v2" Jan 22 14:08:43 crc kubenswrapper[4743]: I0122 14:08:43.891471 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ss8f5\" (UniqueName: \"kubernetes.io/projected/305fa257-7d41-4a05-ae4e-1b945894aa09-kube-api-access-ss8f5\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8v7v2\" (UID: \"305fa257-7d41-4a05-ae4e-1b945894aa09\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8v7v2" Jan 22 14:08:43 crc kubenswrapper[4743]: I0122 14:08:43.891568 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/305fa257-7d41-4a05-ae4e-1b945894aa09-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8v7v2\" (UID: \"305fa257-7d41-4a05-ae4e-1b945894aa09\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8v7v2" Jan 22 14:08:43 crc kubenswrapper[4743]: I0122 14:08:43.891595 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/305fa257-7d41-4a05-ae4e-1b945894aa09-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8v7v2\" (UID: \"305fa257-7d41-4a05-ae4e-1b945894aa09\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8v7v2" Jan 22 14:08:43 crc kubenswrapper[4743]: I0122 14:08:43.891736 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/305fa257-7d41-4a05-ae4e-1b945894aa09-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8v7v2\" (UID: \"305fa257-7d41-4a05-ae4e-1b945894aa09\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8v7v2" Jan 22 14:08:43 crc kubenswrapper[4743]: I0122 14:08:43.898163 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/305fa257-7d41-4a05-ae4e-1b945894aa09-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8v7v2\" (UID: \"305fa257-7d41-4a05-ae4e-1b945894aa09\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8v7v2" Jan 22 14:08:43 crc kubenswrapper[4743]: I0122 14:08:43.898240 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/305fa257-7d41-4a05-ae4e-1b945894aa09-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8v7v2\" (UID: \"305fa257-7d41-4a05-ae4e-1b945894aa09\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8v7v2" Jan 22 14:08:43 crc kubenswrapper[4743]: I0122 14:08:43.899711 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/305fa257-7d41-4a05-ae4e-1b945894aa09-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8v7v2\" (UID: \"305fa257-7d41-4a05-ae4e-1b945894aa09\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8v7v2" Jan 22 14:08:43 crc kubenswrapper[4743]: I0122 14:08:43.916566 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss8f5\" (UniqueName: \"kubernetes.io/projected/305fa257-7d41-4a05-ae4e-1b945894aa09-kube-api-access-ss8f5\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8v7v2\" (UID: \"305fa257-7d41-4a05-ae4e-1b945894aa09\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8v7v2" Jan 22 14:08:43 crc kubenswrapper[4743]: I0122 14:08:43.964426 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8v7v2" Jan 22 14:08:44 crc kubenswrapper[4743]: W0122 14:08:44.530570 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod305fa257_7d41_4a05_ae4e_1b945894aa09.slice/crio-11d1b423452295d70057ac6afb5d30f5e7676a2ca6d025ccf1eb5db4db1c2b91 WatchSource:0}: Error finding container 11d1b423452295d70057ac6afb5d30f5e7676a2ca6d025ccf1eb5db4db1c2b91: Status 404 returned error can't find the container with id 11d1b423452295d70057ac6afb5d30f5e7676a2ca6d025ccf1eb5db4db1c2b91 Jan 22 14:08:44 crc kubenswrapper[4743]: I0122 14:08:44.537291 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8v7v2"] Jan 22 14:08:45 crc kubenswrapper[4743]: I0122 14:08:45.296905 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8v7v2" event={"ID":"305fa257-7d41-4a05-ae4e-1b945894aa09","Type":"ContainerStarted","Data":"11d1b423452295d70057ac6afb5d30f5e7676a2ca6d025ccf1eb5db4db1c2b91"} Jan 22 14:08:47 crc kubenswrapper[4743]: I0122 14:08:47.313479 4743 generic.go:334] "Generic (PLEG): container finished" podID="42446198-84f4-4bee-b50c-1bb5dad2e380" containerID="0e7b784158100c1bdc2afc25e5202a1e8402680b65010d54232bd23e745e1362" exitCode=0 Jan 22 14:08:47 crc kubenswrapper[4743]: I0122 14:08:47.313884 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"42446198-84f4-4bee-b50c-1bb5dad2e380","Type":"ContainerDied","Data":"0e7b784158100c1bdc2afc25e5202a1e8402680b65010d54232bd23e745e1362"} Jan 22 14:08:47 crc kubenswrapper[4743]: I0122 14:08:47.317062 4743 generic.go:334] "Generic (PLEG): container finished" podID="600136f3-db1d-49a2-92a8-0c03aaadc963" containerID="1e31ef8dace5342fbe5f980421a8d262243ec55d65fb6ca43cdd98f949863304" exitCode=0 Jan 22 14:08:47 crc kubenswrapper[4743]: I0122 14:08:47.317098 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"600136f3-db1d-49a2-92a8-0c03aaadc963","Type":"ContainerDied","Data":"1e31ef8dace5342fbe5f980421a8d262243ec55d65fb6ca43cdd98f949863304"} Jan 22 14:08:53 crc kubenswrapper[4743]: I0122 14:08:53.369407 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8v7v2" event={"ID":"305fa257-7d41-4a05-ae4e-1b945894aa09","Type":"ContainerStarted","Data":"1fa6cce47b366dbe9c9e53215674a36ae680bb7423c658dd70d2b455bbb96ec9"} Jan 22 14:08:53 crc kubenswrapper[4743]: I0122 14:08:53.371521 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"600136f3-db1d-49a2-92a8-0c03aaadc963","Type":"ContainerStarted","Data":"0db10254dfa86916023b45da132d494805b45291aa206fc895468c98fd6bd9f5"} Jan 22 14:08:53 crc kubenswrapper[4743]: I0122 14:08:53.371731 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Jan 22 14:08:53 crc kubenswrapper[4743]: I0122 14:08:53.373775 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"42446198-84f4-4bee-b50c-1bb5dad2e380","Type":"ContainerStarted","Data":"f60f9d986f92bf0e3e6997cb34a8771145375c716e022f5544e3bd0adf8cc60c"} Jan 22 14:08:53 crc kubenswrapper[4743]: I0122 14:08:53.374002 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:08:53 crc kubenswrapper[4743]: I0122 14:08:53.394887 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8v7v2" podStartSLOduration=2.071650295 podStartE2EDuration="10.394868464s" podCreationTimestamp="2026-01-22 14:08:43 +0000 UTC" firstStartedPulling="2026-01-22 14:08:44.534224573 +0000 UTC m=+1361.089267736" lastFinishedPulling="2026-01-22 14:08:52.857442742 +0000 UTC m=+1369.412485905" observedRunningTime="2026-01-22 14:08:53.384602855 +0000 UTC m=+1369.939646028" watchObservedRunningTime="2026-01-22 14:08:53.394868464 +0000 UTC m=+1369.949911627" Jan 22 14:08:53 crc kubenswrapper[4743]: I0122 14:08:53.418339 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=41.418322712 podStartE2EDuration="41.418322712s" podCreationTimestamp="2026-01-22 14:08:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:08:53.409914563 +0000 UTC m=+1369.964957726" watchObservedRunningTime="2026-01-22 14:08:53.418322712 +0000 UTC m=+1369.973365865" Jan 22 14:08:53 crc kubenswrapper[4743]: I0122 14:08:53.441488 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=41.441458801 podStartE2EDuration="41.441458801s" podCreationTimestamp="2026-01-22 14:08:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:08:53.439544069 +0000 UTC m=+1369.994587232" watchObservedRunningTime="2026-01-22 14:08:53.441458801 +0000 UTC m=+1369.996501964" Jan 22 14:09:03 crc kubenswrapper[4743]: I0122 14:09:03.254009 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Jan 22 14:09:04 crc kubenswrapper[4743]: I0122 14:09:04.466555 4743 generic.go:334] "Generic (PLEG): container finished" podID="305fa257-7d41-4a05-ae4e-1b945894aa09" containerID="1fa6cce47b366dbe9c9e53215674a36ae680bb7423c658dd70d2b455bbb96ec9" exitCode=0 Jan 22 14:09:04 crc kubenswrapper[4743]: I0122 14:09:04.466727 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8v7v2" event={"ID":"305fa257-7d41-4a05-ae4e-1b945894aa09","Type":"ContainerDied","Data":"1fa6cce47b366dbe9c9e53215674a36ae680bb7423c658dd70d2b455bbb96ec9"} Jan 22 14:09:05 crc kubenswrapper[4743]: I0122 14:09:05.949824 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8v7v2" Jan 22 14:09:06 crc kubenswrapper[4743]: I0122 14:09:06.062384 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/305fa257-7d41-4a05-ae4e-1b945894aa09-repo-setup-combined-ca-bundle\") pod \"305fa257-7d41-4a05-ae4e-1b945894aa09\" (UID: \"305fa257-7d41-4a05-ae4e-1b945894aa09\") " Jan 22 14:09:06 crc kubenswrapper[4743]: I0122 14:09:06.062471 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ss8f5\" (UniqueName: \"kubernetes.io/projected/305fa257-7d41-4a05-ae4e-1b945894aa09-kube-api-access-ss8f5\") pod \"305fa257-7d41-4a05-ae4e-1b945894aa09\" (UID: \"305fa257-7d41-4a05-ae4e-1b945894aa09\") " Jan 22 14:09:06 crc kubenswrapper[4743]: I0122 14:09:06.062512 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/305fa257-7d41-4a05-ae4e-1b945894aa09-inventory\") pod \"305fa257-7d41-4a05-ae4e-1b945894aa09\" (UID: \"305fa257-7d41-4a05-ae4e-1b945894aa09\") " Jan 22 14:09:06 crc kubenswrapper[4743]: I0122 14:09:06.062659 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/305fa257-7d41-4a05-ae4e-1b945894aa09-ssh-key-openstack-edpm-ipam\") pod \"305fa257-7d41-4a05-ae4e-1b945894aa09\" (UID: \"305fa257-7d41-4a05-ae4e-1b945894aa09\") " Jan 22 14:09:06 crc kubenswrapper[4743]: I0122 14:09:06.071113 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/305fa257-7d41-4a05-ae4e-1b945894aa09-kube-api-access-ss8f5" (OuterVolumeSpecName: "kube-api-access-ss8f5") pod "305fa257-7d41-4a05-ae4e-1b945894aa09" (UID: "305fa257-7d41-4a05-ae4e-1b945894aa09"). InnerVolumeSpecName "kube-api-access-ss8f5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:09:06 crc kubenswrapper[4743]: I0122 14:09:06.071411 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/305fa257-7d41-4a05-ae4e-1b945894aa09-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "305fa257-7d41-4a05-ae4e-1b945894aa09" (UID: "305fa257-7d41-4a05-ae4e-1b945894aa09"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:09:06 crc kubenswrapper[4743]: I0122 14:09:06.092718 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/305fa257-7d41-4a05-ae4e-1b945894aa09-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "305fa257-7d41-4a05-ae4e-1b945894aa09" (UID: "305fa257-7d41-4a05-ae4e-1b945894aa09"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:09:06 crc kubenswrapper[4743]: I0122 14:09:06.115746 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/305fa257-7d41-4a05-ae4e-1b945894aa09-inventory" (OuterVolumeSpecName: "inventory") pod "305fa257-7d41-4a05-ae4e-1b945894aa09" (UID: "305fa257-7d41-4a05-ae4e-1b945894aa09"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:09:06 crc kubenswrapper[4743]: I0122 14:09:06.165541 4743 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/305fa257-7d41-4a05-ae4e-1b945894aa09-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 14:09:06 crc kubenswrapper[4743]: I0122 14:09:06.165578 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ss8f5\" (UniqueName: \"kubernetes.io/projected/305fa257-7d41-4a05-ae4e-1b945894aa09-kube-api-access-ss8f5\") on node \"crc\" DevicePath \"\"" Jan 22 14:09:06 crc kubenswrapper[4743]: I0122 14:09:06.165591 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/305fa257-7d41-4a05-ae4e-1b945894aa09-inventory\") on node \"crc\" DevicePath \"\"" Jan 22 14:09:06 crc kubenswrapper[4743]: I0122 14:09:06.165599 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/305fa257-7d41-4a05-ae4e-1b945894aa09-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 22 14:09:06 crc kubenswrapper[4743]: I0122 14:09:06.491140 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8v7v2" event={"ID":"305fa257-7d41-4a05-ae4e-1b945894aa09","Type":"ContainerDied","Data":"11d1b423452295d70057ac6afb5d30f5e7676a2ca6d025ccf1eb5db4db1c2b91"} Jan 22 14:09:06 crc kubenswrapper[4743]: I0122 14:09:06.491186 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11d1b423452295d70057ac6afb5d30f5e7676a2ca6d025ccf1eb5db4db1c2b91" Jan 22 14:09:06 crc kubenswrapper[4743]: I0122 14:09:06.491442 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8v7v2" Jan 22 14:09:06 crc kubenswrapper[4743]: I0122 14:09:06.555427 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-nr24d"] Jan 22 14:09:06 crc kubenswrapper[4743]: E0122 14:09:06.556140 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="305fa257-7d41-4a05-ae4e-1b945894aa09" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jan 22 14:09:06 crc kubenswrapper[4743]: I0122 14:09:06.556172 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="305fa257-7d41-4a05-ae4e-1b945894aa09" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jan 22 14:09:06 crc kubenswrapper[4743]: I0122 14:09:06.556526 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="305fa257-7d41-4a05-ae4e-1b945894aa09" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jan 22 14:09:06 crc kubenswrapper[4743]: I0122 14:09:06.557548 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-nr24d" Jan 22 14:09:06 crc kubenswrapper[4743]: I0122 14:09:06.560211 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 22 14:09:06 crc kubenswrapper[4743]: I0122 14:09:06.560216 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 22 14:09:06 crc kubenswrapper[4743]: I0122 14:09:06.560388 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 22 14:09:06 crc kubenswrapper[4743]: I0122 14:09:06.560707 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8mcmm" Jan 22 14:09:06 crc kubenswrapper[4743]: I0122 14:09:06.565609 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-nr24d"] Jan 22 14:09:06 crc kubenswrapper[4743]: I0122 14:09:06.673627 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/92bb5b08-555d-4d1b-b105-e7cf240f190b-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-nr24d\" (UID: \"92bb5b08-555d-4d1b-b105-e7cf240f190b\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-nr24d" Jan 22 14:09:06 crc kubenswrapper[4743]: I0122 14:09:06.673721 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/92bb5b08-555d-4d1b-b105-e7cf240f190b-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-nr24d\" (UID: \"92bb5b08-555d-4d1b-b105-e7cf240f190b\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-nr24d" Jan 22 14:09:06 crc kubenswrapper[4743]: I0122 14:09:06.673779 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87zzc\" (UniqueName: \"kubernetes.io/projected/92bb5b08-555d-4d1b-b105-e7cf240f190b-kube-api-access-87zzc\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-nr24d\" (UID: \"92bb5b08-555d-4d1b-b105-e7cf240f190b\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-nr24d" Jan 22 14:09:06 crc kubenswrapper[4743]: I0122 14:09:06.775181 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87zzc\" (UniqueName: \"kubernetes.io/projected/92bb5b08-555d-4d1b-b105-e7cf240f190b-kube-api-access-87zzc\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-nr24d\" (UID: \"92bb5b08-555d-4d1b-b105-e7cf240f190b\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-nr24d" Jan 22 14:09:06 crc kubenswrapper[4743]: I0122 14:09:06.775277 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/92bb5b08-555d-4d1b-b105-e7cf240f190b-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-nr24d\" (UID: \"92bb5b08-555d-4d1b-b105-e7cf240f190b\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-nr24d" Jan 22 14:09:06 crc kubenswrapper[4743]: I0122 14:09:06.775354 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/92bb5b08-555d-4d1b-b105-e7cf240f190b-ssh-key-openstack-edpm-ipam\") pod 
\"redhat-edpm-deployment-openstack-edpm-ipam-nr24d\" (UID: \"92bb5b08-555d-4d1b-b105-e7cf240f190b\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-nr24d" Jan 22 14:09:06 crc kubenswrapper[4743]: I0122 14:09:06.779104 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/92bb5b08-555d-4d1b-b105-e7cf240f190b-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-nr24d\" (UID: \"92bb5b08-555d-4d1b-b105-e7cf240f190b\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-nr24d" Jan 22 14:09:06 crc kubenswrapper[4743]: I0122 14:09:06.779124 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/92bb5b08-555d-4d1b-b105-e7cf240f190b-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-nr24d\" (UID: \"92bb5b08-555d-4d1b-b105-e7cf240f190b\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-nr24d" Jan 22 14:09:06 crc kubenswrapper[4743]: I0122 14:09:06.793134 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87zzc\" (UniqueName: \"kubernetes.io/projected/92bb5b08-555d-4d1b-b105-e7cf240f190b-kube-api-access-87zzc\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-nr24d\" (UID: \"92bb5b08-555d-4d1b-b105-e7cf240f190b\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-nr24d" Jan 22 14:09:06 crc kubenswrapper[4743]: I0122 14:09:06.882522 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-nr24d" Jan 22 14:09:07 crc kubenswrapper[4743]: I0122 14:09:07.404297 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-nr24d"] Jan 22 14:09:07 crc kubenswrapper[4743]: I0122 14:09:07.513195 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-nr24d" event={"ID":"92bb5b08-555d-4d1b-b105-e7cf240f190b","Type":"ContainerStarted","Data":"75a2ed12e043c010cfefad3e27f91de7abce7429f359b18ec6f1f7d356b002d8"} Jan 22 14:09:08 crc kubenswrapper[4743]: I0122 14:09:08.527703 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-nr24d" event={"ID":"92bb5b08-555d-4d1b-b105-e7cf240f190b","Type":"ContainerStarted","Data":"553cec335eb929b52b1a3825528a5610eb68847f16c6eaa048f43b0de71347bb"} Jan 22 14:09:08 crc kubenswrapper[4743]: I0122 14:09:08.550407 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-nr24d" podStartSLOduration=2.07883165 podStartE2EDuration="2.550378022s" podCreationTimestamp="2026-01-22 14:09:06 +0000 UTC" firstStartedPulling="2026-01-22 14:09:07.42163907 +0000 UTC m=+1383.976682233" lastFinishedPulling="2026-01-22 14:09:07.893185442 +0000 UTC m=+1384.448228605" observedRunningTime="2026-01-22 14:09:08.544743969 +0000 UTC m=+1385.099787142" watchObservedRunningTime="2026-01-22 14:09:08.550378022 +0000 UTC m=+1385.105421195" Jan 22 14:09:11 crc kubenswrapper[4743]: I0122 14:09:11.566611 4743 generic.go:334] "Generic (PLEG): container finished" podID="92bb5b08-555d-4d1b-b105-e7cf240f190b" containerID="553cec335eb929b52b1a3825528a5610eb68847f16c6eaa048f43b0de71347bb" exitCode=0 Jan 22 14:09:11 crc kubenswrapper[4743]: I0122 14:09:11.566736 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-nr24d" event={"ID":"92bb5b08-555d-4d1b-b105-e7cf240f190b","Type":"ContainerDied","Data":"553cec335eb929b52b1a3825528a5610eb68847f16c6eaa048f43b0de71347bb"} Jan 22 14:09:12 crc kubenswrapper[4743]: I0122 14:09:12.838945 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Jan 22 14:09:13 crc kubenswrapper[4743]: I0122 14:09:13.010965 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-nr24d" Jan 22 14:09:13 crc kubenswrapper[4743]: I0122 14:09:13.112681 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/92bb5b08-555d-4d1b-b105-e7cf240f190b-inventory\") pod \"92bb5b08-555d-4d1b-b105-e7cf240f190b\" (UID: \"92bb5b08-555d-4d1b-b105-e7cf240f190b\") " Jan 22 14:09:13 crc kubenswrapper[4743]: I0122 14:09:13.112741 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87zzc\" (UniqueName: \"kubernetes.io/projected/92bb5b08-555d-4d1b-b105-e7cf240f190b-kube-api-access-87zzc\") pod \"92bb5b08-555d-4d1b-b105-e7cf240f190b\" (UID: \"92bb5b08-555d-4d1b-b105-e7cf240f190b\") " Jan 22 14:09:13 crc kubenswrapper[4743]: I0122 14:09:13.112802 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/92bb5b08-555d-4d1b-b105-e7cf240f190b-ssh-key-openstack-edpm-ipam\") pod \"92bb5b08-555d-4d1b-b105-e7cf240f190b\" (UID: \"92bb5b08-555d-4d1b-b105-e7cf240f190b\") " Jan 22 14:09:13 crc kubenswrapper[4743]: I0122 14:09:13.149398 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92bb5b08-555d-4d1b-b105-e7cf240f190b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "92bb5b08-555d-4d1b-b105-e7cf240f190b" (UID: "92bb5b08-555d-4d1b-b105-e7cf240f190b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:09:13 crc kubenswrapper[4743]: I0122 14:09:13.152695 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92bb5b08-555d-4d1b-b105-e7cf240f190b-kube-api-access-87zzc" (OuterVolumeSpecName: "kube-api-access-87zzc") pod "92bb5b08-555d-4d1b-b105-e7cf240f190b" (UID: "92bb5b08-555d-4d1b-b105-e7cf240f190b"). InnerVolumeSpecName "kube-api-access-87zzc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:09:13 crc kubenswrapper[4743]: I0122 14:09:13.156385 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92bb5b08-555d-4d1b-b105-e7cf240f190b-inventory" (OuterVolumeSpecName: "inventory") pod "92bb5b08-555d-4d1b-b105-e7cf240f190b" (UID: "92bb5b08-555d-4d1b-b105-e7cf240f190b"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:09:13 crc kubenswrapper[4743]: I0122 14:09:13.214935 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/92bb5b08-555d-4d1b-b105-e7cf240f190b-inventory\") on node \"crc\" DevicePath \"\"" Jan 22 14:09:13 crc kubenswrapper[4743]: I0122 14:09:13.215224 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87zzc\" (UniqueName: \"kubernetes.io/projected/92bb5b08-555d-4d1b-b105-e7cf240f190b-kube-api-access-87zzc\") on node \"crc\" DevicePath \"\"" Jan 22 14:09:13 crc kubenswrapper[4743]: I0122 14:09:13.215235 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/92bb5b08-555d-4d1b-b105-e7cf240f190b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 22 14:09:13 crc kubenswrapper[4743]: I0122 14:09:13.586642 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-nr24d" event={"ID":"92bb5b08-555d-4d1b-b105-e7cf240f190b","Type":"ContainerDied","Data":"75a2ed12e043c010cfefad3e27f91de7abce7429f359b18ec6f1f7d356b002d8"} Jan 22 14:09:13 crc kubenswrapper[4743]: I0122 14:09:13.586672 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-nr24d" Jan 22 14:09:13 crc kubenswrapper[4743]: I0122 14:09:13.586686 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75a2ed12e043c010cfefad3e27f91de7abce7429f359b18ec6f1f7d356b002d8" Jan 22 14:09:13 crc kubenswrapper[4743]: I0122 14:09:13.723474 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v4vc6"] Jan 22 14:09:13 crc kubenswrapper[4743]: E0122 14:09:13.724256 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92bb5b08-555d-4d1b-b105-e7cf240f190b" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Jan 22 14:09:13 crc kubenswrapper[4743]: I0122 14:09:13.724296 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="92bb5b08-555d-4d1b-b105-e7cf240f190b" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Jan 22 14:09:13 crc kubenswrapper[4743]: I0122 14:09:13.724753 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="92bb5b08-555d-4d1b-b105-e7cf240f190b" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Jan 22 14:09:13 crc kubenswrapper[4743]: I0122 14:09:13.725988 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v4vc6" Jan 22 14:09:13 crc kubenswrapper[4743]: I0122 14:09:13.731659 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8mcmm" Jan 22 14:09:13 crc kubenswrapper[4743]: I0122 14:09:13.731659 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 22 14:09:13 crc kubenswrapper[4743]: I0122 14:09:13.732451 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 22 14:09:13 crc kubenswrapper[4743]: I0122 14:09:13.732639 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 22 14:09:13 crc kubenswrapper[4743]: I0122 14:09:13.733860 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v4vc6"] Jan 22 14:09:13 crc kubenswrapper[4743]: I0122 14:09:13.826094 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33d8b498-a76a-4549-96c2-f32877beaa30-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-v4vc6\" (UID: \"33d8b498-a76a-4549-96c2-f32877beaa30\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v4vc6" Jan 22 14:09:13 crc kubenswrapper[4743]: I0122 14:09:13.826137 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9r48c\" (UniqueName: \"kubernetes.io/projected/33d8b498-a76a-4549-96c2-f32877beaa30-kube-api-access-9r48c\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-v4vc6\" (UID: \"33d8b498-a76a-4549-96c2-f32877beaa30\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v4vc6" Jan 22 14:09:13 crc kubenswrapper[4743]: I0122 14:09:13.826157 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/33d8b498-a76a-4549-96c2-f32877beaa30-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-v4vc6\" (UID: \"33d8b498-a76a-4549-96c2-f32877beaa30\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v4vc6" Jan 22 14:09:13 crc kubenswrapper[4743]: I0122 14:09:13.826184 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/33d8b498-a76a-4549-96c2-f32877beaa30-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-v4vc6\" (UID: \"33d8b498-a76a-4549-96c2-f32877beaa30\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v4vc6" Jan 22 14:09:13 crc kubenswrapper[4743]: I0122 14:09:13.929469 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33d8b498-a76a-4549-96c2-f32877beaa30-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-v4vc6\" (UID: \"33d8b498-a76a-4549-96c2-f32877beaa30\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v4vc6" Jan 22 14:09:13 crc kubenswrapper[4743]: I0122 14:09:13.929532 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9r48c\" (UniqueName: 
\"kubernetes.io/projected/33d8b498-a76a-4549-96c2-f32877beaa30-kube-api-access-9r48c\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-v4vc6\" (UID: \"33d8b498-a76a-4549-96c2-f32877beaa30\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v4vc6" Jan 22 14:09:13 crc kubenswrapper[4743]: I0122 14:09:13.929561 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/33d8b498-a76a-4549-96c2-f32877beaa30-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-v4vc6\" (UID: \"33d8b498-a76a-4549-96c2-f32877beaa30\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v4vc6" Jan 22 14:09:13 crc kubenswrapper[4743]: I0122 14:09:13.929587 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/33d8b498-a76a-4549-96c2-f32877beaa30-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-v4vc6\" (UID: \"33d8b498-a76a-4549-96c2-f32877beaa30\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v4vc6" Jan 22 14:09:13 crc kubenswrapper[4743]: I0122 14:09:13.935099 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/33d8b498-a76a-4549-96c2-f32877beaa30-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-v4vc6\" (UID: \"33d8b498-a76a-4549-96c2-f32877beaa30\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v4vc6" Jan 22 14:09:13 crc kubenswrapper[4743]: I0122 14:09:13.935567 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33d8b498-a76a-4549-96c2-f32877beaa30-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-v4vc6\" (UID: \"33d8b498-a76a-4549-96c2-f32877beaa30\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v4vc6" Jan 22 14:09:13 crc kubenswrapper[4743]: I0122 14:09:13.951550 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/33d8b498-a76a-4549-96c2-f32877beaa30-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-v4vc6\" (UID: \"33d8b498-a76a-4549-96c2-f32877beaa30\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v4vc6" Jan 22 14:09:13 crc kubenswrapper[4743]: I0122 14:09:13.952303 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9r48c\" (UniqueName: \"kubernetes.io/projected/33d8b498-a76a-4549-96c2-f32877beaa30-kube-api-access-9r48c\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-v4vc6\" (UID: \"33d8b498-a76a-4549-96c2-f32877beaa30\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v4vc6" Jan 22 14:09:14 crc kubenswrapper[4743]: I0122 14:09:14.056332 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v4vc6" Jan 22 14:09:14 crc kubenswrapper[4743]: W0122 14:09:14.618360 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33d8b498_a76a_4549_96c2_f32877beaa30.slice/crio-cc6cbbb9921f87444a665946bf597566b9b6542c23f8485914ee74a97a2883be WatchSource:0}: Error finding container cc6cbbb9921f87444a665946bf597566b9b6542c23f8485914ee74a97a2883be: Status 404 returned error can't find the container with id cc6cbbb9921f87444a665946bf597566b9b6542c23f8485914ee74a97a2883be Jan 22 14:09:14 crc kubenswrapper[4743]: I0122 14:09:14.630728 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v4vc6"] Jan 22 14:09:15 crc kubenswrapper[4743]: I0122 14:09:15.629759 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v4vc6" event={"ID":"33d8b498-a76a-4549-96c2-f32877beaa30","Type":"ContainerStarted","Data":"a0901979f36fec8783443941bfbb6da623c1616ee3524e2e567d7bf07b60a997"} Jan 22 14:09:15 crc kubenswrapper[4743]: I0122 14:09:15.630332 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v4vc6" event={"ID":"33d8b498-a76a-4549-96c2-f32877beaa30","Type":"ContainerStarted","Data":"cc6cbbb9921f87444a665946bf597566b9b6542c23f8485914ee74a97a2883be"} Jan 22 14:09:15 crc kubenswrapper[4743]: I0122 14:09:15.658279 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v4vc6" podStartSLOduration=2.22085305 podStartE2EDuration="2.658225703s" podCreationTimestamp="2026-01-22 14:09:13 +0000 UTC" firstStartedPulling="2026-01-22 14:09:14.622255614 +0000 UTC m=+1391.177298777" lastFinishedPulling="2026-01-22 14:09:15.059628257 +0000 UTC m=+1391.614671430" observedRunningTime="2026-01-22 14:09:15.644668935 +0000 UTC m=+1392.199712098" watchObservedRunningTime="2026-01-22 14:09:15.658225703 +0000 UTC m=+1392.213268876" Jan 22 14:09:32 crc kubenswrapper[4743]: I0122 14:09:32.577833 4743 scope.go:117] "RemoveContainer" containerID="99d925b658db1a220feca7152975bba687878df479c6ef533b3ce009a48a6da2" Jan 22 14:09:32 crc kubenswrapper[4743]: I0122 14:09:32.599427 4743 scope.go:117] "RemoveContainer" containerID="acbdd7bbabf7d69152e3949120befe9f3d15d60dd174301ea887d2e38915619f" Jan 22 14:10:00 crc kubenswrapper[4743]: I0122 14:10:00.048695 4743 patch_prober.go:28] interesting pod/machine-config-daemon-hqgk7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 14:10:00 crc kubenswrapper[4743]: I0122 14:10:00.049257 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 14:10:30 crc kubenswrapper[4743]: I0122 14:10:30.049387 4743 patch_prober.go:28] interesting pod/machine-config-daemon-hqgk7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 14:10:30 crc kubenswrapper[4743]: I0122 14:10:30.050037 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 14:10:32 crc kubenswrapper[4743]: I0122 14:10:32.710131 4743 scope.go:117] "RemoveContainer" containerID="46eea21368dd54b81c103d7cd5b77b39db2dd3d45d5f81b1cb894a0b3d6ab5ba" Jan 22 14:10:32 crc kubenswrapper[4743]: I0122 14:10:32.751353 4743 scope.go:117] "RemoveContainer" containerID="9ab45a420601c4e026d9738a82fe91d9d9e16553b71d6ce00c3644df613bbad9" Jan 22 14:11:00 crc kubenswrapper[4743]: I0122 14:11:00.049465 4743 patch_prober.go:28] interesting pod/machine-config-daemon-hqgk7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 14:11:00 crc kubenswrapper[4743]: I0122 14:11:00.051180 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 14:11:00 crc kubenswrapper[4743]: I0122 14:11:00.051376 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" Jan 22 14:11:00 crc kubenswrapper[4743]: I0122 14:11:00.052573 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1e51a1e4567f044d1ba62e15ca7baaab6948208e9d675c95afad01e5e442ca14"} pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 22 14:11:00 crc kubenswrapper[4743]: I0122 14:11:00.052763 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerName="machine-config-daemon" containerID="cri-o://1e51a1e4567f044d1ba62e15ca7baaab6948208e9d675c95afad01e5e442ca14" gracePeriod=600 Jan 22 14:11:00 crc kubenswrapper[4743]: E0122 14:11:00.171485 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:11:00 crc kubenswrapper[4743]: I0122 14:11:00.810932 4743 generic.go:334] "Generic (PLEG): container finished" podID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerID="1e51a1e4567f044d1ba62e15ca7baaab6948208e9d675c95afad01e5e442ca14" exitCode=0 Jan 22 14:11:00 crc kubenswrapper[4743]: I0122 14:11:00.810974 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" 
event={"ID":"3aeba9ba-3a5a-4885-8540-d295aadb311b","Type":"ContainerDied","Data":"1e51a1e4567f044d1ba62e15ca7baaab6948208e9d675c95afad01e5e442ca14"} Jan 22 14:11:00 crc kubenswrapper[4743]: I0122 14:11:00.811262 4743 scope.go:117] "RemoveContainer" containerID="c01bf0abae2b92d5822357a1785b503f9bc33cb24f77d7df7d49f837030ef253" Jan 22 14:11:00 crc kubenswrapper[4743]: I0122 14:11:00.812215 4743 scope.go:117] "RemoveContainer" containerID="1e51a1e4567f044d1ba62e15ca7baaab6948208e9d675c95afad01e5e442ca14" Jan 22 14:11:00 crc kubenswrapper[4743]: E0122 14:11:00.815081 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:11:12 crc kubenswrapper[4743]: I0122 14:11:12.747045 4743 scope.go:117] "RemoveContainer" containerID="1e51a1e4567f044d1ba62e15ca7baaab6948208e9d675c95afad01e5e442ca14" Jan 22 14:11:12 crc kubenswrapper[4743]: E0122 14:11:12.747815 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:11:15 crc kubenswrapper[4743]: I0122 14:11:15.066121 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zncx6"] Jan 22 14:11:15 crc kubenswrapper[4743]: I0122 14:11:15.070146 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zncx6" Jan 22 14:11:15 crc kubenswrapper[4743]: I0122 14:11:15.080366 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zncx6"] Jan 22 14:11:15 crc kubenswrapper[4743]: I0122 14:11:15.126043 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b0312cc-f517-47e9-a8b7-b7a02929c7f7-catalog-content\") pod \"community-operators-zncx6\" (UID: \"8b0312cc-f517-47e9-a8b7-b7a02929c7f7\") " pod="openshift-marketplace/community-operators-zncx6" Jan 22 14:11:15 crc kubenswrapper[4743]: I0122 14:11:15.126103 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mf8kw\" (UniqueName: \"kubernetes.io/projected/8b0312cc-f517-47e9-a8b7-b7a02929c7f7-kube-api-access-mf8kw\") pod \"community-operators-zncx6\" (UID: \"8b0312cc-f517-47e9-a8b7-b7a02929c7f7\") " pod="openshift-marketplace/community-operators-zncx6" Jan 22 14:11:15 crc kubenswrapper[4743]: I0122 14:11:15.126208 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b0312cc-f517-47e9-a8b7-b7a02929c7f7-utilities\") pod \"community-operators-zncx6\" (UID: \"8b0312cc-f517-47e9-a8b7-b7a02929c7f7\") " pod="openshift-marketplace/community-operators-zncx6" Jan 22 14:11:15 crc kubenswrapper[4743]: I0122 14:11:15.228151 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b0312cc-f517-47e9-a8b7-b7a02929c7f7-catalog-content\") pod \"community-operators-zncx6\" (UID: \"8b0312cc-f517-47e9-a8b7-b7a02929c7f7\") " pod="openshift-marketplace/community-operators-zncx6" Jan 22 14:11:15 crc kubenswrapper[4743]: I0122 14:11:15.228207 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mf8kw\" (UniqueName: \"kubernetes.io/projected/8b0312cc-f517-47e9-a8b7-b7a02929c7f7-kube-api-access-mf8kw\") pod \"community-operators-zncx6\" (UID: \"8b0312cc-f517-47e9-a8b7-b7a02929c7f7\") " pod="openshift-marketplace/community-operators-zncx6" Jan 22 14:11:15 crc kubenswrapper[4743]: I0122 14:11:15.228273 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b0312cc-f517-47e9-a8b7-b7a02929c7f7-utilities\") pod \"community-operators-zncx6\" (UID: \"8b0312cc-f517-47e9-a8b7-b7a02929c7f7\") " pod="openshift-marketplace/community-operators-zncx6" Jan 22 14:11:15 crc kubenswrapper[4743]: I0122 14:11:15.228879 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b0312cc-f517-47e9-a8b7-b7a02929c7f7-catalog-content\") pod \"community-operators-zncx6\" (UID: \"8b0312cc-f517-47e9-a8b7-b7a02929c7f7\") " pod="openshift-marketplace/community-operators-zncx6" Jan 22 14:11:15 crc kubenswrapper[4743]: I0122 14:11:15.228898 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b0312cc-f517-47e9-a8b7-b7a02929c7f7-utilities\") pod \"community-operators-zncx6\" (UID: \"8b0312cc-f517-47e9-a8b7-b7a02929c7f7\") " pod="openshift-marketplace/community-operators-zncx6" Jan 22 14:11:15 crc kubenswrapper[4743]: I0122 14:11:15.253877 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-mf8kw\" (UniqueName: \"kubernetes.io/projected/8b0312cc-f517-47e9-a8b7-b7a02929c7f7-kube-api-access-mf8kw\") pod \"community-operators-zncx6\" (UID: \"8b0312cc-f517-47e9-a8b7-b7a02929c7f7\") " pod="openshift-marketplace/community-operators-zncx6" Jan 22 14:11:15 crc kubenswrapper[4743]: I0122 14:11:15.406595 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zncx6" Jan 22 14:11:15 crc kubenswrapper[4743]: I0122 14:11:15.839066 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zncx6"] Jan 22 14:11:15 crc kubenswrapper[4743]: W0122 14:11:15.853422 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b0312cc_f517_47e9_a8b7_b7a02929c7f7.slice/crio-d3acf3edde5b0d2f05b1dde0fd5bb9de6518c7e9bf4da01f69defa437daa20ff WatchSource:0}: Error finding container d3acf3edde5b0d2f05b1dde0fd5bb9de6518c7e9bf4da01f69defa437daa20ff: Status 404 returned error can't find the container with id d3acf3edde5b0d2f05b1dde0fd5bb9de6518c7e9bf4da01f69defa437daa20ff Jan 22 14:11:15 crc kubenswrapper[4743]: I0122 14:11:15.949235 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zncx6" event={"ID":"8b0312cc-f517-47e9-a8b7-b7a02929c7f7","Type":"ContainerStarted","Data":"d3acf3edde5b0d2f05b1dde0fd5bb9de6518c7e9bf4da01f69defa437daa20ff"} Jan 22 14:11:16 crc kubenswrapper[4743]: I0122 14:11:16.958649 4743 generic.go:334] "Generic (PLEG): container finished" podID="8b0312cc-f517-47e9-a8b7-b7a02929c7f7" containerID="3b94a88e31beba3bec1d3184da11455f96ebda11aa9e096e00788ae0f1c076a1" exitCode=0 Jan 22 14:11:16 crc kubenswrapper[4743]: I0122 14:11:16.958705 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zncx6" event={"ID":"8b0312cc-f517-47e9-a8b7-b7a02929c7f7","Type":"ContainerDied","Data":"3b94a88e31beba3bec1d3184da11455f96ebda11aa9e096e00788ae0f1c076a1"} Jan 22 14:11:17 crc kubenswrapper[4743]: I0122 14:11:17.969856 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zncx6" event={"ID":"8b0312cc-f517-47e9-a8b7-b7a02929c7f7","Type":"ContainerStarted","Data":"f0369d0244cae0544854e3a5c7df62ab26691bbbd784876ab094337f9ba8dd43"} Jan 22 14:11:18 crc kubenswrapper[4743]: I0122 14:11:18.981958 4743 generic.go:334] "Generic (PLEG): container finished" podID="8b0312cc-f517-47e9-a8b7-b7a02929c7f7" containerID="f0369d0244cae0544854e3a5c7df62ab26691bbbd784876ab094337f9ba8dd43" exitCode=0 Jan 22 14:11:18 crc kubenswrapper[4743]: I0122 14:11:18.982017 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zncx6" event={"ID":"8b0312cc-f517-47e9-a8b7-b7a02929c7f7","Type":"ContainerDied","Data":"f0369d0244cae0544854e3a5c7df62ab26691bbbd784876ab094337f9ba8dd43"} Jan 22 14:11:20 crc kubenswrapper[4743]: I0122 14:11:20.998862 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zncx6" event={"ID":"8b0312cc-f517-47e9-a8b7-b7a02929c7f7","Type":"ContainerStarted","Data":"e6dcee7f9e12f57a30fe43e511796ffc50f2b7a9959bcb3eb0590281f05b377c"} Jan 22 14:11:21 crc kubenswrapper[4743]: I0122 14:11:21.023723 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zncx6" 
podStartSLOduration=2.671362276 podStartE2EDuration="6.023661888s" podCreationTimestamp="2026-01-22 14:11:15 +0000 UTC" firstStartedPulling="2026-01-22 14:11:16.960909528 +0000 UTC m=+1513.515952691" lastFinishedPulling="2026-01-22 14:11:20.31320914 +0000 UTC m=+1516.868252303" observedRunningTime="2026-01-22 14:11:21.016556047 +0000 UTC m=+1517.571599220" watchObservedRunningTime="2026-01-22 14:11:21.023661888 +0000 UTC m=+1517.578705051" Jan 22 14:11:23 crc kubenswrapper[4743]: I0122 14:11:23.878998 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ggtr8"] Jan 22 14:11:23 crc kubenswrapper[4743]: I0122 14:11:23.881718 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ggtr8" Jan 22 14:11:23 crc kubenswrapper[4743]: I0122 14:11:23.896081 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ggtr8"] Jan 22 14:11:24 crc kubenswrapper[4743]: I0122 14:11:24.007931 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7753ce68-14e4-4265-a576-2a38a48523e9-utilities\") pod \"redhat-marketplace-ggtr8\" (UID: \"7753ce68-14e4-4265-a576-2a38a48523e9\") " pod="openshift-marketplace/redhat-marketplace-ggtr8" Jan 22 14:11:24 crc kubenswrapper[4743]: I0122 14:11:24.008099 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7753ce68-14e4-4265-a576-2a38a48523e9-catalog-content\") pod \"redhat-marketplace-ggtr8\" (UID: \"7753ce68-14e4-4265-a576-2a38a48523e9\") " pod="openshift-marketplace/redhat-marketplace-ggtr8" Jan 22 14:11:24 crc kubenswrapper[4743]: I0122 14:11:24.008144 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sh26j\" (UniqueName: \"kubernetes.io/projected/7753ce68-14e4-4265-a576-2a38a48523e9-kube-api-access-sh26j\") pod \"redhat-marketplace-ggtr8\" (UID: \"7753ce68-14e4-4265-a576-2a38a48523e9\") " pod="openshift-marketplace/redhat-marketplace-ggtr8" Jan 22 14:11:24 crc kubenswrapper[4743]: I0122 14:11:24.110266 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sh26j\" (UniqueName: \"kubernetes.io/projected/7753ce68-14e4-4265-a576-2a38a48523e9-kube-api-access-sh26j\") pod \"redhat-marketplace-ggtr8\" (UID: \"7753ce68-14e4-4265-a576-2a38a48523e9\") " pod="openshift-marketplace/redhat-marketplace-ggtr8" Jan 22 14:11:24 crc kubenswrapper[4743]: I0122 14:11:24.110397 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7753ce68-14e4-4265-a576-2a38a48523e9-utilities\") pod \"redhat-marketplace-ggtr8\" (UID: \"7753ce68-14e4-4265-a576-2a38a48523e9\") " pod="openshift-marketplace/redhat-marketplace-ggtr8" Jan 22 14:11:24 crc kubenswrapper[4743]: I0122 14:11:24.110504 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7753ce68-14e4-4265-a576-2a38a48523e9-catalog-content\") pod \"redhat-marketplace-ggtr8\" (UID: \"7753ce68-14e4-4265-a576-2a38a48523e9\") " pod="openshift-marketplace/redhat-marketplace-ggtr8" Jan 22 14:11:24 crc kubenswrapper[4743]: I0122 14:11:24.111013 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/7753ce68-14e4-4265-a576-2a38a48523e9-utilities\") pod \"redhat-marketplace-ggtr8\" (UID: \"7753ce68-14e4-4265-a576-2a38a48523e9\") " pod="openshift-marketplace/redhat-marketplace-ggtr8" Jan 22 14:11:24 crc kubenswrapper[4743]: I0122 14:11:24.111032 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7753ce68-14e4-4265-a576-2a38a48523e9-catalog-content\") pod \"redhat-marketplace-ggtr8\" (UID: \"7753ce68-14e4-4265-a576-2a38a48523e9\") " pod="openshift-marketplace/redhat-marketplace-ggtr8" Jan 22 14:11:24 crc kubenswrapper[4743]: I0122 14:11:24.130985 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sh26j\" (UniqueName: \"kubernetes.io/projected/7753ce68-14e4-4265-a576-2a38a48523e9-kube-api-access-sh26j\") pod \"redhat-marketplace-ggtr8\" (UID: \"7753ce68-14e4-4265-a576-2a38a48523e9\") " pod="openshift-marketplace/redhat-marketplace-ggtr8" Jan 22 14:11:24 crc kubenswrapper[4743]: I0122 14:11:24.204957 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ggtr8" Jan 22 14:11:24 crc kubenswrapper[4743]: I0122 14:11:24.696601 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ggtr8"] Jan 22 14:11:24 crc kubenswrapper[4743]: I0122 14:11:24.747660 4743 scope.go:117] "RemoveContainer" containerID="1e51a1e4567f044d1ba62e15ca7baaab6948208e9d675c95afad01e5e442ca14" Jan 22 14:11:24 crc kubenswrapper[4743]: E0122 14:11:24.747929 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:11:25 crc kubenswrapper[4743]: I0122 14:11:25.033395 4743 generic.go:334] "Generic (PLEG): container finished" podID="7753ce68-14e4-4265-a576-2a38a48523e9" containerID="62cc01f78cb588ec0f859b93bbbdacbc886daf26a8eeffdd738380e8eb9208dc" exitCode=0 Jan 22 14:11:25 crc kubenswrapper[4743]: I0122 14:11:25.033443 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ggtr8" event={"ID":"7753ce68-14e4-4265-a576-2a38a48523e9","Type":"ContainerDied","Data":"62cc01f78cb588ec0f859b93bbbdacbc886daf26a8eeffdd738380e8eb9208dc"} Jan 22 14:11:25 crc kubenswrapper[4743]: I0122 14:11:25.033469 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ggtr8" event={"ID":"7753ce68-14e4-4265-a576-2a38a48523e9","Type":"ContainerStarted","Data":"4177b986c50c54637e4982504401b09ea94943e5672f3ce235d0e0d86064f78d"} Jan 22 14:11:25 crc kubenswrapper[4743]: I0122 14:11:25.407396 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zncx6" Jan 22 14:11:25 crc kubenswrapper[4743]: I0122 14:11:25.407752 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zncx6" Jan 22 14:11:25 crc kubenswrapper[4743]: I0122 14:11:25.453984 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zncx6" Jan 22 14:11:26 crc 
kubenswrapper[4743]: I0122 14:11:26.045154 4743 generic.go:334] "Generic (PLEG): container finished" podID="7753ce68-14e4-4265-a576-2a38a48523e9" containerID="b0603c3afd6076b6a082a2bbbc180eac84b1b05bdbd5d840a669fbedb0627acc" exitCode=0 Jan 22 14:11:26 crc kubenswrapper[4743]: I0122 14:11:26.046632 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ggtr8" event={"ID":"7753ce68-14e4-4265-a576-2a38a48523e9","Type":"ContainerDied","Data":"b0603c3afd6076b6a082a2bbbc180eac84b1b05bdbd5d840a669fbedb0627acc"} Jan 22 14:11:26 crc kubenswrapper[4743]: I0122 14:11:26.097958 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zncx6" Jan 22 14:11:27 crc kubenswrapper[4743]: I0122 14:11:27.059014 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ggtr8" event={"ID":"7753ce68-14e4-4265-a576-2a38a48523e9","Type":"ContainerStarted","Data":"e099762dbdeb8685b765eb55eafd2213e1256dcd1e68b9ab8b7d62445e388b07"} Jan 22 14:11:27 crc kubenswrapper[4743]: I0122 14:11:27.077972 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ggtr8" podStartSLOduration=2.624614963 podStartE2EDuration="4.077954342s" podCreationTimestamp="2026-01-22 14:11:23 +0000 UTC" firstStartedPulling="2026-01-22 14:11:25.035488169 +0000 UTC m=+1521.590531342" lastFinishedPulling="2026-01-22 14:11:26.488827558 +0000 UTC m=+1523.043870721" observedRunningTime="2026-01-22 14:11:27.077114589 +0000 UTC m=+1523.632157752" watchObservedRunningTime="2026-01-22 14:11:27.077954342 +0000 UTC m=+1523.632997505" Jan 22 14:11:27 crc kubenswrapper[4743]: I0122 14:11:27.847076 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zncx6"] Jan 22 14:11:28 crc kubenswrapper[4743]: I0122 14:11:28.068510 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zncx6" podUID="8b0312cc-f517-47e9-a8b7-b7a02929c7f7" containerName="registry-server" containerID="cri-o://e6dcee7f9e12f57a30fe43e511796ffc50f2b7a9959bcb3eb0590281f05b377c" gracePeriod=2 Jan 22 14:11:28 crc kubenswrapper[4743]: I0122 14:11:28.508760 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zncx6" Jan 22 14:11:28 crc kubenswrapper[4743]: I0122 14:11:28.605918 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mf8kw\" (UniqueName: \"kubernetes.io/projected/8b0312cc-f517-47e9-a8b7-b7a02929c7f7-kube-api-access-mf8kw\") pod \"8b0312cc-f517-47e9-a8b7-b7a02929c7f7\" (UID: \"8b0312cc-f517-47e9-a8b7-b7a02929c7f7\") " Jan 22 14:11:28 crc kubenswrapper[4743]: I0122 14:11:28.605960 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b0312cc-f517-47e9-a8b7-b7a02929c7f7-utilities\") pod \"8b0312cc-f517-47e9-a8b7-b7a02929c7f7\" (UID: \"8b0312cc-f517-47e9-a8b7-b7a02929c7f7\") " Jan 22 14:11:28 crc kubenswrapper[4743]: I0122 14:11:28.606284 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b0312cc-f517-47e9-a8b7-b7a02929c7f7-catalog-content\") pod \"8b0312cc-f517-47e9-a8b7-b7a02929c7f7\" (UID: \"8b0312cc-f517-47e9-a8b7-b7a02929c7f7\") " Jan 22 14:11:28 crc kubenswrapper[4743]: I0122 14:11:28.607514 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b0312cc-f517-47e9-a8b7-b7a02929c7f7-utilities" (OuterVolumeSpecName: "utilities") pod "8b0312cc-f517-47e9-a8b7-b7a02929c7f7" (UID: "8b0312cc-f517-47e9-a8b7-b7a02929c7f7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:11:28 crc kubenswrapper[4743]: I0122 14:11:28.617131 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b0312cc-f517-47e9-a8b7-b7a02929c7f7-kube-api-access-mf8kw" (OuterVolumeSpecName: "kube-api-access-mf8kw") pod "8b0312cc-f517-47e9-a8b7-b7a02929c7f7" (UID: "8b0312cc-f517-47e9-a8b7-b7a02929c7f7"). InnerVolumeSpecName "kube-api-access-mf8kw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:11:28 crc kubenswrapper[4743]: I0122 14:11:28.651583 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b0312cc-f517-47e9-a8b7-b7a02929c7f7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8b0312cc-f517-47e9-a8b7-b7a02929c7f7" (UID: "8b0312cc-f517-47e9-a8b7-b7a02929c7f7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:11:28 crc kubenswrapper[4743]: I0122 14:11:28.708962 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b0312cc-f517-47e9-a8b7-b7a02929c7f7-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 14:11:28 crc kubenswrapper[4743]: I0122 14:11:28.709010 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mf8kw\" (UniqueName: \"kubernetes.io/projected/8b0312cc-f517-47e9-a8b7-b7a02929c7f7-kube-api-access-mf8kw\") on node \"crc\" DevicePath \"\"" Jan 22 14:11:28 crc kubenswrapper[4743]: I0122 14:11:28.709025 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b0312cc-f517-47e9-a8b7-b7a02929c7f7-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 14:11:29 crc kubenswrapper[4743]: I0122 14:11:29.081631 4743 generic.go:334] "Generic (PLEG): container finished" podID="8b0312cc-f517-47e9-a8b7-b7a02929c7f7" containerID="e6dcee7f9e12f57a30fe43e511796ffc50f2b7a9959bcb3eb0590281f05b377c" exitCode=0 Jan 22 14:11:29 crc kubenswrapper[4743]: I0122 14:11:29.081693 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zncx6" event={"ID":"8b0312cc-f517-47e9-a8b7-b7a02929c7f7","Type":"ContainerDied","Data":"e6dcee7f9e12f57a30fe43e511796ffc50f2b7a9959bcb3eb0590281f05b377c"} Jan 22 14:11:29 crc kubenswrapper[4743]: I0122 14:11:29.081739 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zncx6" event={"ID":"8b0312cc-f517-47e9-a8b7-b7a02929c7f7","Type":"ContainerDied","Data":"d3acf3edde5b0d2f05b1dde0fd5bb9de6518c7e9bf4da01f69defa437daa20ff"} Jan 22 14:11:29 crc kubenswrapper[4743]: I0122 14:11:29.081733 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zncx6" Jan 22 14:11:29 crc kubenswrapper[4743]: I0122 14:11:29.081757 4743 scope.go:117] "RemoveContainer" containerID="e6dcee7f9e12f57a30fe43e511796ffc50f2b7a9959bcb3eb0590281f05b377c" Jan 22 14:11:29 crc kubenswrapper[4743]: I0122 14:11:29.106344 4743 scope.go:117] "RemoveContainer" containerID="f0369d0244cae0544854e3a5c7df62ab26691bbbd784876ab094337f9ba8dd43" Jan 22 14:11:29 crc kubenswrapper[4743]: I0122 14:11:29.129127 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zncx6"] Jan 22 14:11:29 crc kubenswrapper[4743]: I0122 14:11:29.144819 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zncx6"] Jan 22 14:11:29 crc kubenswrapper[4743]: I0122 14:11:29.157392 4743 scope.go:117] "RemoveContainer" containerID="3b94a88e31beba3bec1d3184da11455f96ebda11aa9e096e00788ae0f1c076a1" Jan 22 14:11:29 crc kubenswrapper[4743]: I0122 14:11:29.184693 4743 scope.go:117] "RemoveContainer" containerID="e6dcee7f9e12f57a30fe43e511796ffc50f2b7a9959bcb3eb0590281f05b377c" Jan 22 14:11:29 crc kubenswrapper[4743]: E0122 14:11:29.185149 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6dcee7f9e12f57a30fe43e511796ffc50f2b7a9959bcb3eb0590281f05b377c\": container with ID starting with e6dcee7f9e12f57a30fe43e511796ffc50f2b7a9959bcb3eb0590281f05b377c not found: ID does not exist" containerID="e6dcee7f9e12f57a30fe43e511796ffc50f2b7a9959bcb3eb0590281f05b377c" Jan 22 14:11:29 crc kubenswrapper[4743]: I0122 14:11:29.185248 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6dcee7f9e12f57a30fe43e511796ffc50f2b7a9959bcb3eb0590281f05b377c"} err="failed to get container status \"e6dcee7f9e12f57a30fe43e511796ffc50f2b7a9959bcb3eb0590281f05b377c\": rpc error: code = NotFound desc = could not find container \"e6dcee7f9e12f57a30fe43e511796ffc50f2b7a9959bcb3eb0590281f05b377c\": container with ID starting with e6dcee7f9e12f57a30fe43e511796ffc50f2b7a9959bcb3eb0590281f05b377c not found: ID does not exist" Jan 22 14:11:29 crc kubenswrapper[4743]: I0122 14:11:29.185343 4743 scope.go:117] "RemoveContainer" containerID="f0369d0244cae0544854e3a5c7df62ab26691bbbd784876ab094337f9ba8dd43" Jan 22 14:11:29 crc kubenswrapper[4743]: E0122 14:11:29.189334 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0369d0244cae0544854e3a5c7df62ab26691bbbd784876ab094337f9ba8dd43\": container with ID starting with f0369d0244cae0544854e3a5c7df62ab26691bbbd784876ab094337f9ba8dd43 not found: ID does not exist" containerID="f0369d0244cae0544854e3a5c7df62ab26691bbbd784876ab094337f9ba8dd43" Jan 22 14:11:29 crc kubenswrapper[4743]: I0122 14:11:29.189419 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0369d0244cae0544854e3a5c7df62ab26691bbbd784876ab094337f9ba8dd43"} err="failed to get container status \"f0369d0244cae0544854e3a5c7df62ab26691bbbd784876ab094337f9ba8dd43\": rpc error: code = NotFound desc = could not find container \"f0369d0244cae0544854e3a5c7df62ab26691bbbd784876ab094337f9ba8dd43\": container with ID starting with f0369d0244cae0544854e3a5c7df62ab26691bbbd784876ab094337f9ba8dd43 not found: ID does not exist" Jan 22 14:11:29 crc kubenswrapper[4743]: I0122 14:11:29.189460 4743 scope.go:117] "RemoveContainer" 
containerID="3b94a88e31beba3bec1d3184da11455f96ebda11aa9e096e00788ae0f1c076a1" Jan 22 14:11:29 crc kubenswrapper[4743]: E0122 14:11:29.190089 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b94a88e31beba3bec1d3184da11455f96ebda11aa9e096e00788ae0f1c076a1\": container with ID starting with 3b94a88e31beba3bec1d3184da11455f96ebda11aa9e096e00788ae0f1c076a1 not found: ID does not exist" containerID="3b94a88e31beba3bec1d3184da11455f96ebda11aa9e096e00788ae0f1c076a1" Jan 22 14:11:29 crc kubenswrapper[4743]: I0122 14:11:29.190201 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b94a88e31beba3bec1d3184da11455f96ebda11aa9e096e00788ae0f1c076a1"} err="failed to get container status \"3b94a88e31beba3bec1d3184da11455f96ebda11aa9e096e00788ae0f1c076a1\": rpc error: code = NotFound desc = could not find container \"3b94a88e31beba3bec1d3184da11455f96ebda11aa9e096e00788ae0f1c076a1\": container with ID starting with 3b94a88e31beba3bec1d3184da11455f96ebda11aa9e096e00788ae0f1c076a1 not found: ID does not exist" Jan 22 14:11:29 crc kubenswrapper[4743]: I0122 14:11:29.759145 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b0312cc-f517-47e9-a8b7-b7a02929c7f7" path="/var/lib/kubelet/pods/8b0312cc-f517-47e9-a8b7-b7a02929c7f7/volumes" Jan 22 14:11:32 crc kubenswrapper[4743]: I0122 14:11:32.856374 4743 scope.go:117] "RemoveContainer" containerID="10a1272b52eb776562992f9eb0adb59661b4af7634fab61a0530e84987e9ef64" Jan 22 14:11:32 crc kubenswrapper[4743]: I0122 14:11:32.888975 4743 scope.go:117] "RemoveContainer" containerID="1f32d65f2422ae8296d9d22400c8ea0dfa0b052566f6b0c341c7ae5d0920ad7c" Jan 22 14:11:32 crc kubenswrapper[4743]: I0122 14:11:32.916817 4743 scope.go:117] "RemoveContainer" containerID="078ac27394f9fff68513124d88e902e9727547466628960c8cdc35f2212382bb" Jan 22 14:11:32 crc kubenswrapper[4743]: I0122 14:11:32.946053 4743 scope.go:117] "RemoveContainer" containerID="713af1b4ec230c16cdb3eddd4bfc35785c2e3378fb5fcaeea472eaec4659461f" Jan 22 14:11:34 crc kubenswrapper[4743]: I0122 14:11:34.205651 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ggtr8" Jan 22 14:11:34 crc kubenswrapper[4743]: I0122 14:11:34.206037 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ggtr8" Jan 22 14:11:34 crc kubenswrapper[4743]: I0122 14:11:34.279860 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ggtr8" Jan 22 14:11:35 crc kubenswrapper[4743]: I0122 14:11:35.192096 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ggtr8" Jan 22 14:11:35 crc kubenswrapper[4743]: I0122 14:11:35.243324 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ggtr8"] Jan 22 14:11:36 crc kubenswrapper[4743]: I0122 14:11:36.747729 4743 scope.go:117] "RemoveContainer" containerID="1e51a1e4567f044d1ba62e15ca7baaab6948208e9d675c95afad01e5e442ca14" Jan 22 14:11:36 crc kubenswrapper[4743]: E0122 14:11:36.748026 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:11:37 crc kubenswrapper[4743]: I0122 14:11:37.164690 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ggtr8" podUID="7753ce68-14e4-4265-a576-2a38a48523e9" containerName="registry-server" containerID="cri-o://e099762dbdeb8685b765eb55eafd2213e1256dcd1e68b9ab8b7d62445e388b07" gracePeriod=2 Jan 22 14:11:37 crc kubenswrapper[4743]: I0122 14:11:37.618993 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ggtr8" Jan 22 14:11:37 crc kubenswrapper[4743]: I0122 14:11:37.681187 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7753ce68-14e4-4265-a576-2a38a48523e9-utilities\") pod \"7753ce68-14e4-4265-a576-2a38a48523e9\" (UID: \"7753ce68-14e4-4265-a576-2a38a48523e9\") " Jan 22 14:11:37 crc kubenswrapper[4743]: I0122 14:11:37.681260 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7753ce68-14e4-4265-a576-2a38a48523e9-catalog-content\") pod \"7753ce68-14e4-4265-a576-2a38a48523e9\" (UID: \"7753ce68-14e4-4265-a576-2a38a48523e9\") " Jan 22 14:11:37 crc kubenswrapper[4743]: I0122 14:11:37.681422 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sh26j\" (UniqueName: \"kubernetes.io/projected/7753ce68-14e4-4265-a576-2a38a48523e9-kube-api-access-sh26j\") pod \"7753ce68-14e4-4265-a576-2a38a48523e9\" (UID: \"7753ce68-14e4-4265-a576-2a38a48523e9\") " Jan 22 14:11:37 crc kubenswrapper[4743]: I0122 14:11:37.682496 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7753ce68-14e4-4265-a576-2a38a48523e9-utilities" (OuterVolumeSpecName: "utilities") pod "7753ce68-14e4-4265-a576-2a38a48523e9" (UID: "7753ce68-14e4-4265-a576-2a38a48523e9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:11:37 crc kubenswrapper[4743]: I0122 14:11:37.687303 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7753ce68-14e4-4265-a576-2a38a48523e9-kube-api-access-sh26j" (OuterVolumeSpecName: "kube-api-access-sh26j") pod "7753ce68-14e4-4265-a576-2a38a48523e9" (UID: "7753ce68-14e4-4265-a576-2a38a48523e9"). InnerVolumeSpecName "kube-api-access-sh26j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:11:37 crc kubenswrapper[4743]: I0122 14:11:37.708279 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7753ce68-14e4-4265-a576-2a38a48523e9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7753ce68-14e4-4265-a576-2a38a48523e9" (UID: "7753ce68-14e4-4265-a576-2a38a48523e9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:11:37 crc kubenswrapper[4743]: I0122 14:11:37.783222 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sh26j\" (UniqueName: \"kubernetes.io/projected/7753ce68-14e4-4265-a576-2a38a48523e9-kube-api-access-sh26j\") on node \"crc\" DevicePath \"\"" Jan 22 14:11:37 crc kubenswrapper[4743]: I0122 14:11:37.783258 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7753ce68-14e4-4265-a576-2a38a48523e9-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 14:11:37 crc kubenswrapper[4743]: I0122 14:11:37.783267 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7753ce68-14e4-4265-a576-2a38a48523e9-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 14:11:38 crc kubenswrapper[4743]: I0122 14:11:38.178625 4743 generic.go:334] "Generic (PLEG): container finished" podID="7753ce68-14e4-4265-a576-2a38a48523e9" containerID="e099762dbdeb8685b765eb55eafd2213e1256dcd1e68b9ab8b7d62445e388b07" exitCode=0 Jan 22 14:11:38 crc kubenswrapper[4743]: I0122 14:11:38.178743 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ggtr8" Jan 22 14:11:38 crc kubenswrapper[4743]: I0122 14:11:38.178761 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ggtr8" event={"ID":"7753ce68-14e4-4265-a576-2a38a48523e9","Type":"ContainerDied","Data":"e099762dbdeb8685b765eb55eafd2213e1256dcd1e68b9ab8b7d62445e388b07"} Jan 22 14:11:38 crc kubenswrapper[4743]: I0122 14:11:38.178844 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ggtr8" event={"ID":"7753ce68-14e4-4265-a576-2a38a48523e9","Type":"ContainerDied","Data":"4177b986c50c54637e4982504401b09ea94943e5672f3ce235d0e0d86064f78d"} Jan 22 14:11:38 crc kubenswrapper[4743]: I0122 14:11:38.178878 4743 scope.go:117] "RemoveContainer" containerID="e099762dbdeb8685b765eb55eafd2213e1256dcd1e68b9ab8b7d62445e388b07" Jan 22 14:11:38 crc kubenswrapper[4743]: I0122 14:11:38.216149 4743 scope.go:117] "RemoveContainer" containerID="b0603c3afd6076b6a082a2bbbc180eac84b1b05bdbd5d840a669fbedb0627acc" Jan 22 14:11:38 crc kubenswrapper[4743]: I0122 14:11:38.231564 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ggtr8"] Jan 22 14:11:38 crc kubenswrapper[4743]: I0122 14:11:38.241682 4743 scope.go:117] "RemoveContainer" containerID="62cc01f78cb588ec0f859b93bbbdacbc886daf26a8eeffdd738380e8eb9208dc" Jan 22 14:11:38 crc kubenswrapper[4743]: I0122 14:11:38.246161 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ggtr8"] Jan 22 14:11:38 crc kubenswrapper[4743]: I0122 14:11:38.309569 4743 scope.go:117] "RemoveContainer" containerID="e099762dbdeb8685b765eb55eafd2213e1256dcd1e68b9ab8b7d62445e388b07" Jan 22 14:11:38 crc kubenswrapper[4743]: E0122 14:11:38.313481 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e099762dbdeb8685b765eb55eafd2213e1256dcd1e68b9ab8b7d62445e388b07\": container with ID starting with e099762dbdeb8685b765eb55eafd2213e1256dcd1e68b9ab8b7d62445e388b07 not found: ID does not exist" containerID="e099762dbdeb8685b765eb55eafd2213e1256dcd1e68b9ab8b7d62445e388b07" Jan 22 14:11:38 crc kubenswrapper[4743]: I0122 14:11:38.313544 4743 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e099762dbdeb8685b765eb55eafd2213e1256dcd1e68b9ab8b7d62445e388b07"} err="failed to get container status \"e099762dbdeb8685b765eb55eafd2213e1256dcd1e68b9ab8b7d62445e388b07\": rpc error: code = NotFound desc = could not find container \"e099762dbdeb8685b765eb55eafd2213e1256dcd1e68b9ab8b7d62445e388b07\": container with ID starting with e099762dbdeb8685b765eb55eafd2213e1256dcd1e68b9ab8b7d62445e388b07 not found: ID does not exist" Jan 22 14:11:38 crc kubenswrapper[4743]: I0122 14:11:38.313585 4743 scope.go:117] "RemoveContainer" containerID="b0603c3afd6076b6a082a2bbbc180eac84b1b05bdbd5d840a669fbedb0627acc" Jan 22 14:11:38 crc kubenswrapper[4743]: E0122 14:11:38.314293 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0603c3afd6076b6a082a2bbbc180eac84b1b05bdbd5d840a669fbedb0627acc\": container with ID starting with b0603c3afd6076b6a082a2bbbc180eac84b1b05bdbd5d840a669fbedb0627acc not found: ID does not exist" containerID="b0603c3afd6076b6a082a2bbbc180eac84b1b05bdbd5d840a669fbedb0627acc" Jan 22 14:11:38 crc kubenswrapper[4743]: I0122 14:11:38.314340 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0603c3afd6076b6a082a2bbbc180eac84b1b05bdbd5d840a669fbedb0627acc"} err="failed to get container status \"b0603c3afd6076b6a082a2bbbc180eac84b1b05bdbd5d840a669fbedb0627acc\": rpc error: code = NotFound desc = could not find container \"b0603c3afd6076b6a082a2bbbc180eac84b1b05bdbd5d840a669fbedb0627acc\": container with ID starting with b0603c3afd6076b6a082a2bbbc180eac84b1b05bdbd5d840a669fbedb0627acc not found: ID does not exist" Jan 22 14:11:38 crc kubenswrapper[4743]: I0122 14:11:38.314371 4743 scope.go:117] "RemoveContainer" containerID="62cc01f78cb588ec0f859b93bbbdacbc886daf26a8eeffdd738380e8eb9208dc" Jan 22 14:11:38 crc kubenswrapper[4743]: E0122 14:11:38.314727 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62cc01f78cb588ec0f859b93bbbdacbc886daf26a8eeffdd738380e8eb9208dc\": container with ID starting with 62cc01f78cb588ec0f859b93bbbdacbc886daf26a8eeffdd738380e8eb9208dc not found: ID does not exist" containerID="62cc01f78cb588ec0f859b93bbbdacbc886daf26a8eeffdd738380e8eb9208dc" Jan 22 14:11:38 crc kubenswrapper[4743]: I0122 14:11:38.314774 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62cc01f78cb588ec0f859b93bbbdacbc886daf26a8eeffdd738380e8eb9208dc"} err="failed to get container status \"62cc01f78cb588ec0f859b93bbbdacbc886daf26a8eeffdd738380e8eb9208dc\": rpc error: code = NotFound desc = could not find container \"62cc01f78cb588ec0f859b93bbbdacbc886daf26a8eeffdd738380e8eb9208dc\": container with ID starting with 62cc01f78cb588ec0f859b93bbbdacbc886daf26a8eeffdd738380e8eb9208dc not found: ID does not exist" Jan 22 14:11:39 crc kubenswrapper[4743]: I0122 14:11:39.761975 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7753ce68-14e4-4265-a576-2a38a48523e9" path="/var/lib/kubelet/pods/7753ce68-14e4-4265-a576-2a38a48523e9/volumes" Jan 22 14:11:50 crc kubenswrapper[4743]: I0122 14:11:50.747988 4743 scope.go:117] "RemoveContainer" containerID="1e51a1e4567f044d1ba62e15ca7baaab6948208e9d675c95afad01e5e442ca14" Jan 22 14:11:50 crc kubenswrapper[4743]: E0122 14:11:50.748843 4743 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:12:03 crc kubenswrapper[4743]: I0122 14:12:03.760114 4743 scope.go:117] "RemoveContainer" containerID="1e51a1e4567f044d1ba62e15ca7baaab6948208e9d675c95afad01e5e442ca14" Jan 22 14:12:03 crc kubenswrapper[4743]: E0122 14:12:03.761635 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:12:17 crc kubenswrapper[4743]: I0122 14:12:17.751303 4743 scope.go:117] "RemoveContainer" containerID="1e51a1e4567f044d1ba62e15ca7baaab6948208e9d675c95afad01e5e442ca14" Jan 22 14:12:17 crc kubenswrapper[4743]: E0122 14:12:17.751880 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:12:28 crc kubenswrapper[4743]: I0122 14:12:28.748312 4743 scope.go:117] "RemoveContainer" containerID="1e51a1e4567f044d1ba62e15ca7baaab6948208e9d675c95afad01e5e442ca14" Jan 22 14:12:28 crc kubenswrapper[4743]: E0122 14:12:28.749807 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:12:33 crc kubenswrapper[4743]: I0122 14:12:33.023401 4743 scope.go:117] "RemoveContainer" containerID="a55b1230b3baf65fa88e2160e58c0c8dc21d88aeeb22d648c9929e5e6fe548ca" Jan 22 14:12:39 crc kubenswrapper[4743]: I0122 14:12:39.747996 4743 scope.go:117] "RemoveContainer" containerID="1e51a1e4567f044d1ba62e15ca7baaab6948208e9d675c95afad01e5e442ca14" Jan 22 14:12:39 crc kubenswrapper[4743]: E0122 14:12:39.748862 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:12:43 crc kubenswrapper[4743]: I0122 14:12:43.817771 4743 generic.go:334] "Generic (PLEG): container finished" podID="33d8b498-a76a-4549-96c2-f32877beaa30" containerID="a0901979f36fec8783443941bfbb6da623c1616ee3524e2e567d7bf07b60a997" exitCode=0 Jan 22 14:12:43 crc kubenswrapper[4743]: I0122 
14:12:43.817899 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v4vc6" event={"ID":"33d8b498-a76a-4549-96c2-f32877beaa30","Type":"ContainerDied","Data":"a0901979f36fec8783443941bfbb6da623c1616ee3524e2e567d7bf07b60a997"} Jan 22 14:12:45 crc kubenswrapper[4743]: I0122 14:12:45.247696 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v4vc6" Jan 22 14:12:45 crc kubenswrapper[4743]: I0122 14:12:45.351101 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/33d8b498-a76a-4549-96c2-f32877beaa30-ssh-key-openstack-edpm-ipam\") pod \"33d8b498-a76a-4549-96c2-f32877beaa30\" (UID: \"33d8b498-a76a-4549-96c2-f32877beaa30\") " Jan 22 14:12:45 crc kubenswrapper[4743]: I0122 14:12:45.351411 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33d8b498-a76a-4549-96c2-f32877beaa30-bootstrap-combined-ca-bundle\") pod \"33d8b498-a76a-4549-96c2-f32877beaa30\" (UID: \"33d8b498-a76a-4549-96c2-f32877beaa30\") " Jan 22 14:12:45 crc kubenswrapper[4743]: I0122 14:12:45.351491 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9r48c\" (UniqueName: \"kubernetes.io/projected/33d8b498-a76a-4549-96c2-f32877beaa30-kube-api-access-9r48c\") pod \"33d8b498-a76a-4549-96c2-f32877beaa30\" (UID: \"33d8b498-a76a-4549-96c2-f32877beaa30\") " Jan 22 14:12:45 crc kubenswrapper[4743]: I0122 14:12:45.351572 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/33d8b498-a76a-4549-96c2-f32877beaa30-inventory\") pod \"33d8b498-a76a-4549-96c2-f32877beaa30\" (UID: \"33d8b498-a76a-4549-96c2-f32877beaa30\") " Jan 22 14:12:45 crc kubenswrapper[4743]: I0122 14:12:45.356675 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33d8b498-a76a-4549-96c2-f32877beaa30-kube-api-access-9r48c" (OuterVolumeSpecName: "kube-api-access-9r48c") pod "33d8b498-a76a-4549-96c2-f32877beaa30" (UID: "33d8b498-a76a-4549-96c2-f32877beaa30"). InnerVolumeSpecName "kube-api-access-9r48c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:12:45 crc kubenswrapper[4743]: I0122 14:12:45.362974 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33d8b498-a76a-4549-96c2-f32877beaa30-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "33d8b498-a76a-4549-96c2-f32877beaa30" (UID: "33d8b498-a76a-4549-96c2-f32877beaa30"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:12:45 crc kubenswrapper[4743]: I0122 14:12:45.383967 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33d8b498-a76a-4549-96c2-f32877beaa30-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "33d8b498-a76a-4549-96c2-f32877beaa30" (UID: "33d8b498-a76a-4549-96c2-f32877beaa30"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:12:45 crc kubenswrapper[4743]: I0122 14:12:45.384313 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33d8b498-a76a-4549-96c2-f32877beaa30-inventory" (OuterVolumeSpecName: "inventory") pod "33d8b498-a76a-4549-96c2-f32877beaa30" (UID: "33d8b498-a76a-4549-96c2-f32877beaa30"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:12:45 crc kubenswrapper[4743]: I0122 14:12:45.453887 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/33d8b498-a76a-4549-96c2-f32877beaa30-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 22 14:12:45 crc kubenswrapper[4743]: I0122 14:12:45.453921 4743 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33d8b498-a76a-4549-96c2-f32877beaa30-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 14:12:45 crc kubenswrapper[4743]: I0122 14:12:45.453931 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9r48c\" (UniqueName: \"kubernetes.io/projected/33d8b498-a76a-4549-96c2-f32877beaa30-kube-api-access-9r48c\") on node \"crc\" DevicePath \"\"" Jan 22 14:12:45 crc kubenswrapper[4743]: I0122 14:12:45.453940 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/33d8b498-a76a-4549-96c2-f32877beaa30-inventory\") on node \"crc\" DevicePath \"\"" Jan 22 14:12:45 crc kubenswrapper[4743]: I0122 14:12:45.857703 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v4vc6" event={"ID":"33d8b498-a76a-4549-96c2-f32877beaa30","Type":"ContainerDied","Data":"cc6cbbb9921f87444a665946bf597566b9b6542c23f8485914ee74a97a2883be"} Jan 22 14:12:45 crc kubenswrapper[4743]: I0122 14:12:45.857747 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc6cbbb9921f87444a665946bf597566b9b6542c23f8485914ee74a97a2883be" Jan 22 14:12:45 crc kubenswrapper[4743]: I0122 14:12:45.858118 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v4vc6" Jan 22 14:12:45 crc kubenswrapper[4743]: I0122 14:12:45.930181 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-w8zbz"] Jan 22 14:12:45 crc kubenswrapper[4743]: E0122 14:12:45.930821 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7753ce68-14e4-4265-a576-2a38a48523e9" containerName="registry-server" Jan 22 14:12:45 crc kubenswrapper[4743]: I0122 14:12:45.930842 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="7753ce68-14e4-4265-a576-2a38a48523e9" containerName="registry-server" Jan 22 14:12:45 crc kubenswrapper[4743]: E0122 14:12:45.930858 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b0312cc-f517-47e9-a8b7-b7a02929c7f7" containerName="extract-content" Jan 22 14:12:45 crc kubenswrapper[4743]: I0122 14:12:45.930865 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b0312cc-f517-47e9-a8b7-b7a02929c7f7" containerName="extract-content" Jan 22 14:12:45 crc kubenswrapper[4743]: E0122 14:12:45.930880 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33d8b498-a76a-4549-96c2-f32877beaa30" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jan 22 14:12:45 crc kubenswrapper[4743]: I0122 14:12:45.930886 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="33d8b498-a76a-4549-96c2-f32877beaa30" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jan 22 14:12:45 crc kubenswrapper[4743]: E0122 14:12:45.930910 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7753ce68-14e4-4265-a576-2a38a48523e9" containerName="extract-content" Jan 22 14:12:45 crc kubenswrapper[4743]: I0122 14:12:45.930916 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="7753ce68-14e4-4265-a576-2a38a48523e9" containerName="extract-content" Jan 22 14:12:45 crc kubenswrapper[4743]: E0122 14:12:45.930931 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b0312cc-f517-47e9-a8b7-b7a02929c7f7" containerName="registry-server" Jan 22 14:12:45 crc kubenswrapper[4743]: I0122 14:12:45.930938 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b0312cc-f517-47e9-a8b7-b7a02929c7f7" containerName="registry-server" Jan 22 14:12:45 crc kubenswrapper[4743]: E0122 14:12:45.930948 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b0312cc-f517-47e9-a8b7-b7a02929c7f7" containerName="extract-utilities" Jan 22 14:12:45 crc kubenswrapper[4743]: I0122 14:12:45.930953 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b0312cc-f517-47e9-a8b7-b7a02929c7f7" containerName="extract-utilities" Jan 22 14:12:45 crc kubenswrapper[4743]: E0122 14:12:45.930967 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7753ce68-14e4-4265-a576-2a38a48523e9" containerName="extract-utilities" Jan 22 14:12:45 crc kubenswrapper[4743]: I0122 14:12:45.930974 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="7753ce68-14e4-4265-a576-2a38a48523e9" containerName="extract-utilities" Jan 22 14:12:45 crc kubenswrapper[4743]: I0122 14:12:45.931136 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="33d8b498-a76a-4549-96c2-f32877beaa30" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jan 22 14:12:45 crc kubenswrapper[4743]: I0122 14:12:45.931147 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="7753ce68-14e4-4265-a576-2a38a48523e9" 
containerName="registry-server" Jan 22 14:12:45 crc kubenswrapper[4743]: I0122 14:12:45.931157 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b0312cc-f517-47e9-a8b7-b7a02929c7f7" containerName="registry-server" Jan 22 14:12:45 crc kubenswrapper[4743]: I0122 14:12:45.931731 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-w8zbz" Jan 22 14:12:45 crc kubenswrapper[4743]: I0122 14:12:45.938575 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 22 14:12:45 crc kubenswrapper[4743]: I0122 14:12:45.939544 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 22 14:12:45 crc kubenswrapper[4743]: I0122 14:12:45.939833 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8mcmm" Jan 22 14:12:45 crc kubenswrapper[4743]: I0122 14:12:45.941248 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 22 14:12:45 crc kubenswrapper[4743]: I0122 14:12:45.957441 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-w8zbz"] Jan 22 14:12:46 crc kubenswrapper[4743]: I0122 14:12:46.065860 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6772da1b-97c0-4b18-af50-7723f5dc39b6-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-w8zbz\" (UID: \"6772da1b-97c0-4b18-af50-7723f5dc39b6\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-w8zbz" Jan 22 14:12:46 crc kubenswrapper[4743]: I0122 14:12:46.065920 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9p6x4\" (UniqueName: \"kubernetes.io/projected/6772da1b-97c0-4b18-af50-7723f5dc39b6-kube-api-access-9p6x4\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-w8zbz\" (UID: \"6772da1b-97c0-4b18-af50-7723f5dc39b6\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-w8zbz" Jan 22 14:12:46 crc kubenswrapper[4743]: I0122 14:12:46.065976 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6772da1b-97c0-4b18-af50-7723f5dc39b6-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-w8zbz\" (UID: \"6772da1b-97c0-4b18-af50-7723f5dc39b6\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-w8zbz" Jan 22 14:12:46 crc kubenswrapper[4743]: I0122 14:12:46.167557 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6772da1b-97c0-4b18-af50-7723f5dc39b6-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-w8zbz\" (UID: \"6772da1b-97c0-4b18-af50-7723f5dc39b6\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-w8zbz" Jan 22 14:12:46 crc kubenswrapper[4743]: I0122 14:12:46.167662 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9p6x4\" (UniqueName: \"kubernetes.io/projected/6772da1b-97c0-4b18-af50-7723f5dc39b6-kube-api-access-9p6x4\") pod 
\"download-cache-edpm-deployment-openstack-edpm-ipam-w8zbz\" (UID: \"6772da1b-97c0-4b18-af50-7723f5dc39b6\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-w8zbz" Jan 22 14:12:46 crc kubenswrapper[4743]: I0122 14:12:46.167741 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6772da1b-97c0-4b18-af50-7723f5dc39b6-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-w8zbz\" (UID: \"6772da1b-97c0-4b18-af50-7723f5dc39b6\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-w8zbz" Jan 22 14:12:46 crc kubenswrapper[4743]: I0122 14:12:46.171758 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6772da1b-97c0-4b18-af50-7723f5dc39b6-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-w8zbz\" (UID: \"6772da1b-97c0-4b18-af50-7723f5dc39b6\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-w8zbz" Jan 22 14:12:46 crc kubenswrapper[4743]: I0122 14:12:46.184205 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6772da1b-97c0-4b18-af50-7723f5dc39b6-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-w8zbz\" (UID: \"6772da1b-97c0-4b18-af50-7723f5dc39b6\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-w8zbz" Jan 22 14:12:46 crc kubenswrapper[4743]: I0122 14:12:46.184718 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9p6x4\" (UniqueName: \"kubernetes.io/projected/6772da1b-97c0-4b18-af50-7723f5dc39b6-kube-api-access-9p6x4\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-w8zbz\" (UID: \"6772da1b-97c0-4b18-af50-7723f5dc39b6\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-w8zbz" Jan 22 14:12:46 crc kubenswrapper[4743]: I0122 14:12:46.266691 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-w8zbz" Jan 22 14:12:46 crc kubenswrapper[4743]: I0122 14:12:46.762389 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 22 14:12:46 crc kubenswrapper[4743]: I0122 14:12:46.768434 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-w8zbz"] Jan 22 14:12:46 crc kubenswrapper[4743]: I0122 14:12:46.868950 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-w8zbz" event={"ID":"6772da1b-97c0-4b18-af50-7723f5dc39b6","Type":"ContainerStarted","Data":"0ca9e2e3ab6d61d6cb4397a499b8c96954929a35a9c10d30aadc7d83fe154f16"} Jan 22 14:12:47 crc kubenswrapper[4743]: I0122 14:12:47.879352 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-w8zbz" event={"ID":"6772da1b-97c0-4b18-af50-7723f5dc39b6","Type":"ContainerStarted","Data":"c4e5cfac2878edabf246c77778a7511c939a78b730cdf73d8ab044e1779cf84e"} Jan 22 14:12:47 crc kubenswrapper[4743]: I0122 14:12:47.903902 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-w8zbz" podStartSLOduration=2.51404196 podStartE2EDuration="2.903873884s" podCreationTimestamp="2026-01-22 14:12:45 +0000 UTC" firstStartedPulling="2026-01-22 14:12:46.761810558 +0000 UTC m=+1603.316853721" lastFinishedPulling="2026-01-22 14:12:47.151642482 +0000 UTC m=+1603.706685645" observedRunningTime="2026-01-22 14:12:47.897735439 +0000 UTC m=+1604.452778612" watchObservedRunningTime="2026-01-22 14:12:47.903873884 +0000 UTC m=+1604.458917067" Jan 22 14:12:52 crc kubenswrapper[4743]: I0122 14:12:52.747413 4743 scope.go:117] "RemoveContainer" containerID="1e51a1e4567f044d1ba62e15ca7baaab6948208e9d675c95afad01e5e442ca14" Jan 22 14:12:52 crc kubenswrapper[4743]: E0122 14:12:52.748125 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:13:04 crc kubenswrapper[4743]: I0122 14:13:04.747751 4743 scope.go:117] "RemoveContainer" containerID="1e51a1e4567f044d1ba62e15ca7baaab6948208e9d675c95afad01e5e442ca14" Jan 22 14:13:04 crc kubenswrapper[4743]: E0122 14:13:04.748568 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:13:16 crc kubenswrapper[4743]: I0122 14:13:16.036681 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-xwzhs"] Jan 22 14:13:16 crc kubenswrapper[4743]: I0122 14:13:16.045538 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-xwzhs"] Jan 22 14:13:17 crc kubenswrapper[4743]: I0122 14:13:17.757475 4743 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="e41c39cc-ec68-49e7-8144-d58dbccf371b" path="/var/lib/kubelet/pods/e41c39cc-ec68-49e7-8144-d58dbccf371b/volumes" Jan 22 14:13:18 crc kubenswrapper[4743]: I0122 14:13:18.035271 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-0148-account-create-update-wmj9x"] Jan 22 14:13:18 crc kubenswrapper[4743]: I0122 14:13:18.048159 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-8dx2h"] Jan 22 14:13:18 crc kubenswrapper[4743]: I0122 14:13:18.061079 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-7c57-account-create-update-q694h"] Jan 22 14:13:18 crc kubenswrapper[4743]: I0122 14:13:18.072928 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-0148-account-create-update-wmj9x"] Jan 22 14:13:18 crc kubenswrapper[4743]: I0122 14:13:18.081203 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-8dx2h"] Jan 22 14:13:18 crc kubenswrapper[4743]: I0122 14:13:18.089238 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-7c57-account-create-update-q694h"] Jan 22 14:13:19 crc kubenswrapper[4743]: I0122 14:13:19.748470 4743 scope.go:117] "RemoveContainer" containerID="1e51a1e4567f044d1ba62e15ca7baaab6948208e9d675c95afad01e5e442ca14" Jan 22 14:13:19 crc kubenswrapper[4743]: E0122 14:13:19.749074 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:13:19 crc kubenswrapper[4743]: I0122 14:13:19.764572 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a3713e3-dd7a-4209-bda1-ce7bcb652e1a" path="/var/lib/kubelet/pods/4a3713e3-dd7a-4209-bda1-ce7bcb652e1a/volumes" Jan 22 14:13:19 crc kubenswrapper[4743]: I0122 14:13:19.765302 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf7689c6-7604-44cf-86aa-a317e32537e3" path="/var/lib/kubelet/pods/bf7689c6-7604-44cf-86aa-a317e32537e3/volumes" Jan 22 14:13:19 crc kubenswrapper[4743]: I0122 14:13:19.766049 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4db18a3-97d3-4f11-b1c3-f10626ae1fea" path="/var/lib/kubelet/pods/d4db18a3-97d3-4f11-b1c3-f10626ae1fea/volumes" Jan 22 14:13:23 crc kubenswrapper[4743]: I0122 14:13:23.036468 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-pls6v"] Jan 22 14:13:23 crc kubenswrapper[4743]: I0122 14:13:23.046824 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-8xtkr"] Jan 22 14:13:23 crc kubenswrapper[4743]: I0122 14:13:23.057005 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-pls6v"] Jan 22 14:13:23 crc kubenswrapper[4743]: I0122 14:13:23.065835 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-5zstz"] Jan 22 14:13:23 crc kubenswrapper[4743]: I0122 14:13:23.075026 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-a292-account-create-update-zfn7t"] Jan 22 14:13:23 crc kubenswrapper[4743]: I0122 14:13:23.084447 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/cinder-db-create-8xtkr"] Jan 22 14:13:23 crc kubenswrapper[4743]: I0122 14:13:23.096604 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-a292-account-create-update-zfn7t"] Jan 22 14:13:23 crc kubenswrapper[4743]: I0122 14:13:23.104249 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-5zstz"] Jan 22 14:13:23 crc kubenswrapper[4743]: I0122 14:13:23.112005 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-dbz7j"] Jan 22 14:13:23 crc kubenswrapper[4743]: I0122 14:13:23.120063 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-dbz7j"] Jan 22 14:13:23 crc kubenswrapper[4743]: I0122 14:13:23.759503 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="668500b3-5335-4fd9-992b-0b0111284379" path="/var/lib/kubelet/pods/668500b3-5335-4fd9-992b-0b0111284379/volumes" Jan 22 14:13:23 crc kubenswrapper[4743]: I0122 14:13:23.760748 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94d531c9-e092-442a-8c4b-044fcf12ac9e" path="/var/lib/kubelet/pods/94d531c9-e092-442a-8c4b-044fcf12ac9e/volumes" Jan 22 14:13:23 crc kubenswrapper[4743]: I0122 14:13:23.761538 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0cba5b6-ee0a-46d1-8a11-d3d841aa820c" path="/var/lib/kubelet/pods/a0cba5b6-ee0a-46d1-8a11-d3d841aa820c/volumes" Jan 22 14:13:23 crc kubenswrapper[4743]: I0122 14:13:23.762370 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c806e874-b77d-4da6-8608-cbfac8bde50a" path="/var/lib/kubelet/pods/c806e874-b77d-4da6-8608-cbfac8bde50a/volumes" Jan 22 14:13:23 crc kubenswrapper[4743]: I0122 14:13:23.763843 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2f9f2d7-4dc3-45e2-98a0-1058cb22cf46" path="/var/lib/kubelet/pods/e2f9f2d7-4dc3-45e2-98a0-1058cb22cf46/volumes" Jan 22 14:13:24 crc kubenswrapper[4743]: I0122 14:13:24.029581 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-1f94-account-create-update-2br2j"] Jan 22 14:13:24 crc kubenswrapper[4743]: I0122 14:13:24.037438 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-12cb-account-create-update-c67vm"] Jan 22 14:13:24 crc kubenswrapper[4743]: I0122 14:13:24.045468 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-12cb-account-create-update-c67vm"] Jan 22 14:13:24 crc kubenswrapper[4743]: I0122 14:13:24.053505 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-1f94-account-create-update-2br2j"] Jan 22 14:13:25 crc kubenswrapper[4743]: I0122 14:13:25.036701 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-e41f-account-create-update-4rrgn"] Jan 22 14:13:25 crc kubenswrapper[4743]: I0122 14:13:25.048002 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-27mjs"] Jan 22 14:13:25 crc kubenswrapper[4743]: I0122 14:13:25.058989 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-e41f-account-create-update-4rrgn"] Jan 22 14:13:25 crc kubenswrapper[4743]: I0122 14:13:25.069226 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-27mjs"] Jan 22 14:13:25 crc kubenswrapper[4743]: I0122 14:13:25.760470 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43283afa-8819-4f03-90e3-d4aa575dec5a" 
path="/var/lib/kubelet/pods/43283afa-8819-4f03-90e3-d4aa575dec5a/volumes" Jan 22 14:13:25 crc kubenswrapper[4743]: I0122 14:13:25.761446 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a0a522c-1c7c-4b39-94bf-d741ea969082" path="/var/lib/kubelet/pods/7a0a522c-1c7c-4b39-94bf-d741ea969082/volumes" Jan 22 14:13:25 crc kubenswrapper[4743]: I0122 14:13:25.762255 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1d5430b-aee6-4ecd-a1fa-0f487a63b1bd" path="/var/lib/kubelet/pods/a1d5430b-aee6-4ecd-a1fa-0f487a63b1bd/volumes" Jan 22 14:13:25 crc kubenswrapper[4743]: I0122 14:13:25.763244 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd89c7d8-3ba1-4fc1-95cd-0b0faffa30a9" path="/var/lib/kubelet/pods/dd89c7d8-3ba1-4fc1-95cd-0b0faffa30a9/volumes" Jan 22 14:13:33 crc kubenswrapper[4743]: I0122 14:13:33.112435 4743 scope.go:117] "RemoveContainer" containerID="dc09aecb4a3e842279a44df7d6af19e634645bb52573663bcc945e520a390975" Jan 22 14:13:33 crc kubenswrapper[4743]: I0122 14:13:33.149686 4743 scope.go:117] "RemoveContainer" containerID="75dd059d200de49e276d1e6a2c754dcbc258ee37a2b0c93be0fd84aaa798fcf0" Jan 22 14:13:33 crc kubenswrapper[4743]: I0122 14:13:33.182808 4743 scope.go:117] "RemoveContainer" containerID="679fa311004f4d8dad216b47619656bc6856f50316dd8b4c3a1c27880d35db9b" Jan 22 14:13:33 crc kubenswrapper[4743]: I0122 14:13:33.221264 4743 scope.go:117] "RemoveContainer" containerID="fd7487ae1682bc8dc72c180745cd112dcf07dd8bb38751a1a686f56520f7e7cb" Jan 22 14:13:33 crc kubenswrapper[4743]: I0122 14:13:33.262806 4743 scope.go:117] "RemoveContainer" containerID="731fbfa7cd91f9bb40b16b67aabb061a127fac764176c44112eefde91d0003e5" Jan 22 14:13:33 crc kubenswrapper[4743]: I0122 14:13:33.297810 4743 scope.go:117] "RemoveContainer" containerID="137f04123729de75737d5e24287b51fde6e757bda9ebd16d779c957ca49fa24c" Jan 22 14:13:33 crc kubenswrapper[4743]: I0122 14:13:33.336496 4743 scope.go:117] "RemoveContainer" containerID="321cbe6ceb2075eae0b98bbf659f9b3c57f058cc02bfbe5e97506a5756b810b7" Jan 22 14:13:33 crc kubenswrapper[4743]: I0122 14:13:33.356063 4743 scope.go:117] "RemoveContainer" containerID="1949595cd193cf122023ac76733cceeebeed2e950d0688bbfa604de50157fb04" Jan 22 14:13:33 crc kubenswrapper[4743]: I0122 14:13:33.378655 4743 scope.go:117] "RemoveContainer" containerID="8322bd5586c64407f7ad3c207ed2fc535445bd76879430605bb66784313ff434" Jan 22 14:13:33 crc kubenswrapper[4743]: I0122 14:13:33.397045 4743 scope.go:117] "RemoveContainer" containerID="50ce7f8f971e46ab830c65837f9e13d105ed15d285f6ae834e28e9fc5f361e8b" Jan 22 14:13:33 crc kubenswrapper[4743]: I0122 14:13:33.414919 4743 scope.go:117] "RemoveContainer" containerID="ef7bec575cb03a7715870bb9ca0983afe34e8c417201961051b4a36393c800ed" Jan 22 14:13:33 crc kubenswrapper[4743]: I0122 14:13:33.436063 4743 scope.go:117] "RemoveContainer" containerID="253189d4f0506a0db062f798443f85cacf745844b491490253e6940bc075bcb2" Jan 22 14:13:33 crc kubenswrapper[4743]: I0122 14:13:33.456749 4743 scope.go:117] "RemoveContainer" containerID="2d1e76033c4d9774442c9d17766c5b95d1c0c6a00ae5c8d3f2e3a3dcc1de4f10" Jan 22 14:13:33 crc kubenswrapper[4743]: I0122 14:13:33.476701 4743 scope.go:117] "RemoveContainer" containerID="1e3b2d87b41ee7e7b367552372673bf101764faea25e45f65fe5e26a8cc02a1a" Jan 22 14:13:35 crc kubenswrapper[4743]: I0122 14:13:35.747847 4743 scope.go:117] "RemoveContainer" containerID="1e51a1e4567f044d1ba62e15ca7baaab6948208e9d675c95afad01e5e442ca14" Jan 22 14:13:35 crc kubenswrapper[4743]: 
E0122 14:13:35.748346 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:13:48 crc kubenswrapper[4743]: I0122 14:13:48.747021 4743 scope.go:117] "RemoveContainer" containerID="1e51a1e4567f044d1ba62e15ca7baaab6948208e9d675c95afad01e5e442ca14" Jan 22 14:13:48 crc kubenswrapper[4743]: E0122 14:13:48.748714 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:13:50 crc kubenswrapper[4743]: I0122 14:13:50.049761 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-hvtzl"] Jan 22 14:13:50 crc kubenswrapper[4743]: I0122 14:13:50.060478 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-hvtzl"] Jan 22 14:13:51 crc kubenswrapper[4743]: I0122 14:13:51.756976 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c13edc29-cd06-4113-8366-75a41988c89f" path="/var/lib/kubelet/pods/c13edc29-cd06-4113-8366-75a41988c89f/volumes" Jan 22 14:14:02 crc kubenswrapper[4743]: I0122 14:14:02.748399 4743 scope.go:117] "RemoveContainer" containerID="1e51a1e4567f044d1ba62e15ca7baaab6948208e9d675c95afad01e5e442ca14" Jan 22 14:14:02 crc kubenswrapper[4743]: E0122 14:14:02.749317 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:14:15 crc kubenswrapper[4743]: I0122 14:14:15.045738 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-nm9l6"] Jan 22 14:14:15 crc kubenswrapper[4743]: I0122 14:14:15.056745 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-nm9l6"] Jan 22 14:14:15 crc kubenswrapper[4743]: I0122 14:14:15.747159 4743 scope.go:117] "RemoveContainer" containerID="1e51a1e4567f044d1ba62e15ca7baaab6948208e9d675c95afad01e5e442ca14" Jan 22 14:14:15 crc kubenswrapper[4743]: E0122 14:14:15.747482 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:14:15 crc kubenswrapper[4743]: I0122 14:14:15.762337 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9518ef3-f251-4bf9-b45d-0f93876b2e7c" 
path="/var/lib/kubelet/pods/d9518ef3-f251-4bf9-b45d-0f93876b2e7c/volumes" Jan 22 14:14:26 crc kubenswrapper[4743]: I0122 14:14:26.748129 4743 scope.go:117] "RemoveContainer" containerID="1e51a1e4567f044d1ba62e15ca7baaab6948208e9d675c95afad01e5e442ca14" Jan 22 14:14:26 crc kubenswrapper[4743]: E0122 14:14:26.749167 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:14:31 crc kubenswrapper[4743]: I0122 14:14:31.056134 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-fqwwj"] Jan 22 14:14:31 crc kubenswrapper[4743]: I0122 14:14:31.072691 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-fqwwj"] Jan 22 14:14:31 crc kubenswrapper[4743]: I0122 14:14:31.762635 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8bd2850-37f2-40c9-aeb5-365158ca9716" path="/var/lib/kubelet/pods/b8bd2850-37f2-40c9-aeb5-365158ca9716/volumes" Jan 22 14:14:33 crc kubenswrapper[4743]: I0122 14:14:33.702975 4743 scope.go:117] "RemoveContainer" containerID="2c1755ff78d5f28f816a75363ac12b8522205710fd499d6b66a4f7c2a53a0f2c" Jan 22 14:14:33 crc kubenswrapper[4743]: I0122 14:14:33.750356 4743 scope.go:117] "RemoveContainer" containerID="b43faa67d40c573610da27f85b959e7e2eeb51117c4b36258fbbc29858855c62" Jan 22 14:14:33 crc kubenswrapper[4743]: I0122 14:14:33.821994 4743 scope.go:117] "RemoveContainer" containerID="37b8e4844c7a5a524a68d32dc8c01f0e84365b3babaf0e9d46244f75aa6b4152" Jan 22 14:14:37 crc kubenswrapper[4743]: I0122 14:14:37.051755 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-tcdjz"] Jan 22 14:14:37 crc kubenswrapper[4743]: I0122 14:14:37.065120 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-tcdjz"] Jan 22 14:14:37 crc kubenswrapper[4743]: I0122 14:14:37.073038 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-9t996"] Jan 22 14:14:37 crc kubenswrapper[4743]: I0122 14:14:37.079533 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-vfzrn"] Jan 22 14:14:37 crc kubenswrapper[4743]: I0122 14:14:37.085956 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-9t996"] Jan 22 14:14:37 crc kubenswrapper[4743]: I0122 14:14:37.093121 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-vfzrn"] Jan 22 14:14:37 crc kubenswrapper[4743]: I0122 14:14:37.765098 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ebac6d9-df0f-41fe-bc73-8236847ff237" path="/var/lib/kubelet/pods/4ebac6d9-df0f-41fe-bc73-8236847ff237/volumes" Jan 22 14:14:37 crc kubenswrapper[4743]: I0122 14:14:37.766657 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="846c118f-23c1-402f-8747-633485e743c9" path="/var/lib/kubelet/pods/846c118f-23c1-402f-8747-633485e743c9/volumes" Jan 22 14:14:37 crc kubenswrapper[4743]: I0122 14:14:37.767758 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbe93f39-887c-4949-9e78-1047998f8aff" path="/var/lib/kubelet/pods/cbe93f39-887c-4949-9e78-1047998f8aff/volumes" Jan 
22 14:14:41 crc kubenswrapper[4743]: I0122 14:14:41.747731 4743 scope.go:117] "RemoveContainer" containerID="1e51a1e4567f044d1ba62e15ca7baaab6948208e9d675c95afad01e5e442ca14" Jan 22 14:14:41 crc kubenswrapper[4743]: E0122 14:14:41.748884 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:14:53 crc kubenswrapper[4743]: I0122 14:14:53.753713 4743 scope.go:117] "RemoveContainer" containerID="1e51a1e4567f044d1ba62e15ca7baaab6948208e9d675c95afad01e5e442ca14" Jan 22 14:14:53 crc kubenswrapper[4743]: E0122 14:14:53.754539 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:14:54 crc kubenswrapper[4743]: I0122 14:14:54.128170 4743 generic.go:334] "Generic (PLEG): container finished" podID="6772da1b-97c0-4b18-af50-7723f5dc39b6" containerID="c4e5cfac2878edabf246c77778a7511c939a78b730cdf73d8ab044e1779cf84e" exitCode=0 Jan 22 14:14:54 crc kubenswrapper[4743]: I0122 14:14:54.128208 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-w8zbz" event={"ID":"6772da1b-97c0-4b18-af50-7723f5dc39b6","Type":"ContainerDied","Data":"c4e5cfac2878edabf246c77778a7511c939a78b730cdf73d8ab044e1779cf84e"} Jan 22 14:14:55 crc kubenswrapper[4743]: I0122 14:14:55.044653 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-srjxw"] Jan 22 14:14:55 crc kubenswrapper[4743]: I0122 14:14:55.053587 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-srjxw"] Jan 22 14:14:55 crc kubenswrapper[4743]: I0122 14:14:55.542100 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-w8zbz" Jan 22 14:14:55 crc kubenswrapper[4743]: I0122 14:14:55.580506 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9p6x4\" (UniqueName: \"kubernetes.io/projected/6772da1b-97c0-4b18-af50-7723f5dc39b6-kube-api-access-9p6x4\") pod \"6772da1b-97c0-4b18-af50-7723f5dc39b6\" (UID: \"6772da1b-97c0-4b18-af50-7723f5dc39b6\") " Jan 22 14:14:55 crc kubenswrapper[4743]: I0122 14:14:55.580645 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6772da1b-97c0-4b18-af50-7723f5dc39b6-ssh-key-openstack-edpm-ipam\") pod \"6772da1b-97c0-4b18-af50-7723f5dc39b6\" (UID: \"6772da1b-97c0-4b18-af50-7723f5dc39b6\") " Jan 22 14:14:55 crc kubenswrapper[4743]: I0122 14:14:55.580693 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6772da1b-97c0-4b18-af50-7723f5dc39b6-inventory\") pod \"6772da1b-97c0-4b18-af50-7723f5dc39b6\" (UID: \"6772da1b-97c0-4b18-af50-7723f5dc39b6\") " Jan 22 14:14:55 crc kubenswrapper[4743]: I0122 14:14:55.588014 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6772da1b-97c0-4b18-af50-7723f5dc39b6-kube-api-access-9p6x4" (OuterVolumeSpecName: "kube-api-access-9p6x4") pod "6772da1b-97c0-4b18-af50-7723f5dc39b6" (UID: "6772da1b-97c0-4b18-af50-7723f5dc39b6"). InnerVolumeSpecName "kube-api-access-9p6x4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:14:55 crc kubenswrapper[4743]: I0122 14:14:55.608162 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6772da1b-97c0-4b18-af50-7723f5dc39b6-inventory" (OuterVolumeSpecName: "inventory") pod "6772da1b-97c0-4b18-af50-7723f5dc39b6" (UID: "6772da1b-97c0-4b18-af50-7723f5dc39b6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:14:55 crc kubenswrapper[4743]: I0122 14:14:55.611752 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6772da1b-97c0-4b18-af50-7723f5dc39b6-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6772da1b-97c0-4b18-af50-7723f5dc39b6" (UID: "6772da1b-97c0-4b18-af50-7723f5dc39b6"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:14:55 crc kubenswrapper[4743]: I0122 14:14:55.683390 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9p6x4\" (UniqueName: \"kubernetes.io/projected/6772da1b-97c0-4b18-af50-7723f5dc39b6-kube-api-access-9p6x4\") on node \"crc\" DevicePath \"\"" Jan 22 14:14:55 crc kubenswrapper[4743]: I0122 14:14:55.683437 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6772da1b-97c0-4b18-af50-7723f5dc39b6-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 22 14:14:55 crc kubenswrapper[4743]: I0122 14:14:55.683452 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6772da1b-97c0-4b18-af50-7723f5dc39b6-inventory\") on node \"crc\" DevicePath \"\"" Jan 22 14:14:55 crc kubenswrapper[4743]: I0122 14:14:55.760205 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb22345c-594c-46a3-b362-e34baa8f271c" path="/var/lib/kubelet/pods/eb22345c-594c-46a3-b362-e34baa8f271c/volumes" Jan 22 14:14:56 crc kubenswrapper[4743]: I0122 14:14:56.145472 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-w8zbz" event={"ID":"6772da1b-97c0-4b18-af50-7723f5dc39b6","Type":"ContainerDied","Data":"0ca9e2e3ab6d61d6cb4397a499b8c96954929a35a9c10d30aadc7d83fe154f16"} Jan 22 14:14:56 crc kubenswrapper[4743]: I0122 14:14:56.145830 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ca9e2e3ab6d61d6cb4397a499b8c96954929a35a9c10d30aadc7d83fe154f16" Jan 22 14:14:56 crc kubenswrapper[4743]: I0122 14:14:56.145528 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-w8zbz" Jan 22 14:14:56 crc kubenswrapper[4743]: I0122 14:14:56.229716 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xph97"] Jan 22 14:14:56 crc kubenswrapper[4743]: E0122 14:14:56.230410 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6772da1b-97c0-4b18-af50-7723f5dc39b6" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Jan 22 14:14:56 crc kubenswrapper[4743]: I0122 14:14:56.230484 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="6772da1b-97c0-4b18-af50-7723f5dc39b6" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Jan 22 14:14:56 crc kubenswrapper[4743]: I0122 14:14:56.230715 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="6772da1b-97c0-4b18-af50-7723f5dc39b6" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Jan 22 14:14:56 crc kubenswrapper[4743]: I0122 14:14:56.231418 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xph97" Jan 22 14:14:56 crc kubenswrapper[4743]: I0122 14:14:56.238491 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 22 14:14:56 crc kubenswrapper[4743]: I0122 14:14:56.238771 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8mcmm" Jan 22 14:14:56 crc kubenswrapper[4743]: I0122 14:14:56.238993 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 22 14:14:56 crc kubenswrapper[4743]: I0122 14:14:56.239209 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 22 14:14:56 crc kubenswrapper[4743]: I0122 14:14:56.246829 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xph97"] Jan 22 14:14:56 crc kubenswrapper[4743]: I0122 14:14:56.294627 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/89048557-6c94-40a8-aa26-c9d940743be9-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xph97\" (UID: \"89048557-6c94-40a8-aa26-c9d940743be9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xph97" Jan 22 14:14:56 crc kubenswrapper[4743]: I0122 14:14:56.294676 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csg8j\" (UniqueName: \"kubernetes.io/projected/89048557-6c94-40a8-aa26-c9d940743be9-kube-api-access-csg8j\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xph97\" (UID: \"89048557-6c94-40a8-aa26-c9d940743be9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xph97" Jan 22 14:14:56 crc kubenswrapper[4743]: I0122 14:14:56.294743 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/89048557-6c94-40a8-aa26-c9d940743be9-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xph97\" (UID: \"89048557-6c94-40a8-aa26-c9d940743be9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xph97" Jan 22 14:14:56 crc kubenswrapper[4743]: I0122 14:14:56.396824 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/89048557-6c94-40a8-aa26-c9d940743be9-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xph97\" (UID: \"89048557-6c94-40a8-aa26-c9d940743be9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xph97" Jan 22 14:14:56 crc kubenswrapper[4743]: I0122 14:14:56.396858 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csg8j\" (UniqueName: \"kubernetes.io/projected/89048557-6c94-40a8-aa26-c9d940743be9-kube-api-access-csg8j\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xph97\" (UID: \"89048557-6c94-40a8-aa26-c9d940743be9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xph97" Jan 22 14:14:56 crc kubenswrapper[4743]: I0122 14:14:56.396919 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/89048557-6c94-40a8-aa26-c9d940743be9-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xph97\" (UID: \"89048557-6c94-40a8-aa26-c9d940743be9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xph97" Jan 22 14:14:56 crc kubenswrapper[4743]: I0122 14:14:56.401394 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/89048557-6c94-40a8-aa26-c9d940743be9-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xph97\" (UID: \"89048557-6c94-40a8-aa26-c9d940743be9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xph97" Jan 22 14:14:56 crc kubenswrapper[4743]: I0122 14:14:56.405302 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/89048557-6c94-40a8-aa26-c9d940743be9-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xph97\" (UID: \"89048557-6c94-40a8-aa26-c9d940743be9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xph97" Jan 22 14:14:56 crc kubenswrapper[4743]: I0122 14:14:56.417859 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csg8j\" (UniqueName: \"kubernetes.io/projected/89048557-6c94-40a8-aa26-c9d940743be9-kube-api-access-csg8j\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xph97\" (UID: \"89048557-6c94-40a8-aa26-c9d940743be9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xph97" Jan 22 14:14:56 crc kubenswrapper[4743]: I0122 14:14:56.546851 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xph97" Jan 22 14:14:57 crc kubenswrapper[4743]: I0122 14:14:57.068943 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xph97"] Jan 22 14:14:57 crc kubenswrapper[4743]: I0122 14:14:57.155021 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xph97" event={"ID":"89048557-6c94-40a8-aa26-c9d940743be9","Type":"ContainerStarted","Data":"d75c86f5b021b01e1f4fed4362d51a3ef7b0d6de218b1ccc2e6e2171fac2e35d"} Jan 22 14:14:58 crc kubenswrapper[4743]: I0122 14:14:58.164023 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xph97" event={"ID":"89048557-6c94-40a8-aa26-c9d940743be9","Type":"ContainerStarted","Data":"b325f5b5d8d7fcbb9b8157da466f503f9f65600cb87bf18fdc52d90b7534cdf8"} Jan 22 14:14:58 crc kubenswrapper[4743]: I0122 14:14:58.186557 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xph97" podStartSLOduration=1.7596751259999999 podStartE2EDuration="2.186529182s" podCreationTimestamp="2026-01-22 14:14:56 +0000 UTC" firstStartedPulling="2026-01-22 14:14:57.049180632 +0000 UTC m=+1733.604223795" lastFinishedPulling="2026-01-22 14:14:57.476034688 +0000 UTC m=+1734.031077851" observedRunningTime="2026-01-22 14:14:58.178812955 +0000 UTC m=+1734.733856128" watchObservedRunningTime="2026-01-22 14:14:58.186529182 +0000 UTC m=+1734.741572355" Jan 22 14:14:59 crc kubenswrapper[4743]: I0122 14:14:59.010536 4743 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-bqpn4"] Jan 22 14:14:59 crc kubenswrapper[4743]: I0122 14:14:59.013634 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bqpn4" Jan 22 14:14:59 crc kubenswrapper[4743]: I0122 14:14:59.019818 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bqpn4"] Jan 22 14:14:59 crc kubenswrapper[4743]: I0122 14:14:59.044180 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78ef5f17-3234-47fc-a4ef-1aa134eb6bfb-utilities\") pod \"certified-operators-bqpn4\" (UID: \"78ef5f17-3234-47fc-a4ef-1aa134eb6bfb\") " pod="openshift-marketplace/certified-operators-bqpn4" Jan 22 14:14:59 crc kubenswrapper[4743]: I0122 14:14:59.044270 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krz5d\" (UniqueName: \"kubernetes.io/projected/78ef5f17-3234-47fc-a4ef-1aa134eb6bfb-kube-api-access-krz5d\") pod \"certified-operators-bqpn4\" (UID: \"78ef5f17-3234-47fc-a4ef-1aa134eb6bfb\") " pod="openshift-marketplace/certified-operators-bqpn4" Jan 22 14:14:59 crc kubenswrapper[4743]: I0122 14:14:59.044431 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78ef5f17-3234-47fc-a4ef-1aa134eb6bfb-catalog-content\") pod \"certified-operators-bqpn4\" (UID: \"78ef5f17-3234-47fc-a4ef-1aa134eb6bfb\") " pod="openshift-marketplace/certified-operators-bqpn4" Jan 22 14:14:59 crc kubenswrapper[4743]: I0122 14:14:59.146269 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krz5d\" (UniqueName: \"kubernetes.io/projected/78ef5f17-3234-47fc-a4ef-1aa134eb6bfb-kube-api-access-krz5d\") pod \"certified-operators-bqpn4\" (UID: \"78ef5f17-3234-47fc-a4ef-1aa134eb6bfb\") " pod="openshift-marketplace/certified-operators-bqpn4" Jan 22 14:14:59 crc kubenswrapper[4743]: I0122 14:14:59.146478 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78ef5f17-3234-47fc-a4ef-1aa134eb6bfb-catalog-content\") pod \"certified-operators-bqpn4\" (UID: \"78ef5f17-3234-47fc-a4ef-1aa134eb6bfb\") " pod="openshift-marketplace/certified-operators-bqpn4" Jan 22 14:14:59 crc kubenswrapper[4743]: I0122 14:14:59.146572 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78ef5f17-3234-47fc-a4ef-1aa134eb6bfb-utilities\") pod \"certified-operators-bqpn4\" (UID: \"78ef5f17-3234-47fc-a4ef-1aa134eb6bfb\") " pod="openshift-marketplace/certified-operators-bqpn4" Jan 22 14:14:59 crc kubenswrapper[4743]: I0122 14:14:59.147113 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78ef5f17-3234-47fc-a4ef-1aa134eb6bfb-utilities\") pod \"certified-operators-bqpn4\" (UID: \"78ef5f17-3234-47fc-a4ef-1aa134eb6bfb\") " pod="openshift-marketplace/certified-operators-bqpn4" Jan 22 14:14:59 crc kubenswrapper[4743]: I0122 14:14:59.147218 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78ef5f17-3234-47fc-a4ef-1aa134eb6bfb-catalog-content\") pod \"certified-operators-bqpn4\" (UID: 
\"78ef5f17-3234-47fc-a4ef-1aa134eb6bfb\") " pod="openshift-marketplace/certified-operators-bqpn4" Jan 22 14:14:59 crc kubenswrapper[4743]: I0122 14:14:59.167199 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krz5d\" (UniqueName: \"kubernetes.io/projected/78ef5f17-3234-47fc-a4ef-1aa134eb6bfb-kube-api-access-krz5d\") pod \"certified-operators-bqpn4\" (UID: \"78ef5f17-3234-47fc-a4ef-1aa134eb6bfb\") " pod="openshift-marketplace/certified-operators-bqpn4" Jan 22 14:14:59 crc kubenswrapper[4743]: I0122 14:14:59.348563 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bqpn4" Jan 22 14:14:59 crc kubenswrapper[4743]: I0122 14:14:59.867778 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bqpn4"] Jan 22 14:15:00 crc kubenswrapper[4743]: I0122 14:15:00.136341 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29484855-fdptj"] Jan 22 14:15:00 crc kubenswrapper[4743]: I0122 14:15:00.137556 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29484855-fdptj" Jan 22 14:15:00 crc kubenswrapper[4743]: I0122 14:15:00.139777 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 22 14:15:00 crc kubenswrapper[4743]: I0122 14:15:00.141399 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 22 14:15:00 crc kubenswrapper[4743]: I0122 14:15:00.148311 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29484855-fdptj"] Jan 22 14:15:00 crc kubenswrapper[4743]: I0122 14:15:00.167537 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/465f8a75-af56-4af8-ae05-f5468f0aa3c1-secret-volume\") pod \"collect-profiles-29484855-fdptj\" (UID: \"465f8a75-af56-4af8-ae05-f5468f0aa3c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484855-fdptj" Jan 22 14:15:00 crc kubenswrapper[4743]: I0122 14:15:00.167661 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/465f8a75-af56-4af8-ae05-f5468f0aa3c1-config-volume\") pod \"collect-profiles-29484855-fdptj\" (UID: \"465f8a75-af56-4af8-ae05-f5468f0aa3c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484855-fdptj" Jan 22 14:15:00 crc kubenswrapper[4743]: I0122 14:15:00.167729 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qm82p\" (UniqueName: \"kubernetes.io/projected/465f8a75-af56-4af8-ae05-f5468f0aa3c1-kube-api-access-qm82p\") pod \"collect-profiles-29484855-fdptj\" (UID: \"465f8a75-af56-4af8-ae05-f5468f0aa3c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484855-fdptj" Jan 22 14:15:00 crc kubenswrapper[4743]: I0122 14:15:00.186198 4743 generic.go:334] "Generic (PLEG): container finished" podID="78ef5f17-3234-47fc-a4ef-1aa134eb6bfb" containerID="389273365a66c3a8923153a921a0951f525f2ed3ebf4a10377888d08b1beafb7" exitCode=0 Jan 22 14:15:00 crc kubenswrapper[4743]: I0122 14:15:00.186273 4743 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-bqpn4" event={"ID":"78ef5f17-3234-47fc-a4ef-1aa134eb6bfb","Type":"ContainerDied","Data":"389273365a66c3a8923153a921a0951f525f2ed3ebf4a10377888d08b1beafb7"} Jan 22 14:15:00 crc kubenswrapper[4743]: I0122 14:15:00.188378 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bqpn4" event={"ID":"78ef5f17-3234-47fc-a4ef-1aa134eb6bfb","Type":"ContainerStarted","Data":"29e011ed7beef7cbafee84bf8a3d48ab72fc81bef20c97d2787cfaf25d3f64da"} Jan 22 14:15:00 crc kubenswrapper[4743]: I0122 14:15:00.269824 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/465f8a75-af56-4af8-ae05-f5468f0aa3c1-secret-volume\") pod \"collect-profiles-29484855-fdptj\" (UID: \"465f8a75-af56-4af8-ae05-f5468f0aa3c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484855-fdptj" Jan 22 14:15:00 crc kubenswrapper[4743]: I0122 14:15:00.270217 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/465f8a75-af56-4af8-ae05-f5468f0aa3c1-config-volume\") pod \"collect-profiles-29484855-fdptj\" (UID: \"465f8a75-af56-4af8-ae05-f5468f0aa3c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484855-fdptj" Jan 22 14:15:00 crc kubenswrapper[4743]: I0122 14:15:00.270278 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qm82p\" (UniqueName: \"kubernetes.io/projected/465f8a75-af56-4af8-ae05-f5468f0aa3c1-kube-api-access-qm82p\") pod \"collect-profiles-29484855-fdptj\" (UID: \"465f8a75-af56-4af8-ae05-f5468f0aa3c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484855-fdptj" Jan 22 14:15:00 crc kubenswrapper[4743]: I0122 14:15:00.271039 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/465f8a75-af56-4af8-ae05-f5468f0aa3c1-config-volume\") pod \"collect-profiles-29484855-fdptj\" (UID: \"465f8a75-af56-4af8-ae05-f5468f0aa3c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484855-fdptj" Jan 22 14:15:00 crc kubenswrapper[4743]: I0122 14:15:00.283776 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/465f8a75-af56-4af8-ae05-f5468f0aa3c1-secret-volume\") pod \"collect-profiles-29484855-fdptj\" (UID: \"465f8a75-af56-4af8-ae05-f5468f0aa3c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484855-fdptj" Jan 22 14:15:00 crc kubenswrapper[4743]: I0122 14:15:00.286541 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qm82p\" (UniqueName: \"kubernetes.io/projected/465f8a75-af56-4af8-ae05-f5468f0aa3c1-kube-api-access-qm82p\") pod \"collect-profiles-29484855-fdptj\" (UID: \"465f8a75-af56-4af8-ae05-f5468f0aa3c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484855-fdptj" Jan 22 14:15:00 crc kubenswrapper[4743]: I0122 14:15:00.456718 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29484855-fdptj" Jan 22 14:15:00 crc kubenswrapper[4743]: I0122 14:15:00.886982 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29484855-fdptj"] Jan 22 14:15:01 crc kubenswrapper[4743]: I0122 14:15:01.199231 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29484855-fdptj" event={"ID":"465f8a75-af56-4af8-ae05-f5468f0aa3c1","Type":"ContainerStarted","Data":"50b6aaff72656fc91afdcbebe97b517b41df0830f1a957b87bec8d3cfc8a9086"} Jan 22 14:15:01 crc kubenswrapper[4743]: I0122 14:15:01.199528 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29484855-fdptj" event={"ID":"465f8a75-af56-4af8-ae05-f5468f0aa3c1","Type":"ContainerStarted","Data":"47a8ff695e4ce9be36c0824201fa0a8faf7ef5225216cedaa833f1166bef90f0"} Jan 22 14:15:01 crc kubenswrapper[4743]: I0122 14:15:01.204433 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bqpn4" event={"ID":"78ef5f17-3234-47fc-a4ef-1aa134eb6bfb","Type":"ContainerStarted","Data":"4d96ab8815ead712cad9995c1a272881ddf9265785648be462a8c90c005694d8"} Jan 22 14:15:01 crc kubenswrapper[4743]: I0122 14:15:01.220026 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29484855-fdptj" podStartSLOduration=1.220009178 podStartE2EDuration="1.220009178s" podCreationTimestamp="2026-01-22 14:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:15:01.214529961 +0000 UTC m=+1737.769573124" watchObservedRunningTime="2026-01-22 14:15:01.220009178 +0000 UTC m=+1737.775052341" Jan 22 14:15:02 crc kubenswrapper[4743]: I0122 14:15:02.215847 4743 generic.go:334] "Generic (PLEG): container finished" podID="78ef5f17-3234-47fc-a4ef-1aa134eb6bfb" containerID="4d96ab8815ead712cad9995c1a272881ddf9265785648be462a8c90c005694d8" exitCode=0 Jan 22 14:15:02 crc kubenswrapper[4743]: I0122 14:15:02.215974 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bqpn4" event={"ID":"78ef5f17-3234-47fc-a4ef-1aa134eb6bfb","Type":"ContainerDied","Data":"4d96ab8815ead712cad9995c1a272881ddf9265785648be462a8c90c005694d8"} Jan 22 14:15:02 crc kubenswrapper[4743]: I0122 14:15:02.219838 4743 generic.go:334] "Generic (PLEG): container finished" podID="465f8a75-af56-4af8-ae05-f5468f0aa3c1" containerID="50b6aaff72656fc91afdcbebe97b517b41df0830f1a957b87bec8d3cfc8a9086" exitCode=0 Jan 22 14:15:02 crc kubenswrapper[4743]: I0122 14:15:02.219902 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29484855-fdptj" event={"ID":"465f8a75-af56-4af8-ae05-f5468f0aa3c1","Type":"ContainerDied","Data":"50b6aaff72656fc91afdcbebe97b517b41df0830f1a957b87bec8d3cfc8a9086"} Jan 22 14:15:03 crc kubenswrapper[4743]: I0122 14:15:03.239021 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bqpn4" event={"ID":"78ef5f17-3234-47fc-a4ef-1aa134eb6bfb","Type":"ContainerStarted","Data":"a6f651c86b1825ab21045e4c66befb953e94fe6aed2d37386ad88163eab1806b"} Jan 22 14:15:03 crc kubenswrapper[4743]: I0122 14:15:03.263933 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-bqpn4" podStartSLOduration=2.833901857 podStartE2EDuration="5.26390653s" podCreationTimestamp="2026-01-22 14:14:58 +0000 UTC" firstStartedPulling="2026-01-22 14:15:00.188177071 +0000 UTC m=+1736.743220234" lastFinishedPulling="2026-01-22 14:15:02.618181744 +0000 UTC m=+1739.173224907" observedRunningTime="2026-01-22 14:15:03.258003151 +0000 UTC m=+1739.813046334" watchObservedRunningTime="2026-01-22 14:15:03.26390653 +0000 UTC m=+1739.818949753" Jan 22 14:15:03 crc kubenswrapper[4743]: I0122 14:15:03.615298 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29484855-fdptj" Jan 22 14:15:03 crc kubenswrapper[4743]: I0122 14:15:03.741674 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qm82p\" (UniqueName: \"kubernetes.io/projected/465f8a75-af56-4af8-ae05-f5468f0aa3c1-kube-api-access-qm82p\") pod \"465f8a75-af56-4af8-ae05-f5468f0aa3c1\" (UID: \"465f8a75-af56-4af8-ae05-f5468f0aa3c1\") " Jan 22 14:15:03 crc kubenswrapper[4743]: I0122 14:15:03.742018 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/465f8a75-af56-4af8-ae05-f5468f0aa3c1-config-volume\") pod \"465f8a75-af56-4af8-ae05-f5468f0aa3c1\" (UID: \"465f8a75-af56-4af8-ae05-f5468f0aa3c1\") " Jan 22 14:15:03 crc kubenswrapper[4743]: I0122 14:15:03.742078 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/465f8a75-af56-4af8-ae05-f5468f0aa3c1-secret-volume\") pod \"465f8a75-af56-4af8-ae05-f5468f0aa3c1\" (UID: \"465f8a75-af56-4af8-ae05-f5468f0aa3c1\") " Jan 22 14:15:03 crc kubenswrapper[4743]: I0122 14:15:03.742503 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/465f8a75-af56-4af8-ae05-f5468f0aa3c1-config-volume" (OuterVolumeSpecName: "config-volume") pod "465f8a75-af56-4af8-ae05-f5468f0aa3c1" (UID: "465f8a75-af56-4af8-ae05-f5468f0aa3c1"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:15:03 crc kubenswrapper[4743]: I0122 14:15:03.743100 4743 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/465f8a75-af56-4af8-ae05-f5468f0aa3c1-config-volume\") on node \"crc\" DevicePath \"\"" Jan 22 14:15:03 crc kubenswrapper[4743]: I0122 14:15:03.748520 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/465f8a75-af56-4af8-ae05-f5468f0aa3c1-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "465f8a75-af56-4af8-ae05-f5468f0aa3c1" (UID: "465f8a75-af56-4af8-ae05-f5468f0aa3c1"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:15:03 crc kubenswrapper[4743]: I0122 14:15:03.753552 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/465f8a75-af56-4af8-ae05-f5468f0aa3c1-kube-api-access-qm82p" (OuterVolumeSpecName: "kube-api-access-qm82p") pod "465f8a75-af56-4af8-ae05-f5468f0aa3c1" (UID: "465f8a75-af56-4af8-ae05-f5468f0aa3c1"). InnerVolumeSpecName "kube-api-access-qm82p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:15:03 crc kubenswrapper[4743]: I0122 14:15:03.844937 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qm82p\" (UniqueName: \"kubernetes.io/projected/465f8a75-af56-4af8-ae05-f5468f0aa3c1-kube-api-access-qm82p\") on node \"crc\" DevicePath \"\"" Jan 22 14:15:03 crc kubenswrapper[4743]: I0122 14:15:03.844975 4743 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/465f8a75-af56-4af8-ae05-f5468f0aa3c1-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 22 14:15:04 crc kubenswrapper[4743]: I0122 14:15:04.248568 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29484855-fdptj" Jan 22 14:15:04 crc kubenswrapper[4743]: I0122 14:15:04.248579 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29484855-fdptj" event={"ID":"465f8a75-af56-4af8-ae05-f5468f0aa3c1","Type":"ContainerDied","Data":"47a8ff695e4ce9be36c0824201fa0a8faf7ef5225216cedaa833f1166bef90f0"} Jan 22 14:15:04 crc kubenswrapper[4743]: I0122 14:15:04.250538 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="47a8ff695e4ce9be36c0824201fa0a8faf7ef5225216cedaa833f1166bef90f0" Jan 22 14:15:08 crc kubenswrapper[4743]: I0122 14:15:08.747902 4743 scope.go:117] "RemoveContainer" containerID="1e51a1e4567f044d1ba62e15ca7baaab6948208e9d675c95afad01e5e442ca14" Jan 22 14:15:08 crc kubenswrapper[4743]: E0122 14:15:08.748716 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:15:09 crc kubenswrapper[4743]: I0122 14:15:09.349722 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bqpn4" Jan 22 14:15:09 crc kubenswrapper[4743]: I0122 14:15:09.350010 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bqpn4" Jan 22 14:15:09 crc kubenswrapper[4743]: I0122 14:15:09.428104 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bqpn4" Jan 22 14:15:10 crc kubenswrapper[4743]: I0122 14:15:10.369850 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bqpn4" Jan 22 14:15:10 crc kubenswrapper[4743]: I0122 14:15:10.416019 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bqpn4"] Jan 22 14:15:12 crc kubenswrapper[4743]: I0122 14:15:12.341221 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bqpn4" podUID="78ef5f17-3234-47fc-a4ef-1aa134eb6bfb" containerName="registry-server" containerID="cri-o://a6f651c86b1825ab21045e4c66befb953e94fe6aed2d37386ad88163eab1806b" gracePeriod=2 Jan 22 14:15:13 crc kubenswrapper[4743]: I0122 14:15:13.325781 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bqpn4" Jan 22 14:15:13 crc kubenswrapper[4743]: I0122 14:15:13.357944 4743 generic.go:334] "Generic (PLEG): container finished" podID="78ef5f17-3234-47fc-a4ef-1aa134eb6bfb" containerID="a6f651c86b1825ab21045e4c66befb953e94fe6aed2d37386ad88163eab1806b" exitCode=0 Jan 22 14:15:13 crc kubenswrapper[4743]: I0122 14:15:13.357986 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bqpn4" event={"ID":"78ef5f17-3234-47fc-a4ef-1aa134eb6bfb","Type":"ContainerDied","Data":"a6f651c86b1825ab21045e4c66befb953e94fe6aed2d37386ad88163eab1806b"} Jan 22 14:15:13 crc kubenswrapper[4743]: I0122 14:15:13.358012 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bqpn4" event={"ID":"78ef5f17-3234-47fc-a4ef-1aa134eb6bfb","Type":"ContainerDied","Data":"29e011ed7beef7cbafee84bf8a3d48ab72fc81bef20c97d2787cfaf25d3f64da"} Jan 22 14:15:13 crc kubenswrapper[4743]: I0122 14:15:13.358029 4743 scope.go:117] "RemoveContainer" containerID="a6f651c86b1825ab21045e4c66befb953e94fe6aed2d37386ad88163eab1806b" Jan 22 14:15:13 crc kubenswrapper[4743]: I0122 14:15:13.358374 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bqpn4" Jan 22 14:15:13 crc kubenswrapper[4743]: I0122 14:15:13.381428 4743 scope.go:117] "RemoveContainer" containerID="4d96ab8815ead712cad9995c1a272881ddf9265785648be462a8c90c005694d8" Jan 22 14:15:13 crc kubenswrapper[4743]: I0122 14:15:13.411826 4743 scope.go:117] "RemoveContainer" containerID="389273365a66c3a8923153a921a0951f525f2ed3ebf4a10377888d08b1beafb7" Jan 22 14:15:13 crc kubenswrapper[4743]: I0122 14:15:13.430904 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78ef5f17-3234-47fc-a4ef-1aa134eb6bfb-catalog-content\") pod \"78ef5f17-3234-47fc-a4ef-1aa134eb6bfb\" (UID: \"78ef5f17-3234-47fc-a4ef-1aa134eb6bfb\") " Jan 22 14:15:13 crc kubenswrapper[4743]: I0122 14:15:13.430946 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78ef5f17-3234-47fc-a4ef-1aa134eb6bfb-utilities\") pod \"78ef5f17-3234-47fc-a4ef-1aa134eb6bfb\" (UID: \"78ef5f17-3234-47fc-a4ef-1aa134eb6bfb\") " Jan 22 14:15:13 crc kubenswrapper[4743]: I0122 14:15:13.431099 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krz5d\" (UniqueName: \"kubernetes.io/projected/78ef5f17-3234-47fc-a4ef-1aa134eb6bfb-kube-api-access-krz5d\") pod \"78ef5f17-3234-47fc-a4ef-1aa134eb6bfb\" (UID: \"78ef5f17-3234-47fc-a4ef-1aa134eb6bfb\") " Jan 22 14:15:13 crc kubenswrapper[4743]: I0122 14:15:13.432611 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78ef5f17-3234-47fc-a4ef-1aa134eb6bfb-utilities" (OuterVolumeSpecName: "utilities") pod "78ef5f17-3234-47fc-a4ef-1aa134eb6bfb" (UID: "78ef5f17-3234-47fc-a4ef-1aa134eb6bfb"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:15:13 crc kubenswrapper[4743]: I0122 14:15:13.438296 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78ef5f17-3234-47fc-a4ef-1aa134eb6bfb-kube-api-access-krz5d" (OuterVolumeSpecName: "kube-api-access-krz5d") pod "78ef5f17-3234-47fc-a4ef-1aa134eb6bfb" (UID: "78ef5f17-3234-47fc-a4ef-1aa134eb6bfb"). InnerVolumeSpecName "kube-api-access-krz5d". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:15:13 crc kubenswrapper[4743]: I0122 14:15:13.447715 4743 scope.go:117] "RemoveContainer" containerID="a6f651c86b1825ab21045e4c66befb953e94fe6aed2d37386ad88163eab1806b" Jan 22 14:15:13 crc kubenswrapper[4743]: E0122 14:15:13.448144 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6f651c86b1825ab21045e4c66befb953e94fe6aed2d37386ad88163eab1806b\": container with ID starting with a6f651c86b1825ab21045e4c66befb953e94fe6aed2d37386ad88163eab1806b not found: ID does not exist" containerID="a6f651c86b1825ab21045e4c66befb953e94fe6aed2d37386ad88163eab1806b" Jan 22 14:15:13 crc kubenswrapper[4743]: I0122 14:15:13.448194 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6f651c86b1825ab21045e4c66befb953e94fe6aed2d37386ad88163eab1806b"} err="failed to get container status \"a6f651c86b1825ab21045e4c66befb953e94fe6aed2d37386ad88163eab1806b\": rpc error: code = NotFound desc = could not find container \"a6f651c86b1825ab21045e4c66befb953e94fe6aed2d37386ad88163eab1806b\": container with ID starting with a6f651c86b1825ab21045e4c66befb953e94fe6aed2d37386ad88163eab1806b not found: ID does not exist" Jan 22 14:15:13 crc kubenswrapper[4743]: I0122 14:15:13.448223 4743 scope.go:117] "RemoveContainer" containerID="4d96ab8815ead712cad9995c1a272881ddf9265785648be462a8c90c005694d8" Jan 22 14:15:13 crc kubenswrapper[4743]: E0122 14:15:13.448494 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d96ab8815ead712cad9995c1a272881ddf9265785648be462a8c90c005694d8\": container with ID starting with 4d96ab8815ead712cad9995c1a272881ddf9265785648be462a8c90c005694d8 not found: ID does not exist" containerID="4d96ab8815ead712cad9995c1a272881ddf9265785648be462a8c90c005694d8" Jan 22 14:15:13 crc kubenswrapper[4743]: I0122 14:15:13.448517 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d96ab8815ead712cad9995c1a272881ddf9265785648be462a8c90c005694d8"} err="failed to get container status \"4d96ab8815ead712cad9995c1a272881ddf9265785648be462a8c90c005694d8\": rpc error: code = NotFound desc = could not find container \"4d96ab8815ead712cad9995c1a272881ddf9265785648be462a8c90c005694d8\": container with ID starting with 4d96ab8815ead712cad9995c1a272881ddf9265785648be462a8c90c005694d8 not found: ID does not exist" Jan 22 14:15:13 crc kubenswrapper[4743]: I0122 14:15:13.448532 4743 scope.go:117] "RemoveContainer" containerID="389273365a66c3a8923153a921a0951f525f2ed3ebf4a10377888d08b1beafb7" Jan 22 14:15:13 crc kubenswrapper[4743]: E0122 14:15:13.448774 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"389273365a66c3a8923153a921a0951f525f2ed3ebf4a10377888d08b1beafb7\": container with ID starting with 389273365a66c3a8923153a921a0951f525f2ed3ebf4a10377888d08b1beafb7 not found: ID does not 
exist" containerID="389273365a66c3a8923153a921a0951f525f2ed3ebf4a10377888d08b1beafb7" Jan 22 14:15:13 crc kubenswrapper[4743]: I0122 14:15:13.448805 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"389273365a66c3a8923153a921a0951f525f2ed3ebf4a10377888d08b1beafb7"} err="failed to get container status \"389273365a66c3a8923153a921a0951f525f2ed3ebf4a10377888d08b1beafb7\": rpc error: code = NotFound desc = could not find container \"389273365a66c3a8923153a921a0951f525f2ed3ebf4a10377888d08b1beafb7\": container with ID starting with 389273365a66c3a8923153a921a0951f525f2ed3ebf4a10377888d08b1beafb7 not found: ID does not exist" Jan 22 14:15:13 crc kubenswrapper[4743]: I0122 14:15:13.473994 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78ef5f17-3234-47fc-a4ef-1aa134eb6bfb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "78ef5f17-3234-47fc-a4ef-1aa134eb6bfb" (UID: "78ef5f17-3234-47fc-a4ef-1aa134eb6bfb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:15:13 crc kubenswrapper[4743]: I0122 14:15:13.534017 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78ef5f17-3234-47fc-a4ef-1aa134eb6bfb-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 14:15:13 crc kubenswrapper[4743]: I0122 14:15:13.534341 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78ef5f17-3234-47fc-a4ef-1aa134eb6bfb-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 14:15:13 crc kubenswrapper[4743]: I0122 14:15:13.534350 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krz5d\" (UniqueName: \"kubernetes.io/projected/78ef5f17-3234-47fc-a4ef-1aa134eb6bfb-kube-api-access-krz5d\") on node \"crc\" DevicePath \"\"" Jan 22 14:15:13 crc kubenswrapper[4743]: I0122 14:15:13.692851 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bqpn4"] Jan 22 14:15:13 crc kubenswrapper[4743]: I0122 14:15:13.700629 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bqpn4"] Jan 22 14:15:13 crc kubenswrapper[4743]: I0122 14:15:13.758100 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78ef5f17-3234-47fc-a4ef-1aa134eb6bfb" path="/var/lib/kubelet/pods/78ef5f17-3234-47fc-a4ef-1aa134eb6bfb/volumes" Jan 22 14:15:20 crc kubenswrapper[4743]: I0122 14:15:20.748849 4743 scope.go:117] "RemoveContainer" containerID="1e51a1e4567f044d1ba62e15ca7baaab6948208e9d675c95afad01e5e442ca14" Jan 22 14:15:20 crc kubenswrapper[4743]: E0122 14:15:20.750654 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:15:33 crc kubenswrapper[4743]: I0122 14:15:33.756878 4743 scope.go:117] "RemoveContainer" containerID="1e51a1e4567f044d1ba62e15ca7baaab6948208e9d675c95afad01e5e442ca14" Jan 22 14:15:33 crc kubenswrapper[4743]: E0122 14:15:33.757729 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:15:33 crc kubenswrapper[4743]: I0122 14:15:33.947175 4743 scope.go:117] "RemoveContainer" containerID="c871dc8973ea2e91fd44264fe34f88f81d8c63e19143a693c0892bad127fbe86" Jan 22 14:15:33 crc kubenswrapper[4743]: I0122 14:15:33.991258 4743 scope.go:117] "RemoveContainer" containerID="795289613c94db2b8fe8ac8e26657900a12882529e4bcc81dddcdcda9648ea1d" Jan 22 14:15:34 crc kubenswrapper[4743]: I0122 14:15:34.033044 4743 scope.go:117] "RemoveContainer" containerID="cc13199c0a241e6bacabaf8f4c251225db6761a2f3d5aaee1461d5ab8520ddbe" Jan 22 14:15:34 crc kubenswrapper[4743]: I0122 14:15:34.067188 4743 scope.go:117] "RemoveContainer" containerID="c21f41d3b2d4091b7aa24b83cec34f0c483edd38f0a2db502df9a85b60d4aec0" Jan 22 14:15:39 crc kubenswrapper[4743]: I0122 14:15:39.038442 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-bkr2b"] Jan 22 14:15:39 crc kubenswrapper[4743]: I0122 14:15:39.047320 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-hpj2q"] Jan 22 14:15:39 crc kubenswrapper[4743]: I0122 14:15:39.056882 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-bkr2b"] Jan 22 14:15:39 crc kubenswrapper[4743]: I0122 14:15:39.064009 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-hpj2q"] Jan 22 14:15:39 crc kubenswrapper[4743]: I0122 14:15:39.762733 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da15c36a-41ce-424b-a5bf-e6fec2d1b4c7" path="/var/lib/kubelet/pods/da15c36a-41ce-424b-a5bf-e6fec2d1b4c7/volumes" Jan 22 14:15:39 crc kubenswrapper[4743]: I0122 14:15:39.763848 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f55bf894-5195-430c-acc8-06875cadcdff" path="/var/lib/kubelet/pods/f55bf894-5195-430c-acc8-06875cadcdff/volumes" Jan 22 14:15:40 crc kubenswrapper[4743]: I0122 14:15:40.033111 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-7a10-account-create-update-7hg8n"] Jan 22 14:15:40 crc kubenswrapper[4743]: I0122 14:15:40.041508 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-tbj22"] Jan 22 14:15:40 crc kubenswrapper[4743]: I0122 14:15:40.051704 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-586d-account-create-update-nl5vc"] Jan 22 14:15:40 crc kubenswrapper[4743]: I0122 14:15:40.063049 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-ec5d-account-create-update-p4vn8"] Jan 22 14:15:40 crc kubenswrapper[4743]: I0122 14:15:40.073496 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-ec5d-account-create-update-p4vn8"] Jan 22 14:15:40 crc kubenswrapper[4743]: I0122 14:15:40.081202 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-tbj22"] Jan 22 14:15:40 crc kubenswrapper[4743]: I0122 14:15:40.089442 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-7a10-account-create-update-7hg8n"] Jan 22 14:15:40 crc kubenswrapper[4743]: I0122 14:15:40.097675 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-api-586d-account-create-update-nl5vc"] Jan 22 14:15:41 crc kubenswrapper[4743]: I0122 14:15:41.762327 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07bcd65e-dcf5-4778-b34d-fba3728e8616" path="/var/lib/kubelet/pods/07bcd65e-dcf5-4778-b34d-fba3728e8616/volumes" Jan 22 14:15:41 crc kubenswrapper[4743]: I0122 14:15:41.763033 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1adbc0d6-0108-4060-9299-7d71187ac9e9" path="/var/lib/kubelet/pods/1adbc0d6-0108-4060-9299-7d71187ac9e9/volumes" Jan 22 14:15:41 crc kubenswrapper[4743]: I0122 14:15:41.763532 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25cb1f70-dfe0-422d-95f5-b4e7ea4a8004" path="/var/lib/kubelet/pods/25cb1f70-dfe0-422d-95f5-b4e7ea4a8004/volumes" Jan 22 14:15:41 crc kubenswrapper[4743]: I0122 14:15:41.764048 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43e926de-d57c-4d5b-82a6-2a28f645f18d" path="/var/lib/kubelet/pods/43e926de-d57c-4d5b-82a6-2a28f645f18d/volumes" Jan 22 14:15:45 crc kubenswrapper[4743]: I0122 14:15:45.747028 4743 scope.go:117] "RemoveContainer" containerID="1e51a1e4567f044d1ba62e15ca7baaab6948208e9d675c95afad01e5e442ca14" Jan 22 14:15:45 crc kubenswrapper[4743]: E0122 14:15:45.747751 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:16:00 crc kubenswrapper[4743]: I0122 14:16:00.748019 4743 scope.go:117] "RemoveContainer" containerID="1e51a1e4567f044d1ba62e15ca7baaab6948208e9d675c95afad01e5e442ca14" Jan 22 14:16:01 crc kubenswrapper[4743]: I0122 14:16:01.799621 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" event={"ID":"3aeba9ba-3a5a-4885-8540-d295aadb311b","Type":"ContainerStarted","Data":"1af7ecf77f9c044f696a147946738a6ed62f5bd006bf111f8137da5ced5ddcc7"} Jan 22 14:16:12 crc kubenswrapper[4743]: I0122 14:16:12.044070 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-hn577"] Jan 22 14:16:12 crc kubenswrapper[4743]: I0122 14:16:12.054280 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-hn577"] Jan 22 14:16:13 crc kubenswrapper[4743]: I0122 14:16:13.757728 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce464499-6235-4e9e-b2ef-02dcc568f613" path="/var/lib/kubelet/pods/ce464499-6235-4e9e-b2ef-02dcc568f613/volumes" Jan 22 14:16:23 crc kubenswrapper[4743]: I0122 14:16:23.004598 4743 generic.go:334] "Generic (PLEG): container finished" podID="89048557-6c94-40a8-aa26-c9d940743be9" containerID="b325f5b5d8d7fcbb9b8157da466f503f9f65600cb87bf18fdc52d90b7534cdf8" exitCode=0 Jan 22 14:16:23 crc kubenswrapper[4743]: I0122 14:16:23.004865 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xph97" event={"ID":"89048557-6c94-40a8-aa26-c9d940743be9","Type":"ContainerDied","Data":"b325f5b5d8d7fcbb9b8157da466f503f9f65600cb87bf18fdc52d90b7534cdf8"} Jan 22 14:16:24 crc kubenswrapper[4743]: I0122 14:16:24.488862 4743 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xph97" Jan 22 14:16:24 crc kubenswrapper[4743]: I0122 14:16:24.556957 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/89048557-6c94-40a8-aa26-c9d940743be9-inventory\") pod \"89048557-6c94-40a8-aa26-c9d940743be9\" (UID: \"89048557-6c94-40a8-aa26-c9d940743be9\") " Jan 22 14:16:24 crc kubenswrapper[4743]: I0122 14:16:24.557022 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-csg8j\" (UniqueName: \"kubernetes.io/projected/89048557-6c94-40a8-aa26-c9d940743be9-kube-api-access-csg8j\") pod \"89048557-6c94-40a8-aa26-c9d940743be9\" (UID: \"89048557-6c94-40a8-aa26-c9d940743be9\") " Jan 22 14:16:24 crc kubenswrapper[4743]: I0122 14:16:24.557252 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/89048557-6c94-40a8-aa26-c9d940743be9-ssh-key-openstack-edpm-ipam\") pod \"89048557-6c94-40a8-aa26-c9d940743be9\" (UID: \"89048557-6c94-40a8-aa26-c9d940743be9\") " Jan 22 14:16:24 crc kubenswrapper[4743]: I0122 14:16:24.563222 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89048557-6c94-40a8-aa26-c9d940743be9-kube-api-access-csg8j" (OuterVolumeSpecName: "kube-api-access-csg8j") pod "89048557-6c94-40a8-aa26-c9d940743be9" (UID: "89048557-6c94-40a8-aa26-c9d940743be9"). InnerVolumeSpecName "kube-api-access-csg8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:16:24 crc kubenswrapper[4743]: I0122 14:16:24.583058 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89048557-6c94-40a8-aa26-c9d940743be9-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "89048557-6c94-40a8-aa26-c9d940743be9" (UID: "89048557-6c94-40a8-aa26-c9d940743be9"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:16:24 crc kubenswrapper[4743]: I0122 14:16:24.584314 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89048557-6c94-40a8-aa26-c9d940743be9-inventory" (OuterVolumeSpecName: "inventory") pod "89048557-6c94-40a8-aa26-c9d940743be9" (UID: "89048557-6c94-40a8-aa26-c9d940743be9"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:16:24 crc kubenswrapper[4743]: I0122 14:16:24.664313 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/89048557-6c94-40a8-aa26-c9d940743be9-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 22 14:16:24 crc kubenswrapper[4743]: I0122 14:16:24.664535 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/89048557-6c94-40a8-aa26-c9d940743be9-inventory\") on node \"crc\" DevicePath \"\"" Jan 22 14:16:24 crc kubenswrapper[4743]: I0122 14:16:24.664545 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-csg8j\" (UniqueName: \"kubernetes.io/projected/89048557-6c94-40a8-aa26-c9d940743be9-kube-api-access-csg8j\") on node \"crc\" DevicePath \"\"" Jan 22 14:16:25 crc kubenswrapper[4743]: I0122 14:16:25.021853 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xph97" event={"ID":"89048557-6c94-40a8-aa26-c9d940743be9","Type":"ContainerDied","Data":"d75c86f5b021b01e1f4fed4362d51a3ef7b0d6de218b1ccc2e6e2171fac2e35d"} Jan 22 14:16:25 crc kubenswrapper[4743]: I0122 14:16:25.021901 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d75c86f5b021b01e1f4fed4362d51a3ef7b0d6de218b1ccc2e6e2171fac2e35d" Jan 22 14:16:25 crc kubenswrapper[4743]: I0122 14:16:25.021949 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xph97" Jan 22 14:16:25 crc kubenswrapper[4743]: I0122 14:16:25.112158 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-288lh"] Jan 22 14:16:25 crc kubenswrapper[4743]: E0122 14:16:25.112709 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78ef5f17-3234-47fc-a4ef-1aa134eb6bfb" containerName="extract-utilities" Jan 22 14:16:25 crc kubenswrapper[4743]: I0122 14:16:25.112731 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="78ef5f17-3234-47fc-a4ef-1aa134eb6bfb" containerName="extract-utilities" Jan 22 14:16:25 crc kubenswrapper[4743]: E0122 14:16:25.112757 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78ef5f17-3234-47fc-a4ef-1aa134eb6bfb" containerName="extract-content" Jan 22 14:16:25 crc kubenswrapper[4743]: I0122 14:16:25.112767 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="78ef5f17-3234-47fc-a4ef-1aa134eb6bfb" containerName="extract-content" Jan 22 14:16:25 crc kubenswrapper[4743]: E0122 14:16:25.112785 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78ef5f17-3234-47fc-a4ef-1aa134eb6bfb" containerName="registry-server" Jan 22 14:16:25 crc kubenswrapper[4743]: I0122 14:16:25.112811 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="78ef5f17-3234-47fc-a4ef-1aa134eb6bfb" containerName="registry-server" Jan 22 14:16:25 crc kubenswrapper[4743]: E0122 14:16:25.112831 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="465f8a75-af56-4af8-ae05-f5468f0aa3c1" containerName="collect-profiles" Jan 22 14:16:25 crc kubenswrapper[4743]: I0122 14:16:25.112840 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="465f8a75-af56-4af8-ae05-f5468f0aa3c1" containerName="collect-profiles" Jan 22 14:16:25 crc kubenswrapper[4743]: E0122 14:16:25.112861 4743 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="89048557-6c94-40a8-aa26-c9d940743be9" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jan 22 14:16:25 crc kubenswrapper[4743]: I0122 14:16:25.112872 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="89048557-6c94-40a8-aa26-c9d940743be9" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jan 22 14:16:25 crc kubenswrapper[4743]: I0122 14:16:25.113122 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="78ef5f17-3234-47fc-a4ef-1aa134eb6bfb" containerName="registry-server" Jan 22 14:16:25 crc kubenswrapper[4743]: I0122 14:16:25.113151 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="465f8a75-af56-4af8-ae05-f5468f0aa3c1" containerName="collect-profiles" Jan 22 14:16:25 crc kubenswrapper[4743]: I0122 14:16:25.113166 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="89048557-6c94-40a8-aa26-c9d940743be9" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jan 22 14:16:25 crc kubenswrapper[4743]: I0122 14:16:25.113872 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-288lh" Jan 22 14:16:25 crc kubenswrapper[4743]: I0122 14:16:25.116562 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8mcmm" Jan 22 14:16:25 crc kubenswrapper[4743]: I0122 14:16:25.117607 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 22 14:16:25 crc kubenswrapper[4743]: I0122 14:16:25.119117 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 22 14:16:25 crc kubenswrapper[4743]: I0122 14:16:25.119388 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 22 14:16:25 crc kubenswrapper[4743]: I0122 14:16:25.122827 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-288lh"] Jan 22 14:16:25 crc kubenswrapper[4743]: I0122 14:16:25.174189 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/600f8b94-291e-4c03-b5d8-75f43de51d1d-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-288lh\" (UID: \"600f8b94-291e-4c03-b5d8-75f43de51d1d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-288lh" Jan 22 14:16:25 crc kubenswrapper[4743]: I0122 14:16:25.174250 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kg6zl\" (UniqueName: \"kubernetes.io/projected/600f8b94-291e-4c03-b5d8-75f43de51d1d-kube-api-access-kg6zl\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-288lh\" (UID: \"600f8b94-291e-4c03-b5d8-75f43de51d1d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-288lh" Jan 22 14:16:25 crc kubenswrapper[4743]: I0122 14:16:25.174316 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/600f8b94-291e-4c03-b5d8-75f43de51d1d-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-288lh\" (UID: \"600f8b94-291e-4c03-b5d8-75f43de51d1d\") " 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-288lh" Jan 22 14:16:25 crc kubenswrapper[4743]: I0122 14:16:25.276085 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/600f8b94-291e-4c03-b5d8-75f43de51d1d-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-288lh\" (UID: \"600f8b94-291e-4c03-b5d8-75f43de51d1d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-288lh" Jan 22 14:16:25 crc kubenswrapper[4743]: I0122 14:16:25.276148 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kg6zl\" (UniqueName: \"kubernetes.io/projected/600f8b94-291e-4c03-b5d8-75f43de51d1d-kube-api-access-kg6zl\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-288lh\" (UID: \"600f8b94-291e-4c03-b5d8-75f43de51d1d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-288lh" Jan 22 14:16:25 crc kubenswrapper[4743]: I0122 14:16:25.276199 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/600f8b94-291e-4c03-b5d8-75f43de51d1d-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-288lh\" (UID: \"600f8b94-291e-4c03-b5d8-75f43de51d1d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-288lh" Jan 22 14:16:25 crc kubenswrapper[4743]: I0122 14:16:25.286643 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/600f8b94-291e-4c03-b5d8-75f43de51d1d-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-288lh\" (UID: \"600f8b94-291e-4c03-b5d8-75f43de51d1d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-288lh" Jan 22 14:16:25 crc kubenswrapper[4743]: I0122 14:16:25.286813 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/600f8b94-291e-4c03-b5d8-75f43de51d1d-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-288lh\" (UID: \"600f8b94-291e-4c03-b5d8-75f43de51d1d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-288lh" Jan 22 14:16:25 crc kubenswrapper[4743]: I0122 14:16:25.310489 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kg6zl\" (UniqueName: \"kubernetes.io/projected/600f8b94-291e-4c03-b5d8-75f43de51d1d-kube-api-access-kg6zl\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-288lh\" (UID: \"600f8b94-291e-4c03-b5d8-75f43de51d1d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-288lh" Jan 22 14:16:25 crc kubenswrapper[4743]: I0122 14:16:25.444189 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-288lh" Jan 22 14:16:25 crc kubenswrapper[4743]: I0122 14:16:25.949340 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-288lh"] Jan 22 14:16:26 crc kubenswrapper[4743]: I0122 14:16:26.031464 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-288lh" event={"ID":"600f8b94-291e-4c03-b5d8-75f43de51d1d","Type":"ContainerStarted","Data":"19b4e951e264d249dc86af5b3cf1ea6bb5f9688e7e4610d1e6860f3024c9a093"} Jan 22 14:16:27 crc kubenswrapper[4743]: I0122 14:16:27.040980 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-288lh" event={"ID":"600f8b94-291e-4c03-b5d8-75f43de51d1d","Type":"ContainerStarted","Data":"4264c16d3bee4eb8b1a519592331d0d6bf2aa275398eb09a7f0a65aa8a743893"} Jan 22 14:16:27 crc kubenswrapper[4743]: I0122 14:16:27.080683 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-288lh" podStartSLOduration=1.627862542 podStartE2EDuration="2.080658235s" podCreationTimestamp="2026-01-22 14:16:25 +0000 UTC" firstStartedPulling="2026-01-22 14:16:25.95248457 +0000 UTC m=+1822.507527733" lastFinishedPulling="2026-01-22 14:16:26.405280263 +0000 UTC m=+1822.960323426" observedRunningTime="2026-01-22 14:16:27.057075282 +0000 UTC m=+1823.612118445" watchObservedRunningTime="2026-01-22 14:16:27.080658235 +0000 UTC m=+1823.635701408" Jan 22 14:16:32 crc kubenswrapper[4743]: I0122 14:16:32.094244 4743 generic.go:334] "Generic (PLEG): container finished" podID="600f8b94-291e-4c03-b5d8-75f43de51d1d" containerID="4264c16d3bee4eb8b1a519592331d0d6bf2aa275398eb09a7f0a65aa8a743893" exitCode=0 Jan 22 14:16:32 crc kubenswrapper[4743]: I0122 14:16:32.094351 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-288lh" event={"ID":"600f8b94-291e-4c03-b5d8-75f43de51d1d","Type":"ContainerDied","Data":"4264c16d3bee4eb8b1a519592331d0d6bf2aa275398eb09a7f0a65aa8a743893"} Jan 22 14:16:33 crc kubenswrapper[4743]: I0122 14:16:33.483378 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-288lh" Jan 22 14:16:33 crc kubenswrapper[4743]: I0122 14:16:33.527643 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kg6zl\" (UniqueName: \"kubernetes.io/projected/600f8b94-291e-4c03-b5d8-75f43de51d1d-kube-api-access-kg6zl\") pod \"600f8b94-291e-4c03-b5d8-75f43de51d1d\" (UID: \"600f8b94-291e-4c03-b5d8-75f43de51d1d\") " Jan 22 14:16:33 crc kubenswrapper[4743]: I0122 14:16:33.527801 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/600f8b94-291e-4c03-b5d8-75f43de51d1d-ssh-key-openstack-edpm-ipam\") pod \"600f8b94-291e-4c03-b5d8-75f43de51d1d\" (UID: \"600f8b94-291e-4c03-b5d8-75f43de51d1d\") " Jan 22 14:16:33 crc kubenswrapper[4743]: I0122 14:16:33.527901 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/600f8b94-291e-4c03-b5d8-75f43de51d1d-inventory\") pod \"600f8b94-291e-4c03-b5d8-75f43de51d1d\" (UID: \"600f8b94-291e-4c03-b5d8-75f43de51d1d\") " Jan 22 14:16:33 crc kubenswrapper[4743]: I0122 14:16:33.533324 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/600f8b94-291e-4c03-b5d8-75f43de51d1d-kube-api-access-kg6zl" (OuterVolumeSpecName: "kube-api-access-kg6zl") pod "600f8b94-291e-4c03-b5d8-75f43de51d1d" (UID: "600f8b94-291e-4c03-b5d8-75f43de51d1d"). InnerVolumeSpecName "kube-api-access-kg6zl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:16:33 crc kubenswrapper[4743]: I0122 14:16:33.559059 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/600f8b94-291e-4c03-b5d8-75f43de51d1d-inventory" (OuterVolumeSpecName: "inventory") pod "600f8b94-291e-4c03-b5d8-75f43de51d1d" (UID: "600f8b94-291e-4c03-b5d8-75f43de51d1d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:16:33 crc kubenswrapper[4743]: I0122 14:16:33.560411 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/600f8b94-291e-4c03-b5d8-75f43de51d1d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "600f8b94-291e-4c03-b5d8-75f43de51d1d" (UID: "600f8b94-291e-4c03-b5d8-75f43de51d1d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:16:33 crc kubenswrapper[4743]: I0122 14:16:33.629866 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/600f8b94-291e-4c03-b5d8-75f43de51d1d-inventory\") on node \"crc\" DevicePath \"\"" Jan 22 14:16:33 crc kubenswrapper[4743]: I0122 14:16:33.630071 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kg6zl\" (UniqueName: \"kubernetes.io/projected/600f8b94-291e-4c03-b5d8-75f43de51d1d-kube-api-access-kg6zl\") on node \"crc\" DevicePath \"\"" Jan 22 14:16:33 crc kubenswrapper[4743]: I0122 14:16:33.630083 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/600f8b94-291e-4c03-b5d8-75f43de51d1d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 22 14:16:34 crc kubenswrapper[4743]: I0122 14:16:34.116173 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-288lh" event={"ID":"600f8b94-291e-4c03-b5d8-75f43de51d1d","Type":"ContainerDied","Data":"19b4e951e264d249dc86af5b3cf1ea6bb5f9688e7e4610d1e6860f3024c9a093"} Jan 22 14:16:34 crc kubenswrapper[4743]: I0122 14:16:34.116222 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-288lh" Jan 22 14:16:34 crc kubenswrapper[4743]: I0122 14:16:34.116226 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19b4e951e264d249dc86af5b3cf1ea6bb5f9688e7e4610d1e6860f3024c9a093" Jan 22 14:16:34 crc kubenswrapper[4743]: I0122 14:16:34.177667 4743 scope.go:117] "RemoveContainer" containerID="01f8d4d2a50491c06be70fe555b84c5e85bf74d4aed887ef308511035a4ffa62" Jan 22 14:16:34 crc kubenswrapper[4743]: I0122 14:16:34.190481 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-z8mwg"] Jan 22 14:16:34 crc kubenswrapper[4743]: E0122 14:16:34.204382 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="600f8b94-291e-4c03-b5d8-75f43de51d1d" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Jan 22 14:16:34 crc kubenswrapper[4743]: I0122 14:16:34.204419 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="600f8b94-291e-4c03-b5d8-75f43de51d1d" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Jan 22 14:16:34 crc kubenswrapper[4743]: I0122 14:16:34.204615 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="600f8b94-291e-4c03-b5d8-75f43de51d1d" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Jan 22 14:16:34 crc kubenswrapper[4743]: I0122 14:16:34.205193 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-z8mwg"] Jan 22 14:16:34 crc kubenswrapper[4743]: I0122 14:16:34.205298 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-z8mwg" Jan 22 14:16:34 crc kubenswrapper[4743]: I0122 14:16:34.209860 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 22 14:16:34 crc kubenswrapper[4743]: I0122 14:16:34.210069 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8mcmm" Jan 22 14:16:34 crc kubenswrapper[4743]: I0122 14:16:34.210222 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 22 14:16:34 crc kubenswrapper[4743]: I0122 14:16:34.210362 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 22 14:16:34 crc kubenswrapper[4743]: I0122 14:16:34.213868 4743 scope.go:117] "RemoveContainer" containerID="c286824626c99fb36a15a3a6dff8a06ee8a98c682a4082ac3628d6edeedf93d5" Jan 22 14:16:34 crc kubenswrapper[4743]: I0122 14:16:34.240387 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b71a0c2-2cf5-4ed8-a0f9-4966d9095eeb-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-z8mwg\" (UID: \"9b71a0c2-2cf5-4ed8-a0f9-4966d9095eeb\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-z8mwg" Jan 22 14:16:34 crc kubenswrapper[4743]: I0122 14:16:34.240617 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlkwv\" (UniqueName: \"kubernetes.io/projected/9b71a0c2-2cf5-4ed8-a0f9-4966d9095eeb-kube-api-access-hlkwv\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-z8mwg\" (UID: \"9b71a0c2-2cf5-4ed8-a0f9-4966d9095eeb\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-z8mwg" Jan 22 14:16:34 crc kubenswrapper[4743]: I0122 14:16:34.240754 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9b71a0c2-2cf5-4ed8-a0f9-4966d9095eeb-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-z8mwg\" (UID: \"9b71a0c2-2cf5-4ed8-a0f9-4966d9095eeb\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-z8mwg" Jan 22 14:16:34 crc kubenswrapper[4743]: I0122 14:16:34.253598 4743 scope.go:117] "RemoveContainer" containerID="3f0ef25b0f166c44d882e33caca448af05e5012c1c3d4874df3144745b40e14d" Jan 22 14:16:34 crc kubenswrapper[4743]: I0122 14:16:34.270991 4743 scope.go:117] "RemoveContainer" containerID="eb5eddb366431a8736a9dbf21c3b9f61087925ac7db940db6d13163edfb617a1" Jan 22 14:16:34 crc kubenswrapper[4743]: I0122 14:16:34.291372 4743 scope.go:117] "RemoveContainer" containerID="7a7f8a69879f301768b68f718413117c84dd5bc97ec0ad8caa39ec95a947b452" Jan 22 14:16:34 crc kubenswrapper[4743]: I0122 14:16:34.330353 4743 scope.go:117] "RemoveContainer" containerID="24fea48753ce9db7d55094d0a3530e1a63382dafd66143b0b797c8331de6408a" Jan 22 14:16:34 crc kubenswrapper[4743]: I0122 14:16:34.343142 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b71a0c2-2cf5-4ed8-a0f9-4966d9095eeb-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-z8mwg\" (UID: \"9b71a0c2-2cf5-4ed8-a0f9-4966d9095eeb\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-z8mwg" Jan 22 14:16:34 crc 
kubenswrapper[4743]: I0122 14:16:34.343256 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlkwv\" (UniqueName: \"kubernetes.io/projected/9b71a0c2-2cf5-4ed8-a0f9-4966d9095eeb-kube-api-access-hlkwv\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-z8mwg\" (UID: \"9b71a0c2-2cf5-4ed8-a0f9-4966d9095eeb\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-z8mwg" Jan 22 14:16:34 crc kubenswrapper[4743]: I0122 14:16:34.343305 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9b71a0c2-2cf5-4ed8-a0f9-4966d9095eeb-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-z8mwg\" (UID: \"9b71a0c2-2cf5-4ed8-a0f9-4966d9095eeb\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-z8mwg" Jan 22 14:16:34 crc kubenswrapper[4743]: I0122 14:16:34.349908 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b71a0c2-2cf5-4ed8-a0f9-4966d9095eeb-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-z8mwg\" (UID: \"9b71a0c2-2cf5-4ed8-a0f9-4966d9095eeb\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-z8mwg" Jan 22 14:16:34 crc kubenswrapper[4743]: I0122 14:16:34.350958 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9b71a0c2-2cf5-4ed8-a0f9-4966d9095eeb-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-z8mwg\" (UID: \"9b71a0c2-2cf5-4ed8-a0f9-4966d9095eeb\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-z8mwg" Jan 22 14:16:34 crc kubenswrapper[4743]: I0122 14:16:34.351680 4743 scope.go:117] "RemoveContainer" containerID="65a73678a5ad78e7585173de169f2d7a6d65b25e788a3d5d7c9fcee111fcd45d" Jan 22 14:16:34 crc kubenswrapper[4743]: I0122 14:16:34.362691 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlkwv\" (UniqueName: \"kubernetes.io/projected/9b71a0c2-2cf5-4ed8-a0f9-4966d9095eeb-kube-api-access-hlkwv\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-z8mwg\" (UID: \"9b71a0c2-2cf5-4ed8-a0f9-4966d9095eeb\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-z8mwg" Jan 22 14:16:34 crc kubenswrapper[4743]: I0122 14:16:34.550093 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-z8mwg" Jan 22 14:16:34 crc kubenswrapper[4743]: I0122 14:16:34.851497 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-z8mwg"] Jan 22 14:16:35 crc kubenswrapper[4743]: I0122 14:16:35.129245 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-z8mwg" event={"ID":"9b71a0c2-2cf5-4ed8-a0f9-4966d9095eeb","Type":"ContainerStarted","Data":"ad759726c01d14ecc83abea250b6f24cb30ba6b46dd98ad6a5a5dc03ad2489df"} Jan 22 14:16:36 crc kubenswrapper[4743]: I0122 14:16:36.139955 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-z8mwg" event={"ID":"9b71a0c2-2cf5-4ed8-a0f9-4966d9095eeb","Type":"ContainerStarted","Data":"6dc2b32baff18b5c379b2e6a382dc9fc87b6de30c8e06d6cd91238bb5d5d9850"} Jan 22 14:16:36 crc kubenswrapper[4743]: I0122 14:16:36.157335 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-z8mwg" podStartSLOduration=1.65839039 podStartE2EDuration="2.157316592s" podCreationTimestamp="2026-01-22 14:16:34 +0000 UTC" firstStartedPulling="2026-01-22 14:16:34.857619779 +0000 UTC m=+1831.412662942" lastFinishedPulling="2026-01-22 14:16:35.356545971 +0000 UTC m=+1831.911589144" observedRunningTime="2026-01-22 14:16:36.155665027 +0000 UTC m=+1832.710708230" watchObservedRunningTime="2026-01-22 14:16:36.157316592 +0000 UTC m=+1832.712359755" Jan 22 14:16:38 crc kubenswrapper[4743]: I0122 14:16:38.045257 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-9bfnc"] Jan 22 14:16:38 crc kubenswrapper[4743]: I0122 14:16:38.055033 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-9bfnc"] Jan 22 14:16:39 crc kubenswrapper[4743]: I0122 14:16:39.760127 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68eb078f-0a0b-4463-98e7-fb2dc396ca6f" path="/var/lib/kubelet/pods/68eb078f-0a0b-4463-98e7-fb2dc396ca6f/volumes" Jan 22 14:16:49 crc kubenswrapper[4743]: I0122 14:16:49.044465 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7nqwj"] Jan 22 14:16:49 crc kubenswrapper[4743]: I0122 14:16:49.052301 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7nqwj"] Jan 22 14:16:49 crc kubenswrapper[4743]: I0122 14:16:49.765233 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef9b8754-ae09-4ea6-ba23-88227365b34b" path="/var/lib/kubelet/pods/ef9b8754-ae09-4ea6-ba23-88227365b34b/volumes" Jan 22 14:17:13 crc kubenswrapper[4743]: I0122 14:17:13.473682 4743 generic.go:334] "Generic (PLEG): container finished" podID="9b71a0c2-2cf5-4ed8-a0f9-4966d9095eeb" containerID="6dc2b32baff18b5c379b2e6a382dc9fc87b6de30c8e06d6cd91238bb5d5d9850" exitCode=0 Jan 22 14:17:13 crc kubenswrapper[4743]: I0122 14:17:13.473772 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-z8mwg" event={"ID":"9b71a0c2-2cf5-4ed8-a0f9-4966d9095eeb","Type":"ContainerDied","Data":"6dc2b32baff18b5c379b2e6a382dc9fc87b6de30c8e06d6cd91238bb5d5d9850"} Jan 22 14:17:14 crc kubenswrapper[4743]: I0122 14:17:14.876433 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-z8mwg" Jan 22 14:17:15 crc kubenswrapper[4743]: I0122 14:17:15.021729 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9b71a0c2-2cf5-4ed8-a0f9-4966d9095eeb-ssh-key-openstack-edpm-ipam\") pod \"9b71a0c2-2cf5-4ed8-a0f9-4966d9095eeb\" (UID: \"9b71a0c2-2cf5-4ed8-a0f9-4966d9095eeb\") " Jan 22 14:17:15 crc kubenswrapper[4743]: I0122 14:17:15.021965 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b71a0c2-2cf5-4ed8-a0f9-4966d9095eeb-inventory\") pod \"9b71a0c2-2cf5-4ed8-a0f9-4966d9095eeb\" (UID: \"9b71a0c2-2cf5-4ed8-a0f9-4966d9095eeb\") " Jan 22 14:17:15 crc kubenswrapper[4743]: I0122 14:17:15.022004 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hlkwv\" (UniqueName: \"kubernetes.io/projected/9b71a0c2-2cf5-4ed8-a0f9-4966d9095eeb-kube-api-access-hlkwv\") pod \"9b71a0c2-2cf5-4ed8-a0f9-4966d9095eeb\" (UID: \"9b71a0c2-2cf5-4ed8-a0f9-4966d9095eeb\") " Jan 22 14:17:15 crc kubenswrapper[4743]: I0122 14:17:15.027296 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b71a0c2-2cf5-4ed8-a0f9-4966d9095eeb-kube-api-access-hlkwv" (OuterVolumeSpecName: "kube-api-access-hlkwv") pod "9b71a0c2-2cf5-4ed8-a0f9-4966d9095eeb" (UID: "9b71a0c2-2cf5-4ed8-a0f9-4966d9095eeb"). InnerVolumeSpecName "kube-api-access-hlkwv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:17:15 crc kubenswrapper[4743]: I0122 14:17:15.047445 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b71a0c2-2cf5-4ed8-a0f9-4966d9095eeb-inventory" (OuterVolumeSpecName: "inventory") pod "9b71a0c2-2cf5-4ed8-a0f9-4966d9095eeb" (UID: "9b71a0c2-2cf5-4ed8-a0f9-4966d9095eeb"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:17:15 crc kubenswrapper[4743]: I0122 14:17:15.072232 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b71a0c2-2cf5-4ed8-a0f9-4966d9095eeb-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9b71a0c2-2cf5-4ed8-a0f9-4966d9095eeb" (UID: "9b71a0c2-2cf5-4ed8-a0f9-4966d9095eeb"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:17:15 crc kubenswrapper[4743]: I0122 14:17:15.123815 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b71a0c2-2cf5-4ed8-a0f9-4966d9095eeb-inventory\") on node \"crc\" DevicePath \"\"" Jan 22 14:17:15 crc kubenswrapper[4743]: I0122 14:17:15.124037 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hlkwv\" (UniqueName: \"kubernetes.io/projected/9b71a0c2-2cf5-4ed8-a0f9-4966d9095eeb-kube-api-access-hlkwv\") on node \"crc\" DevicePath \"\"" Jan 22 14:17:15 crc kubenswrapper[4743]: I0122 14:17:15.124127 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9b71a0c2-2cf5-4ed8-a0f9-4966d9095eeb-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 22 14:17:15 crc kubenswrapper[4743]: I0122 14:17:15.491276 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-z8mwg" event={"ID":"9b71a0c2-2cf5-4ed8-a0f9-4966d9095eeb","Type":"ContainerDied","Data":"ad759726c01d14ecc83abea250b6f24cb30ba6b46dd98ad6a5a5dc03ad2489df"} Jan 22 14:17:15 crc kubenswrapper[4743]: I0122 14:17:15.491982 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad759726c01d14ecc83abea250b6f24cb30ba6b46dd98ad6a5a5dc03ad2489df" Jan 22 14:17:15 crc kubenswrapper[4743]: I0122 14:17:15.491420 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-z8mwg" Jan 22 14:17:15 crc kubenswrapper[4743]: I0122 14:17:15.584715 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7pmbs"] Jan 22 14:17:15 crc kubenswrapper[4743]: E0122 14:17:15.585392 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b71a0c2-2cf5-4ed8-a0f9-4966d9095eeb" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jan 22 14:17:15 crc kubenswrapper[4743]: I0122 14:17:15.585416 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b71a0c2-2cf5-4ed8-a0f9-4966d9095eeb" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jan 22 14:17:15 crc kubenswrapper[4743]: I0122 14:17:15.585627 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b71a0c2-2cf5-4ed8-a0f9-4966d9095eeb" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jan 22 14:17:15 crc kubenswrapper[4743]: I0122 14:17:15.586325 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7pmbs" Jan 22 14:17:15 crc kubenswrapper[4743]: I0122 14:17:15.588413 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 22 14:17:15 crc kubenswrapper[4743]: I0122 14:17:15.588983 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 22 14:17:15 crc kubenswrapper[4743]: I0122 14:17:15.589022 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8mcmm" Jan 22 14:17:15 crc kubenswrapper[4743]: I0122 14:17:15.589154 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 22 14:17:15 crc kubenswrapper[4743]: I0122 14:17:15.617664 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7pmbs"] Jan 22 14:17:15 crc kubenswrapper[4743]: I0122 14:17:15.734816 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6cf93e6b-adef-48fb-844b-a420be87fd2e-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7pmbs\" (UID: \"6cf93e6b-adef-48fb-844b-a420be87fd2e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7pmbs" Jan 22 14:17:15 crc kubenswrapper[4743]: I0122 14:17:15.734962 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6cf93e6b-adef-48fb-844b-a420be87fd2e-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7pmbs\" (UID: \"6cf93e6b-adef-48fb-844b-a420be87fd2e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7pmbs" Jan 22 14:17:15 crc kubenswrapper[4743]: I0122 14:17:15.734991 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6ln9\" (UniqueName: \"kubernetes.io/projected/6cf93e6b-adef-48fb-844b-a420be87fd2e-kube-api-access-q6ln9\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7pmbs\" (UID: \"6cf93e6b-adef-48fb-844b-a420be87fd2e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7pmbs" Jan 22 14:17:15 crc kubenswrapper[4743]: I0122 14:17:15.836900 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6cf93e6b-adef-48fb-844b-a420be87fd2e-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7pmbs\" (UID: \"6cf93e6b-adef-48fb-844b-a420be87fd2e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7pmbs" Jan 22 14:17:15 crc kubenswrapper[4743]: I0122 14:17:15.836948 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6ln9\" (UniqueName: \"kubernetes.io/projected/6cf93e6b-adef-48fb-844b-a420be87fd2e-kube-api-access-q6ln9\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7pmbs\" (UID: \"6cf93e6b-adef-48fb-844b-a420be87fd2e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7pmbs" Jan 22 14:17:15 crc kubenswrapper[4743]: I0122 14:17:15.837056 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/6cf93e6b-adef-48fb-844b-a420be87fd2e-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7pmbs\" (UID: \"6cf93e6b-adef-48fb-844b-a420be87fd2e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7pmbs" Jan 22 14:17:15 crc kubenswrapper[4743]: I0122 14:17:15.843863 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6cf93e6b-adef-48fb-844b-a420be87fd2e-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7pmbs\" (UID: \"6cf93e6b-adef-48fb-844b-a420be87fd2e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7pmbs" Jan 22 14:17:15 crc kubenswrapper[4743]: I0122 14:17:15.851640 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6cf93e6b-adef-48fb-844b-a420be87fd2e-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7pmbs\" (UID: \"6cf93e6b-adef-48fb-844b-a420be87fd2e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7pmbs" Jan 22 14:17:15 crc kubenswrapper[4743]: I0122 14:17:15.854741 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6ln9\" (UniqueName: \"kubernetes.io/projected/6cf93e6b-adef-48fb-844b-a420be87fd2e-kube-api-access-q6ln9\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7pmbs\" (UID: \"6cf93e6b-adef-48fb-844b-a420be87fd2e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7pmbs" Jan 22 14:17:15 crc kubenswrapper[4743]: I0122 14:17:15.901942 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7pmbs" Jan 22 14:17:16 crc kubenswrapper[4743]: I0122 14:17:16.392884 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7pmbs"] Jan 22 14:17:16 crc kubenswrapper[4743]: I0122 14:17:16.500606 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7pmbs" event={"ID":"6cf93e6b-adef-48fb-844b-a420be87fd2e","Type":"ContainerStarted","Data":"64708f9374a8e8c95d25224fc1e46d97832768b7c2a1af6c6ca745bc88d1918c"} Jan 22 14:17:17 crc kubenswrapper[4743]: I0122 14:17:17.510938 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7pmbs" event={"ID":"6cf93e6b-adef-48fb-844b-a420be87fd2e","Type":"ContainerStarted","Data":"745c0fe526130df4f0ee20ed86ffed0651fdebfe3359ae57f65092f479b1a2c2"} Jan 22 14:17:17 crc kubenswrapper[4743]: I0122 14:17:17.529919 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7pmbs" podStartSLOduration=2.071599117 podStartE2EDuration="2.529898068s" podCreationTimestamp="2026-01-22 14:17:15 +0000 UTC" firstStartedPulling="2026-01-22 14:17:16.403315516 +0000 UTC m=+1872.958358679" lastFinishedPulling="2026-01-22 14:17:16.861614467 +0000 UTC m=+1873.416657630" observedRunningTime="2026-01-22 14:17:17.525931842 +0000 UTC m=+1874.080975005" watchObservedRunningTime="2026-01-22 14:17:17.529898068 +0000 UTC m=+1874.084941231" Jan 22 14:17:24 crc kubenswrapper[4743]: I0122 14:17:24.035521 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-dd884"] Jan 22 14:17:24 crc kubenswrapper[4743]: I0122 
14:17:24.044078 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-dd884"] Jan 22 14:17:25 crc kubenswrapper[4743]: I0122 14:17:25.758698 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4f715db-df3f-479c-82b4-0e8bdea14dba" path="/var/lib/kubelet/pods/a4f715db-df3f-479c-82b4-0e8bdea14dba/volumes" Jan 22 14:17:34 crc kubenswrapper[4743]: I0122 14:17:34.509887 4743 scope.go:117] "RemoveContainer" containerID="7af5309d332d6c839b3ad6b684db43b57e15dd2e7b57e4ac92b7a54aca9c658d" Jan 22 14:17:34 crc kubenswrapper[4743]: I0122 14:17:34.572049 4743 scope.go:117] "RemoveContainer" containerID="cf9992f13f04eeaf2a694880869e9157cce34bd8f95d73c2b9126bc213ca7068" Jan 22 14:17:34 crc kubenswrapper[4743]: I0122 14:17:34.631193 4743 scope.go:117] "RemoveContainer" containerID="d44713eea626495a42e8b912c7d8fd7f5c1a4b00ab4eeb4457c90bea882a7ffe" Jan 22 14:18:00 crc kubenswrapper[4743]: I0122 14:18:00.049720 4743 patch_prober.go:28] interesting pod/machine-config-daemon-hqgk7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 14:18:00 crc kubenswrapper[4743]: I0122 14:18:00.050391 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 14:18:06 crc kubenswrapper[4743]: I0122 14:18:06.931398 4743 generic.go:334] "Generic (PLEG): container finished" podID="6cf93e6b-adef-48fb-844b-a420be87fd2e" containerID="745c0fe526130df4f0ee20ed86ffed0651fdebfe3359ae57f65092f479b1a2c2" exitCode=0 Jan 22 14:18:06 crc kubenswrapper[4743]: I0122 14:18:06.931511 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7pmbs" event={"ID":"6cf93e6b-adef-48fb-844b-a420be87fd2e","Type":"ContainerDied","Data":"745c0fe526130df4f0ee20ed86ffed0651fdebfe3359ae57f65092f479b1a2c2"} Jan 22 14:18:08 crc kubenswrapper[4743]: I0122 14:18:08.316387 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7pmbs" Jan 22 14:18:08 crc kubenswrapper[4743]: I0122 14:18:08.417044 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6cf93e6b-adef-48fb-844b-a420be87fd2e-inventory\") pod \"6cf93e6b-adef-48fb-844b-a420be87fd2e\" (UID: \"6cf93e6b-adef-48fb-844b-a420be87fd2e\") " Jan 22 14:18:08 crc kubenswrapper[4743]: I0122 14:18:08.417130 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6cf93e6b-adef-48fb-844b-a420be87fd2e-ssh-key-openstack-edpm-ipam\") pod \"6cf93e6b-adef-48fb-844b-a420be87fd2e\" (UID: \"6cf93e6b-adef-48fb-844b-a420be87fd2e\") " Jan 22 14:18:08 crc kubenswrapper[4743]: I0122 14:18:08.417274 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6ln9\" (UniqueName: \"kubernetes.io/projected/6cf93e6b-adef-48fb-844b-a420be87fd2e-kube-api-access-q6ln9\") pod \"6cf93e6b-adef-48fb-844b-a420be87fd2e\" (UID: \"6cf93e6b-adef-48fb-844b-a420be87fd2e\") " Jan 22 14:18:08 crc kubenswrapper[4743]: I0122 14:18:08.423999 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cf93e6b-adef-48fb-844b-a420be87fd2e-kube-api-access-q6ln9" (OuterVolumeSpecName: "kube-api-access-q6ln9") pod "6cf93e6b-adef-48fb-844b-a420be87fd2e" (UID: "6cf93e6b-adef-48fb-844b-a420be87fd2e"). InnerVolumeSpecName "kube-api-access-q6ln9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:18:08 crc kubenswrapper[4743]: I0122 14:18:08.452476 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cf93e6b-adef-48fb-844b-a420be87fd2e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6cf93e6b-adef-48fb-844b-a420be87fd2e" (UID: "6cf93e6b-adef-48fb-844b-a420be87fd2e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:18:08 crc kubenswrapper[4743]: I0122 14:18:08.453602 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cf93e6b-adef-48fb-844b-a420be87fd2e-inventory" (OuterVolumeSpecName: "inventory") pod "6cf93e6b-adef-48fb-844b-a420be87fd2e" (UID: "6cf93e6b-adef-48fb-844b-a420be87fd2e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:18:08 crc kubenswrapper[4743]: I0122 14:18:08.491053 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7x9ml"] Jan 22 14:18:08 crc kubenswrapper[4743]: E0122 14:18:08.491490 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cf93e6b-adef-48fb-844b-a420be87fd2e" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 22 14:18:08 crc kubenswrapper[4743]: I0122 14:18:08.491511 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cf93e6b-adef-48fb-844b-a420be87fd2e" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 22 14:18:08 crc kubenswrapper[4743]: I0122 14:18:08.491748 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cf93e6b-adef-48fb-844b-a420be87fd2e" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 22 14:18:08 crc kubenswrapper[4743]: I0122 14:18:08.494559 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7x9ml" Jan 22 14:18:08 crc kubenswrapper[4743]: I0122 14:18:08.525440 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6cf93e6b-adef-48fb-844b-a420be87fd2e-inventory\") on node \"crc\" DevicePath \"\"" Jan 22 14:18:08 crc kubenswrapper[4743]: I0122 14:18:08.525485 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6cf93e6b-adef-48fb-844b-a420be87fd2e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 22 14:18:08 crc kubenswrapper[4743]: I0122 14:18:08.525503 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6ln9\" (UniqueName: \"kubernetes.io/projected/6cf93e6b-adef-48fb-844b-a420be87fd2e-kube-api-access-q6ln9\") on node \"crc\" DevicePath \"\"" Jan 22 14:18:08 crc kubenswrapper[4743]: I0122 14:18:08.533414 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7x9ml"] Jan 22 14:18:08 crc kubenswrapper[4743]: I0122 14:18:08.626669 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf12d346-55c8-4cac-869e-e0be20f70c96-utilities\") pod \"redhat-operators-7x9ml\" (UID: \"bf12d346-55c8-4cac-869e-e0be20f70c96\") " pod="openshift-marketplace/redhat-operators-7x9ml" Jan 22 14:18:08 crc kubenswrapper[4743]: I0122 14:18:08.626777 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-664v4\" (UniqueName: \"kubernetes.io/projected/bf12d346-55c8-4cac-869e-e0be20f70c96-kube-api-access-664v4\") pod \"redhat-operators-7x9ml\" (UID: \"bf12d346-55c8-4cac-869e-e0be20f70c96\") " pod="openshift-marketplace/redhat-operators-7x9ml" Jan 22 14:18:08 crc kubenswrapper[4743]: I0122 14:18:08.626834 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf12d346-55c8-4cac-869e-e0be20f70c96-catalog-content\") pod \"redhat-operators-7x9ml\" (UID: \"bf12d346-55c8-4cac-869e-e0be20f70c96\") " pod="openshift-marketplace/redhat-operators-7x9ml" Jan 22 14:18:08 crc kubenswrapper[4743]: I0122 14:18:08.728746 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf12d346-55c8-4cac-869e-e0be20f70c96-utilities\") pod \"redhat-operators-7x9ml\" (UID: \"bf12d346-55c8-4cac-869e-e0be20f70c96\") " pod="openshift-marketplace/redhat-operators-7x9ml" Jan 22 14:18:08 crc kubenswrapper[4743]: I0122 14:18:08.728850 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-664v4\" (UniqueName: \"kubernetes.io/projected/bf12d346-55c8-4cac-869e-e0be20f70c96-kube-api-access-664v4\") pod \"redhat-operators-7x9ml\" (UID: \"bf12d346-55c8-4cac-869e-e0be20f70c96\") " pod="openshift-marketplace/redhat-operators-7x9ml" Jan 22 14:18:08 crc kubenswrapper[4743]: I0122 14:18:08.728887 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf12d346-55c8-4cac-869e-e0be20f70c96-catalog-content\") pod \"redhat-operators-7x9ml\" (UID: \"bf12d346-55c8-4cac-869e-e0be20f70c96\") " pod="openshift-marketplace/redhat-operators-7x9ml" Jan 22 14:18:08 crc kubenswrapper[4743]: I0122 14:18:08.729347 4743 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf12d346-55c8-4cac-869e-e0be20f70c96-utilities\") pod \"redhat-operators-7x9ml\" (UID: \"bf12d346-55c8-4cac-869e-e0be20f70c96\") " pod="openshift-marketplace/redhat-operators-7x9ml" Jan 22 14:18:08 crc kubenswrapper[4743]: I0122 14:18:08.729392 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf12d346-55c8-4cac-869e-e0be20f70c96-catalog-content\") pod \"redhat-operators-7x9ml\" (UID: \"bf12d346-55c8-4cac-869e-e0be20f70c96\") " pod="openshift-marketplace/redhat-operators-7x9ml" Jan 22 14:18:08 crc kubenswrapper[4743]: I0122 14:18:08.748687 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-664v4\" (UniqueName: \"kubernetes.io/projected/bf12d346-55c8-4cac-869e-e0be20f70c96-kube-api-access-664v4\") pod \"redhat-operators-7x9ml\" (UID: \"bf12d346-55c8-4cac-869e-e0be20f70c96\") " pod="openshift-marketplace/redhat-operators-7x9ml" Jan 22 14:18:08 crc kubenswrapper[4743]: I0122 14:18:08.843864 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7x9ml" Jan 22 14:18:08 crc kubenswrapper[4743]: I0122 14:18:08.952692 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7pmbs" event={"ID":"6cf93e6b-adef-48fb-844b-a420be87fd2e","Type":"ContainerDied","Data":"64708f9374a8e8c95d25224fc1e46d97832768b7c2a1af6c6ca745bc88d1918c"} Jan 22 14:18:08 crc kubenswrapper[4743]: I0122 14:18:08.953065 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64708f9374a8e8c95d25224fc1e46d97832768b7c2a1af6c6ca745bc88d1918c" Jan 22 14:18:08 crc kubenswrapper[4743]: I0122 14:18:08.954262 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7pmbs" Jan 22 14:18:09 crc kubenswrapper[4743]: I0122 14:18:09.058914 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-jvd8s"] Jan 22 14:18:09 crc kubenswrapper[4743]: I0122 14:18:09.060349 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-jvd8s" Jan 22 14:18:09 crc kubenswrapper[4743]: I0122 14:18:09.063358 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8mcmm" Jan 22 14:18:09 crc kubenswrapper[4743]: I0122 14:18:09.063552 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 22 14:18:09 crc kubenswrapper[4743]: I0122 14:18:09.067283 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 22 14:18:09 crc kubenswrapper[4743]: I0122 14:18:09.067928 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 22 14:18:09 crc kubenswrapper[4743]: I0122 14:18:09.070005 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-jvd8s"] Jan 22 14:18:09 crc kubenswrapper[4743]: I0122 14:18:09.247183 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c41f0818-52ad-4c25-82fa-61a14a9825a1-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-jvd8s\" (UID: \"c41f0818-52ad-4c25-82fa-61a14a9825a1\") " pod="openstack/ssh-known-hosts-edpm-deployment-jvd8s" Jan 22 14:18:09 crc kubenswrapper[4743]: I0122 14:18:09.247261 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/c41f0818-52ad-4c25-82fa-61a14a9825a1-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-jvd8s\" (UID: \"c41f0818-52ad-4c25-82fa-61a14a9825a1\") " pod="openstack/ssh-known-hosts-edpm-deployment-jvd8s" Jan 22 14:18:09 crc kubenswrapper[4743]: I0122 14:18:09.247293 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vknbv\" (UniqueName: \"kubernetes.io/projected/c41f0818-52ad-4c25-82fa-61a14a9825a1-kube-api-access-vknbv\") pod \"ssh-known-hosts-edpm-deployment-jvd8s\" (UID: \"c41f0818-52ad-4c25-82fa-61a14a9825a1\") " pod="openstack/ssh-known-hosts-edpm-deployment-jvd8s" Jan 22 14:18:09 crc kubenswrapper[4743]: I0122 14:18:09.333336 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7x9ml"] Jan 22 14:18:09 crc kubenswrapper[4743]: I0122 14:18:09.348924 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c41f0818-52ad-4c25-82fa-61a14a9825a1-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-jvd8s\" (UID: \"c41f0818-52ad-4c25-82fa-61a14a9825a1\") " pod="openstack/ssh-known-hosts-edpm-deployment-jvd8s" Jan 22 14:18:09 crc kubenswrapper[4743]: I0122 14:18:09.348988 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/c41f0818-52ad-4c25-82fa-61a14a9825a1-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-jvd8s\" (UID: \"c41f0818-52ad-4c25-82fa-61a14a9825a1\") " pod="openstack/ssh-known-hosts-edpm-deployment-jvd8s" Jan 22 14:18:09 crc kubenswrapper[4743]: I0122 14:18:09.349010 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vknbv\" (UniqueName: \"kubernetes.io/projected/c41f0818-52ad-4c25-82fa-61a14a9825a1-kube-api-access-vknbv\") pod 
\"ssh-known-hosts-edpm-deployment-jvd8s\" (UID: \"c41f0818-52ad-4c25-82fa-61a14a9825a1\") " pod="openstack/ssh-known-hosts-edpm-deployment-jvd8s" Jan 22 14:18:09 crc kubenswrapper[4743]: I0122 14:18:09.354996 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/c41f0818-52ad-4c25-82fa-61a14a9825a1-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-jvd8s\" (UID: \"c41f0818-52ad-4c25-82fa-61a14a9825a1\") " pod="openstack/ssh-known-hosts-edpm-deployment-jvd8s" Jan 22 14:18:09 crc kubenswrapper[4743]: I0122 14:18:09.358348 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c41f0818-52ad-4c25-82fa-61a14a9825a1-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-jvd8s\" (UID: \"c41f0818-52ad-4c25-82fa-61a14a9825a1\") " pod="openstack/ssh-known-hosts-edpm-deployment-jvd8s" Jan 22 14:18:09 crc kubenswrapper[4743]: I0122 14:18:09.371634 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vknbv\" (UniqueName: \"kubernetes.io/projected/c41f0818-52ad-4c25-82fa-61a14a9825a1-kube-api-access-vknbv\") pod \"ssh-known-hosts-edpm-deployment-jvd8s\" (UID: \"c41f0818-52ad-4c25-82fa-61a14a9825a1\") " pod="openstack/ssh-known-hosts-edpm-deployment-jvd8s" Jan 22 14:18:09 crc kubenswrapper[4743]: I0122 14:18:09.392446 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-jvd8s" Jan 22 14:18:09 crc kubenswrapper[4743]: I0122 14:18:09.931915 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-jvd8s"] Jan 22 14:18:09 crc kubenswrapper[4743]: I0122 14:18:09.936059 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 22 14:18:09 crc kubenswrapper[4743]: I0122 14:18:09.960610 4743 generic.go:334] "Generic (PLEG): container finished" podID="bf12d346-55c8-4cac-869e-e0be20f70c96" containerID="39a570001422ce0199289942aa74058fdb42a7285fb928fd534be3f9840e6f39" exitCode=0 Jan 22 14:18:09 crc kubenswrapper[4743]: I0122 14:18:09.960681 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7x9ml" event={"ID":"bf12d346-55c8-4cac-869e-e0be20f70c96","Type":"ContainerDied","Data":"39a570001422ce0199289942aa74058fdb42a7285fb928fd534be3f9840e6f39"} Jan 22 14:18:09 crc kubenswrapper[4743]: I0122 14:18:09.960708 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7x9ml" event={"ID":"bf12d346-55c8-4cac-869e-e0be20f70c96","Type":"ContainerStarted","Data":"ab8b9069204cdfc6b0b41d43a6b47d88b9fc63c06d6a9f324f0866fca123aebf"} Jan 22 14:18:09 crc kubenswrapper[4743]: I0122 14:18:09.961597 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-jvd8s" event={"ID":"c41f0818-52ad-4c25-82fa-61a14a9825a1","Type":"ContainerStarted","Data":"27c5be8016ced2cca061c6c2119a064b73c3a99a23ac204b8f08ca54a18c1a9b"} Jan 22 14:18:10 crc kubenswrapper[4743]: I0122 14:18:10.972901 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-jvd8s" event={"ID":"c41f0818-52ad-4c25-82fa-61a14a9825a1","Type":"ContainerStarted","Data":"778a641323030afe26aa859eb4ce25efc9e049cd8adda469fa47dbea014c1fca"} Jan 22 14:18:10 crc kubenswrapper[4743]: I0122 14:18:10.994521 4743 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-jvd8s" podStartSLOduration=1.3101469 podStartE2EDuration="1.994500811s" podCreationTimestamp="2026-01-22 14:18:09 +0000 UTC" firstStartedPulling="2026-01-22 14:18:09.935872195 +0000 UTC m=+1926.490915358" lastFinishedPulling="2026-01-22 14:18:10.620226086 +0000 UTC m=+1927.175269269" observedRunningTime="2026-01-22 14:18:10.990862143 +0000 UTC m=+1927.545905306" watchObservedRunningTime="2026-01-22 14:18:10.994500811 +0000 UTC m=+1927.549543994" Jan 22 14:18:11 crc kubenswrapper[4743]: I0122 14:18:11.982714 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7x9ml" event={"ID":"bf12d346-55c8-4cac-869e-e0be20f70c96","Type":"ContainerStarted","Data":"da268f2da1b0bc155a34eeb0d906b296ed8b51df20272c838b716b237063b183"} Jan 22 14:18:17 crc kubenswrapper[4743]: I0122 14:18:17.029484 4743 generic.go:334] "Generic (PLEG): container finished" podID="bf12d346-55c8-4cac-869e-e0be20f70c96" containerID="da268f2da1b0bc155a34eeb0d906b296ed8b51df20272c838b716b237063b183" exitCode=0 Jan 22 14:18:17 crc kubenswrapper[4743]: I0122 14:18:17.029566 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7x9ml" event={"ID":"bf12d346-55c8-4cac-869e-e0be20f70c96","Type":"ContainerDied","Data":"da268f2da1b0bc155a34eeb0d906b296ed8b51df20272c838b716b237063b183"} Jan 22 14:18:18 crc kubenswrapper[4743]: I0122 14:18:18.040300 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7x9ml" event={"ID":"bf12d346-55c8-4cac-869e-e0be20f70c96","Type":"ContainerStarted","Data":"b4e62c28220e0d0da367665c84df454c28a0465a7c92ec48bd7a03c941d7a532"} Jan 22 14:18:18 crc kubenswrapper[4743]: I0122 14:18:18.041978 4743 generic.go:334] "Generic (PLEG): container finished" podID="c41f0818-52ad-4c25-82fa-61a14a9825a1" containerID="778a641323030afe26aa859eb4ce25efc9e049cd8adda469fa47dbea014c1fca" exitCode=0 Jan 22 14:18:18 crc kubenswrapper[4743]: I0122 14:18:18.042017 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-jvd8s" event={"ID":"c41f0818-52ad-4c25-82fa-61a14a9825a1","Type":"ContainerDied","Data":"778a641323030afe26aa859eb4ce25efc9e049cd8adda469fa47dbea014c1fca"} Jan 22 14:18:18 crc kubenswrapper[4743]: I0122 14:18:18.070259 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7x9ml" podStartSLOduration=2.451453248 podStartE2EDuration="10.070236023s" podCreationTimestamp="2026-01-22 14:18:08 +0000 UTC" firstStartedPulling="2026-01-22 14:18:09.962319148 +0000 UTC m=+1926.517362311" lastFinishedPulling="2026-01-22 14:18:17.581101923 +0000 UTC m=+1934.136145086" observedRunningTime="2026-01-22 14:18:18.05934372 +0000 UTC m=+1934.614386883" watchObservedRunningTime="2026-01-22 14:18:18.070236023 +0000 UTC m=+1934.625279186" Jan 22 14:18:18 crc kubenswrapper[4743]: I0122 14:18:18.845527 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7x9ml" Jan 22 14:18:18 crc kubenswrapper[4743]: I0122 14:18:18.845577 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7x9ml" Jan 22 14:18:19 crc kubenswrapper[4743]: I0122 14:18:19.425012 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-jvd8s" Jan 22 14:18:19 crc kubenswrapper[4743]: I0122 14:18:19.545582 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/c41f0818-52ad-4c25-82fa-61a14a9825a1-inventory-0\") pod \"c41f0818-52ad-4c25-82fa-61a14a9825a1\" (UID: \"c41f0818-52ad-4c25-82fa-61a14a9825a1\") " Jan 22 14:18:19 crc kubenswrapper[4743]: I0122 14:18:19.545960 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vknbv\" (UniqueName: \"kubernetes.io/projected/c41f0818-52ad-4c25-82fa-61a14a9825a1-kube-api-access-vknbv\") pod \"c41f0818-52ad-4c25-82fa-61a14a9825a1\" (UID: \"c41f0818-52ad-4c25-82fa-61a14a9825a1\") " Jan 22 14:18:19 crc kubenswrapper[4743]: I0122 14:18:19.546265 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c41f0818-52ad-4c25-82fa-61a14a9825a1-ssh-key-openstack-edpm-ipam\") pod \"c41f0818-52ad-4c25-82fa-61a14a9825a1\" (UID: \"c41f0818-52ad-4c25-82fa-61a14a9825a1\") " Jan 22 14:18:19 crc kubenswrapper[4743]: I0122 14:18:19.553064 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c41f0818-52ad-4c25-82fa-61a14a9825a1-kube-api-access-vknbv" (OuterVolumeSpecName: "kube-api-access-vknbv") pod "c41f0818-52ad-4c25-82fa-61a14a9825a1" (UID: "c41f0818-52ad-4c25-82fa-61a14a9825a1"). InnerVolumeSpecName "kube-api-access-vknbv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:18:19 crc kubenswrapper[4743]: I0122 14:18:19.574592 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c41f0818-52ad-4c25-82fa-61a14a9825a1-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "c41f0818-52ad-4c25-82fa-61a14a9825a1" (UID: "c41f0818-52ad-4c25-82fa-61a14a9825a1"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:18:19 crc kubenswrapper[4743]: I0122 14:18:19.584685 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c41f0818-52ad-4c25-82fa-61a14a9825a1-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "c41f0818-52ad-4c25-82fa-61a14a9825a1" (UID: "c41f0818-52ad-4c25-82fa-61a14a9825a1"). InnerVolumeSpecName "inventory-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:18:19 crc kubenswrapper[4743]: I0122 14:18:19.648430 4743 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/c41f0818-52ad-4c25-82fa-61a14a9825a1-inventory-0\") on node \"crc\" DevicePath \"\"" Jan 22 14:18:19 crc kubenswrapper[4743]: I0122 14:18:19.648650 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vknbv\" (UniqueName: \"kubernetes.io/projected/c41f0818-52ad-4c25-82fa-61a14a9825a1-kube-api-access-vknbv\") on node \"crc\" DevicePath \"\"" Jan 22 14:18:19 crc kubenswrapper[4743]: I0122 14:18:19.648661 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c41f0818-52ad-4c25-82fa-61a14a9825a1-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 22 14:18:19 crc kubenswrapper[4743]: I0122 14:18:19.893996 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7x9ml" podUID="bf12d346-55c8-4cac-869e-e0be20f70c96" containerName="registry-server" probeResult="failure" output=< Jan 22 14:18:19 crc kubenswrapper[4743]: timeout: failed to connect service ":50051" within 1s Jan 22 14:18:19 crc kubenswrapper[4743]: > Jan 22 14:18:20 crc kubenswrapper[4743]: I0122 14:18:20.062766 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-jvd8s" event={"ID":"c41f0818-52ad-4c25-82fa-61a14a9825a1","Type":"ContainerDied","Data":"27c5be8016ced2cca061c6c2119a064b73c3a99a23ac204b8f08ca54a18c1a9b"} Jan 22 14:18:20 crc kubenswrapper[4743]: I0122 14:18:20.062824 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27c5be8016ced2cca061c6c2119a064b73c3a99a23ac204b8f08ca54a18c1a9b" Jan 22 14:18:20 crc kubenswrapper[4743]: I0122 14:18:20.062850 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-jvd8s" Jan 22 14:18:20 crc kubenswrapper[4743]: I0122 14:18:20.177092 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-f6dtw"] Jan 22 14:18:20 crc kubenswrapper[4743]: E0122 14:18:20.177600 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c41f0818-52ad-4c25-82fa-61a14a9825a1" containerName="ssh-known-hosts-edpm-deployment" Jan 22 14:18:20 crc kubenswrapper[4743]: I0122 14:18:20.177625 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="c41f0818-52ad-4c25-82fa-61a14a9825a1" containerName="ssh-known-hosts-edpm-deployment" Jan 22 14:18:20 crc kubenswrapper[4743]: I0122 14:18:20.178047 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="c41f0818-52ad-4c25-82fa-61a14a9825a1" containerName="ssh-known-hosts-edpm-deployment" Jan 22 14:18:20 crc kubenswrapper[4743]: I0122 14:18:20.178958 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-f6dtw" Jan 22 14:18:20 crc kubenswrapper[4743]: I0122 14:18:20.181581 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 22 14:18:20 crc kubenswrapper[4743]: I0122 14:18:20.181846 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8mcmm" Jan 22 14:18:20 crc kubenswrapper[4743]: I0122 14:18:20.182085 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 22 14:18:20 crc kubenswrapper[4743]: I0122 14:18:20.185715 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 22 14:18:20 crc kubenswrapper[4743]: I0122 14:18:20.198594 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-f6dtw"] Jan 22 14:18:20 crc kubenswrapper[4743]: I0122 14:18:20.258242 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xktl\" (UniqueName: \"kubernetes.io/projected/62ef8bcc-609a-4fe6-a41d-48200e08b72f-kube-api-access-9xktl\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-f6dtw\" (UID: \"62ef8bcc-609a-4fe6-a41d-48200e08b72f\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-f6dtw" Jan 22 14:18:20 crc kubenswrapper[4743]: I0122 14:18:20.258563 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62ef8bcc-609a-4fe6-a41d-48200e08b72f-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-f6dtw\" (UID: \"62ef8bcc-609a-4fe6-a41d-48200e08b72f\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-f6dtw" Jan 22 14:18:20 crc kubenswrapper[4743]: I0122 14:18:20.258827 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/62ef8bcc-609a-4fe6-a41d-48200e08b72f-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-f6dtw\" (UID: \"62ef8bcc-609a-4fe6-a41d-48200e08b72f\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-f6dtw" Jan 22 14:18:20 crc kubenswrapper[4743]: I0122 14:18:20.360812 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62ef8bcc-609a-4fe6-a41d-48200e08b72f-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-f6dtw\" (UID: \"62ef8bcc-609a-4fe6-a41d-48200e08b72f\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-f6dtw" Jan 22 14:18:20 crc kubenswrapper[4743]: I0122 14:18:20.360944 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/62ef8bcc-609a-4fe6-a41d-48200e08b72f-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-f6dtw\" (UID: \"62ef8bcc-609a-4fe6-a41d-48200e08b72f\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-f6dtw" Jan 22 14:18:20 crc kubenswrapper[4743]: I0122 14:18:20.361003 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xktl\" (UniqueName: \"kubernetes.io/projected/62ef8bcc-609a-4fe6-a41d-48200e08b72f-kube-api-access-9xktl\") pod 
\"run-os-edpm-deployment-openstack-edpm-ipam-f6dtw\" (UID: \"62ef8bcc-609a-4fe6-a41d-48200e08b72f\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-f6dtw" Jan 22 14:18:20 crc kubenswrapper[4743]: I0122 14:18:20.365500 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/62ef8bcc-609a-4fe6-a41d-48200e08b72f-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-f6dtw\" (UID: \"62ef8bcc-609a-4fe6-a41d-48200e08b72f\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-f6dtw" Jan 22 14:18:20 crc kubenswrapper[4743]: I0122 14:18:20.366245 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62ef8bcc-609a-4fe6-a41d-48200e08b72f-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-f6dtw\" (UID: \"62ef8bcc-609a-4fe6-a41d-48200e08b72f\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-f6dtw" Jan 22 14:18:20 crc kubenswrapper[4743]: I0122 14:18:20.378934 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xktl\" (UniqueName: \"kubernetes.io/projected/62ef8bcc-609a-4fe6-a41d-48200e08b72f-kube-api-access-9xktl\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-f6dtw\" (UID: \"62ef8bcc-609a-4fe6-a41d-48200e08b72f\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-f6dtw" Jan 22 14:18:20 crc kubenswrapper[4743]: I0122 14:18:20.514758 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-f6dtw" Jan 22 14:18:21 crc kubenswrapper[4743]: I0122 14:18:21.028516 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-f6dtw"] Jan 22 14:18:21 crc kubenswrapper[4743]: W0122 14:18:21.040511 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62ef8bcc_609a_4fe6_a41d_48200e08b72f.slice/crio-a4b3d8004f56566be90ba8c27708e89650837952f6c338f5b59e1fd424f61b93 WatchSource:0}: Error finding container a4b3d8004f56566be90ba8c27708e89650837952f6c338f5b59e1fd424f61b93: Status 404 returned error can't find the container with id a4b3d8004f56566be90ba8c27708e89650837952f6c338f5b59e1fd424f61b93 Jan 22 14:18:21 crc kubenswrapper[4743]: I0122 14:18:21.072415 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-f6dtw" event={"ID":"62ef8bcc-609a-4fe6-a41d-48200e08b72f","Type":"ContainerStarted","Data":"a4b3d8004f56566be90ba8c27708e89650837952f6c338f5b59e1fd424f61b93"} Jan 22 14:18:22 crc kubenswrapper[4743]: I0122 14:18:22.084251 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-f6dtw" event={"ID":"62ef8bcc-609a-4fe6-a41d-48200e08b72f","Type":"ContainerStarted","Data":"7683e04d78c3e49a9ba5ebf51c38cbe6668bac588c0d01123f20fa2d923fd8cb"} Jan 22 14:18:22 crc kubenswrapper[4743]: I0122 14:18:22.108433 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-f6dtw" podStartSLOduration=1.6953836450000002 podStartE2EDuration="2.108411405s" podCreationTimestamp="2026-01-22 14:18:20 +0000 UTC" firstStartedPulling="2026-01-22 14:18:21.049550413 +0000 UTC m=+1937.604593566" lastFinishedPulling="2026-01-22 14:18:21.462578163 +0000 UTC m=+1938.017621326" 
observedRunningTime="2026-01-22 14:18:22.103076971 +0000 UTC m=+1938.658120134" watchObservedRunningTime="2026-01-22 14:18:22.108411405 +0000 UTC m=+1938.663454558" Jan 22 14:18:28 crc kubenswrapper[4743]: I0122 14:18:28.887754 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7x9ml" Jan 22 14:18:28 crc kubenswrapper[4743]: I0122 14:18:28.949771 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7x9ml" Jan 22 14:18:29 crc kubenswrapper[4743]: I0122 14:18:29.127703 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7x9ml"] Jan 22 14:18:30 crc kubenswrapper[4743]: I0122 14:18:30.049296 4743 patch_prober.go:28] interesting pod/machine-config-daemon-hqgk7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 14:18:30 crc kubenswrapper[4743]: I0122 14:18:30.049377 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 14:18:30 crc kubenswrapper[4743]: I0122 14:18:30.147426 4743 generic.go:334] "Generic (PLEG): container finished" podID="62ef8bcc-609a-4fe6-a41d-48200e08b72f" containerID="7683e04d78c3e49a9ba5ebf51c38cbe6668bac588c0d01123f20fa2d923fd8cb" exitCode=0 Jan 22 14:18:30 crc kubenswrapper[4743]: I0122 14:18:30.147572 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-f6dtw" event={"ID":"62ef8bcc-609a-4fe6-a41d-48200e08b72f","Type":"ContainerDied","Data":"7683e04d78c3e49a9ba5ebf51c38cbe6668bac588c0d01123f20fa2d923fd8cb"} Jan 22 14:18:30 crc kubenswrapper[4743]: I0122 14:18:30.147659 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7x9ml" podUID="bf12d346-55c8-4cac-869e-e0be20f70c96" containerName="registry-server" containerID="cri-o://b4e62c28220e0d0da367665c84df454c28a0465a7c92ec48bd7a03c941d7a532" gracePeriod=2 Jan 22 14:18:30 crc kubenswrapper[4743]: I0122 14:18:30.555464 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7x9ml" Jan 22 14:18:30 crc kubenswrapper[4743]: I0122 14:18:30.644422 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf12d346-55c8-4cac-869e-e0be20f70c96-catalog-content\") pod \"bf12d346-55c8-4cac-869e-e0be20f70c96\" (UID: \"bf12d346-55c8-4cac-869e-e0be20f70c96\") " Jan 22 14:18:30 crc kubenswrapper[4743]: I0122 14:18:30.644478 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-664v4\" (UniqueName: \"kubernetes.io/projected/bf12d346-55c8-4cac-869e-e0be20f70c96-kube-api-access-664v4\") pod \"bf12d346-55c8-4cac-869e-e0be20f70c96\" (UID: \"bf12d346-55c8-4cac-869e-e0be20f70c96\") " Jan 22 14:18:30 crc kubenswrapper[4743]: I0122 14:18:30.644507 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf12d346-55c8-4cac-869e-e0be20f70c96-utilities\") pod \"bf12d346-55c8-4cac-869e-e0be20f70c96\" (UID: \"bf12d346-55c8-4cac-869e-e0be20f70c96\") " Jan 22 14:18:30 crc kubenswrapper[4743]: I0122 14:18:30.646056 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf12d346-55c8-4cac-869e-e0be20f70c96-utilities" (OuterVolumeSpecName: "utilities") pod "bf12d346-55c8-4cac-869e-e0be20f70c96" (UID: "bf12d346-55c8-4cac-869e-e0be20f70c96"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:18:30 crc kubenswrapper[4743]: I0122 14:18:30.650220 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf12d346-55c8-4cac-869e-e0be20f70c96-kube-api-access-664v4" (OuterVolumeSpecName: "kube-api-access-664v4") pod "bf12d346-55c8-4cac-869e-e0be20f70c96" (UID: "bf12d346-55c8-4cac-869e-e0be20f70c96"). InnerVolumeSpecName "kube-api-access-664v4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:18:30 crc kubenswrapper[4743]: I0122 14:18:30.748055 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-664v4\" (UniqueName: \"kubernetes.io/projected/bf12d346-55c8-4cac-869e-e0be20f70c96-kube-api-access-664v4\") on node \"crc\" DevicePath \"\"" Jan 22 14:18:30 crc kubenswrapper[4743]: I0122 14:18:30.748088 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf12d346-55c8-4cac-869e-e0be20f70c96-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 14:18:30 crc kubenswrapper[4743]: I0122 14:18:30.754934 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf12d346-55c8-4cac-869e-e0be20f70c96-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bf12d346-55c8-4cac-869e-e0be20f70c96" (UID: "bf12d346-55c8-4cac-869e-e0be20f70c96"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:18:30 crc kubenswrapper[4743]: I0122 14:18:30.850561 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf12d346-55c8-4cac-869e-e0be20f70c96-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 14:18:31 crc kubenswrapper[4743]: I0122 14:18:31.158963 4743 generic.go:334] "Generic (PLEG): container finished" podID="bf12d346-55c8-4cac-869e-e0be20f70c96" containerID="b4e62c28220e0d0da367665c84df454c28a0465a7c92ec48bd7a03c941d7a532" exitCode=0 Jan 22 14:18:31 crc kubenswrapper[4743]: I0122 14:18:31.159011 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7x9ml" event={"ID":"bf12d346-55c8-4cac-869e-e0be20f70c96","Type":"ContainerDied","Data":"b4e62c28220e0d0da367665c84df454c28a0465a7c92ec48bd7a03c941d7a532"} Jan 22 14:18:31 crc kubenswrapper[4743]: I0122 14:18:31.159069 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7x9ml" event={"ID":"bf12d346-55c8-4cac-869e-e0be20f70c96","Type":"ContainerDied","Data":"ab8b9069204cdfc6b0b41d43a6b47d88b9fc63c06d6a9f324f0866fca123aebf"} Jan 22 14:18:31 crc kubenswrapper[4743]: I0122 14:18:31.159071 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7x9ml" Jan 22 14:18:31 crc kubenswrapper[4743]: I0122 14:18:31.159088 4743 scope.go:117] "RemoveContainer" containerID="b4e62c28220e0d0da367665c84df454c28a0465a7c92ec48bd7a03c941d7a532" Jan 22 14:18:31 crc kubenswrapper[4743]: I0122 14:18:31.195869 4743 scope.go:117] "RemoveContainer" containerID="da268f2da1b0bc155a34eeb0d906b296ed8b51df20272c838b716b237063b183" Jan 22 14:18:31 crc kubenswrapper[4743]: I0122 14:18:31.213731 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7x9ml"] Jan 22 14:18:31 crc kubenswrapper[4743]: I0122 14:18:31.224208 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7x9ml"] Jan 22 14:18:31 crc kubenswrapper[4743]: I0122 14:18:31.238951 4743 scope.go:117] "RemoveContainer" containerID="39a570001422ce0199289942aa74058fdb42a7285fb928fd534be3f9840e6f39" Jan 22 14:18:31 crc kubenswrapper[4743]: I0122 14:18:31.270192 4743 scope.go:117] "RemoveContainer" containerID="b4e62c28220e0d0da367665c84df454c28a0465a7c92ec48bd7a03c941d7a532" Jan 22 14:18:31 crc kubenswrapper[4743]: E0122 14:18:31.270650 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4e62c28220e0d0da367665c84df454c28a0465a7c92ec48bd7a03c941d7a532\": container with ID starting with b4e62c28220e0d0da367665c84df454c28a0465a7c92ec48bd7a03c941d7a532 not found: ID does not exist" containerID="b4e62c28220e0d0da367665c84df454c28a0465a7c92ec48bd7a03c941d7a532" Jan 22 14:18:31 crc kubenswrapper[4743]: I0122 14:18:31.270687 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4e62c28220e0d0da367665c84df454c28a0465a7c92ec48bd7a03c941d7a532"} err="failed to get container status \"b4e62c28220e0d0da367665c84df454c28a0465a7c92ec48bd7a03c941d7a532\": rpc error: code = NotFound desc = could not find container \"b4e62c28220e0d0da367665c84df454c28a0465a7c92ec48bd7a03c941d7a532\": container with ID starting with b4e62c28220e0d0da367665c84df454c28a0465a7c92ec48bd7a03c941d7a532 not found: ID does not exist" Jan 22 14:18:31 crc 
kubenswrapper[4743]: I0122 14:18:31.270716 4743 scope.go:117] "RemoveContainer" containerID="da268f2da1b0bc155a34eeb0d906b296ed8b51df20272c838b716b237063b183" Jan 22 14:18:31 crc kubenswrapper[4743]: E0122 14:18:31.271096 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da268f2da1b0bc155a34eeb0d906b296ed8b51df20272c838b716b237063b183\": container with ID starting with da268f2da1b0bc155a34eeb0d906b296ed8b51df20272c838b716b237063b183 not found: ID does not exist" containerID="da268f2da1b0bc155a34eeb0d906b296ed8b51df20272c838b716b237063b183" Jan 22 14:18:31 crc kubenswrapper[4743]: I0122 14:18:31.271148 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da268f2da1b0bc155a34eeb0d906b296ed8b51df20272c838b716b237063b183"} err="failed to get container status \"da268f2da1b0bc155a34eeb0d906b296ed8b51df20272c838b716b237063b183\": rpc error: code = NotFound desc = could not find container \"da268f2da1b0bc155a34eeb0d906b296ed8b51df20272c838b716b237063b183\": container with ID starting with da268f2da1b0bc155a34eeb0d906b296ed8b51df20272c838b716b237063b183 not found: ID does not exist" Jan 22 14:18:31 crc kubenswrapper[4743]: I0122 14:18:31.271183 4743 scope.go:117] "RemoveContainer" containerID="39a570001422ce0199289942aa74058fdb42a7285fb928fd534be3f9840e6f39" Jan 22 14:18:31 crc kubenswrapper[4743]: E0122 14:18:31.271617 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39a570001422ce0199289942aa74058fdb42a7285fb928fd534be3f9840e6f39\": container with ID starting with 39a570001422ce0199289942aa74058fdb42a7285fb928fd534be3f9840e6f39 not found: ID does not exist" containerID="39a570001422ce0199289942aa74058fdb42a7285fb928fd534be3f9840e6f39" Jan 22 14:18:31 crc kubenswrapper[4743]: I0122 14:18:31.271651 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39a570001422ce0199289942aa74058fdb42a7285fb928fd534be3f9840e6f39"} err="failed to get container status \"39a570001422ce0199289942aa74058fdb42a7285fb928fd534be3f9840e6f39\": rpc error: code = NotFound desc = could not find container \"39a570001422ce0199289942aa74058fdb42a7285fb928fd534be3f9840e6f39\": container with ID starting with 39a570001422ce0199289942aa74058fdb42a7285fb928fd534be3f9840e6f39 not found: ID does not exist" Jan 22 14:18:31 crc kubenswrapper[4743]: I0122 14:18:31.557481 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-f6dtw" Jan 22 14:18:31 crc kubenswrapper[4743]: I0122 14:18:31.665199 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62ef8bcc-609a-4fe6-a41d-48200e08b72f-inventory\") pod \"62ef8bcc-609a-4fe6-a41d-48200e08b72f\" (UID: \"62ef8bcc-609a-4fe6-a41d-48200e08b72f\") " Jan 22 14:18:31 crc kubenswrapper[4743]: I0122 14:18:31.665688 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/62ef8bcc-609a-4fe6-a41d-48200e08b72f-ssh-key-openstack-edpm-ipam\") pod \"62ef8bcc-609a-4fe6-a41d-48200e08b72f\" (UID: \"62ef8bcc-609a-4fe6-a41d-48200e08b72f\") " Jan 22 14:18:31 crc kubenswrapper[4743]: I0122 14:18:31.665839 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xktl\" (UniqueName: \"kubernetes.io/projected/62ef8bcc-609a-4fe6-a41d-48200e08b72f-kube-api-access-9xktl\") pod \"62ef8bcc-609a-4fe6-a41d-48200e08b72f\" (UID: \"62ef8bcc-609a-4fe6-a41d-48200e08b72f\") " Jan 22 14:18:31 crc kubenswrapper[4743]: I0122 14:18:31.675084 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62ef8bcc-609a-4fe6-a41d-48200e08b72f-kube-api-access-9xktl" (OuterVolumeSpecName: "kube-api-access-9xktl") pod "62ef8bcc-609a-4fe6-a41d-48200e08b72f" (UID: "62ef8bcc-609a-4fe6-a41d-48200e08b72f"). InnerVolumeSpecName "kube-api-access-9xktl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:18:31 crc kubenswrapper[4743]: I0122 14:18:31.702037 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62ef8bcc-609a-4fe6-a41d-48200e08b72f-inventory" (OuterVolumeSpecName: "inventory") pod "62ef8bcc-609a-4fe6-a41d-48200e08b72f" (UID: "62ef8bcc-609a-4fe6-a41d-48200e08b72f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:18:31 crc kubenswrapper[4743]: I0122 14:18:31.708236 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62ef8bcc-609a-4fe6-a41d-48200e08b72f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "62ef8bcc-609a-4fe6-a41d-48200e08b72f" (UID: "62ef8bcc-609a-4fe6-a41d-48200e08b72f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:18:31 crc kubenswrapper[4743]: I0122 14:18:31.759725 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf12d346-55c8-4cac-869e-e0be20f70c96" path="/var/lib/kubelet/pods/bf12d346-55c8-4cac-869e-e0be20f70c96/volumes" Jan 22 14:18:31 crc kubenswrapper[4743]: I0122 14:18:31.767695 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/62ef8bcc-609a-4fe6-a41d-48200e08b72f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 22 14:18:31 crc kubenswrapper[4743]: I0122 14:18:31.767895 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xktl\" (UniqueName: \"kubernetes.io/projected/62ef8bcc-609a-4fe6-a41d-48200e08b72f-kube-api-access-9xktl\") on node \"crc\" DevicePath \"\"" Jan 22 14:18:31 crc kubenswrapper[4743]: I0122 14:18:31.767978 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62ef8bcc-609a-4fe6-a41d-48200e08b72f-inventory\") on node \"crc\" DevicePath \"\"" Jan 22 14:18:32 crc kubenswrapper[4743]: I0122 14:18:32.175529 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-f6dtw" event={"ID":"62ef8bcc-609a-4fe6-a41d-48200e08b72f","Type":"ContainerDied","Data":"a4b3d8004f56566be90ba8c27708e89650837952f6c338f5b59e1fd424f61b93"} Jan 22 14:18:32 crc kubenswrapper[4743]: I0122 14:18:32.175556 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-f6dtw" Jan 22 14:18:32 crc kubenswrapper[4743]: I0122 14:18:32.175580 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4b3d8004f56566be90ba8c27708e89650837952f6c338f5b59e1fd424f61b93" Jan 22 14:18:32 crc kubenswrapper[4743]: I0122 14:18:32.246574 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dxthp"] Jan 22 14:18:32 crc kubenswrapper[4743]: E0122 14:18:32.247030 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62ef8bcc-609a-4fe6-a41d-48200e08b72f" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 22 14:18:32 crc kubenswrapper[4743]: I0122 14:18:32.247051 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="62ef8bcc-609a-4fe6-a41d-48200e08b72f" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 22 14:18:32 crc kubenswrapper[4743]: E0122 14:18:32.247093 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf12d346-55c8-4cac-869e-e0be20f70c96" containerName="extract-content" Jan 22 14:18:32 crc kubenswrapper[4743]: I0122 14:18:32.247102 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf12d346-55c8-4cac-869e-e0be20f70c96" containerName="extract-content" Jan 22 14:18:32 crc kubenswrapper[4743]: E0122 14:18:32.247118 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf12d346-55c8-4cac-869e-e0be20f70c96" containerName="extract-utilities" Jan 22 14:18:32 crc kubenswrapper[4743]: I0122 14:18:32.247126 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf12d346-55c8-4cac-869e-e0be20f70c96" containerName="extract-utilities" Jan 22 14:18:32 crc kubenswrapper[4743]: E0122 14:18:32.247147 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf12d346-55c8-4cac-869e-e0be20f70c96" containerName="registry-server" Jan 22 14:18:32 crc 
kubenswrapper[4743]: I0122 14:18:32.247154 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf12d346-55c8-4cac-869e-e0be20f70c96" containerName="registry-server" Jan 22 14:18:32 crc kubenswrapper[4743]: I0122 14:18:32.247372 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="62ef8bcc-609a-4fe6-a41d-48200e08b72f" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 22 14:18:32 crc kubenswrapper[4743]: I0122 14:18:32.247387 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf12d346-55c8-4cac-869e-e0be20f70c96" containerName="registry-server" Jan 22 14:18:32 crc kubenswrapper[4743]: I0122 14:18:32.248272 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dxthp" Jan 22 14:18:32 crc kubenswrapper[4743]: I0122 14:18:32.250555 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 22 14:18:32 crc kubenswrapper[4743]: I0122 14:18:32.253763 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 22 14:18:32 crc kubenswrapper[4743]: I0122 14:18:32.253762 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8mcmm" Jan 22 14:18:32 crc kubenswrapper[4743]: I0122 14:18:32.254119 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 22 14:18:32 crc kubenswrapper[4743]: I0122 14:18:32.260129 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dxthp"] Jan 22 14:18:32 crc kubenswrapper[4743]: I0122 14:18:32.276341 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/010d8c84-1843-4e5c-85b8-b39df20a58fd-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-dxthp\" (UID: \"010d8c84-1843-4e5c-85b8-b39df20a58fd\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dxthp" Jan 22 14:18:32 crc kubenswrapper[4743]: I0122 14:18:32.276493 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vmfr\" (UniqueName: \"kubernetes.io/projected/010d8c84-1843-4e5c-85b8-b39df20a58fd-kube-api-access-5vmfr\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-dxthp\" (UID: \"010d8c84-1843-4e5c-85b8-b39df20a58fd\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dxthp" Jan 22 14:18:32 crc kubenswrapper[4743]: I0122 14:18:32.276600 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/010d8c84-1843-4e5c-85b8-b39df20a58fd-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-dxthp\" (UID: \"010d8c84-1843-4e5c-85b8-b39df20a58fd\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dxthp" Jan 22 14:18:32 crc kubenswrapper[4743]: I0122 14:18:32.379549 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/010d8c84-1843-4e5c-85b8-b39df20a58fd-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-dxthp\" (UID: \"010d8c84-1843-4e5c-85b8-b39df20a58fd\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dxthp" Jan 22 14:18:32 crc 
kubenswrapper[4743]: I0122 14:18:32.380093 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vmfr\" (UniqueName: \"kubernetes.io/projected/010d8c84-1843-4e5c-85b8-b39df20a58fd-kube-api-access-5vmfr\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-dxthp\" (UID: \"010d8c84-1843-4e5c-85b8-b39df20a58fd\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dxthp" Jan 22 14:18:32 crc kubenswrapper[4743]: I0122 14:18:32.380198 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/010d8c84-1843-4e5c-85b8-b39df20a58fd-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-dxthp\" (UID: \"010d8c84-1843-4e5c-85b8-b39df20a58fd\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dxthp" Jan 22 14:18:32 crc kubenswrapper[4743]: I0122 14:18:32.384167 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/010d8c84-1843-4e5c-85b8-b39df20a58fd-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-dxthp\" (UID: \"010d8c84-1843-4e5c-85b8-b39df20a58fd\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dxthp" Jan 22 14:18:32 crc kubenswrapper[4743]: I0122 14:18:32.384634 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/010d8c84-1843-4e5c-85b8-b39df20a58fd-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-dxthp\" (UID: \"010d8c84-1843-4e5c-85b8-b39df20a58fd\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dxthp" Jan 22 14:18:32 crc kubenswrapper[4743]: I0122 14:18:32.395495 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vmfr\" (UniqueName: \"kubernetes.io/projected/010d8c84-1843-4e5c-85b8-b39df20a58fd-kube-api-access-5vmfr\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-dxthp\" (UID: \"010d8c84-1843-4e5c-85b8-b39df20a58fd\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dxthp" Jan 22 14:18:32 crc kubenswrapper[4743]: I0122 14:18:32.573279 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dxthp" Jan 22 14:18:33 crc kubenswrapper[4743]: I0122 14:18:33.065780 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dxthp"] Jan 22 14:18:33 crc kubenswrapper[4743]: I0122 14:18:33.185683 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dxthp" event={"ID":"010d8c84-1843-4e5c-85b8-b39df20a58fd","Type":"ContainerStarted","Data":"f54e72b6d2accbb53b39c3b81b7550885345821dbc80de1bd59905396ce106d7"} Jan 22 14:18:34 crc kubenswrapper[4743]: I0122 14:18:34.194021 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dxthp" event={"ID":"010d8c84-1843-4e5c-85b8-b39df20a58fd","Type":"ContainerStarted","Data":"e304ce05b66fa87356e78cc454b6f80d3eb2a78405ecde5d0522ebcbd3bedd08"} Jan 22 14:18:43 crc kubenswrapper[4743]: I0122 14:18:43.274282 4743 generic.go:334] "Generic (PLEG): container finished" podID="010d8c84-1843-4e5c-85b8-b39df20a58fd" containerID="e304ce05b66fa87356e78cc454b6f80d3eb2a78405ecde5d0522ebcbd3bedd08" exitCode=0 Jan 22 14:18:43 crc kubenswrapper[4743]: I0122 14:18:43.274383 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dxthp" event={"ID":"010d8c84-1843-4e5c-85b8-b39df20a58fd","Type":"ContainerDied","Data":"e304ce05b66fa87356e78cc454b6f80d3eb2a78405ecde5d0522ebcbd3bedd08"} Jan 22 14:18:44 crc kubenswrapper[4743]: I0122 14:18:44.607720 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dxthp" Jan 22 14:18:44 crc kubenswrapper[4743]: I0122 14:18:44.715253 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/010d8c84-1843-4e5c-85b8-b39df20a58fd-inventory\") pod \"010d8c84-1843-4e5c-85b8-b39df20a58fd\" (UID: \"010d8c84-1843-4e5c-85b8-b39df20a58fd\") " Jan 22 14:18:44 crc kubenswrapper[4743]: I0122 14:18:44.715391 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vmfr\" (UniqueName: \"kubernetes.io/projected/010d8c84-1843-4e5c-85b8-b39df20a58fd-kube-api-access-5vmfr\") pod \"010d8c84-1843-4e5c-85b8-b39df20a58fd\" (UID: \"010d8c84-1843-4e5c-85b8-b39df20a58fd\") " Jan 22 14:18:44 crc kubenswrapper[4743]: I0122 14:18:44.715514 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/010d8c84-1843-4e5c-85b8-b39df20a58fd-ssh-key-openstack-edpm-ipam\") pod \"010d8c84-1843-4e5c-85b8-b39df20a58fd\" (UID: \"010d8c84-1843-4e5c-85b8-b39df20a58fd\") " Jan 22 14:18:44 crc kubenswrapper[4743]: I0122 14:18:44.726396 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/010d8c84-1843-4e5c-85b8-b39df20a58fd-kube-api-access-5vmfr" (OuterVolumeSpecName: "kube-api-access-5vmfr") pod "010d8c84-1843-4e5c-85b8-b39df20a58fd" (UID: "010d8c84-1843-4e5c-85b8-b39df20a58fd"). InnerVolumeSpecName "kube-api-access-5vmfr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:18:44 crc kubenswrapper[4743]: I0122 14:18:44.746087 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/010d8c84-1843-4e5c-85b8-b39df20a58fd-inventory" (OuterVolumeSpecName: "inventory") pod "010d8c84-1843-4e5c-85b8-b39df20a58fd" (UID: "010d8c84-1843-4e5c-85b8-b39df20a58fd"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:18:44 crc kubenswrapper[4743]: I0122 14:18:44.753827 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/010d8c84-1843-4e5c-85b8-b39df20a58fd-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "010d8c84-1843-4e5c-85b8-b39df20a58fd" (UID: "010d8c84-1843-4e5c-85b8-b39df20a58fd"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:18:44 crc kubenswrapper[4743]: I0122 14:18:44.819021 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/010d8c84-1843-4e5c-85b8-b39df20a58fd-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 22 14:18:44 crc kubenswrapper[4743]: I0122 14:18:44.819055 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/010d8c84-1843-4e5c-85b8-b39df20a58fd-inventory\") on node \"crc\" DevicePath \"\"" Jan 22 14:18:44 crc kubenswrapper[4743]: I0122 14:18:44.819065 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vmfr\" (UniqueName: \"kubernetes.io/projected/010d8c84-1843-4e5c-85b8-b39df20a58fd-kube-api-access-5vmfr\") on node \"crc\" DevicePath \"\"" Jan 22 14:18:45 crc kubenswrapper[4743]: I0122 14:18:45.294064 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dxthp" event={"ID":"010d8c84-1843-4e5c-85b8-b39df20a58fd","Type":"ContainerDied","Data":"f54e72b6d2accbb53b39c3b81b7550885345821dbc80de1bd59905396ce106d7"} Jan 22 14:18:45 crc kubenswrapper[4743]: I0122 14:18:45.294107 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f54e72b6d2accbb53b39c3b81b7550885345821dbc80de1bd59905396ce106d7" Jan 22 14:18:45 crc kubenswrapper[4743]: I0122 14:18:45.294167 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dxthp" Jan 22 14:18:45 crc kubenswrapper[4743]: I0122 14:18:45.379698 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt"] Jan 22 14:18:45 crc kubenswrapper[4743]: E0122 14:18:45.380187 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="010d8c84-1843-4e5c-85b8-b39df20a58fd" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 22 14:18:45 crc kubenswrapper[4743]: I0122 14:18:45.380207 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="010d8c84-1843-4e5c-85b8-b39df20a58fd" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 22 14:18:45 crc kubenswrapper[4743]: I0122 14:18:45.380450 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="010d8c84-1843-4e5c-85b8-b39df20a58fd" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 22 14:18:45 crc kubenswrapper[4743]: I0122 14:18:45.381151 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt" Jan 22 14:18:45 crc kubenswrapper[4743]: I0122 14:18:45.383566 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8mcmm" Jan 22 14:18:45 crc kubenswrapper[4743]: I0122 14:18:45.383718 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Jan 22 14:18:45 crc kubenswrapper[4743]: I0122 14:18:45.383880 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 22 14:18:45 crc kubenswrapper[4743]: I0122 14:18:45.384085 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Jan 22 14:18:45 crc kubenswrapper[4743]: I0122 14:18:45.384218 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 22 14:18:45 crc kubenswrapper[4743]: I0122 14:18:45.384381 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Jan 22 14:18:45 crc kubenswrapper[4743]: I0122 14:18:45.385353 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Jan 22 14:18:45 crc kubenswrapper[4743]: I0122 14:18:45.385993 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 22 14:18:45 crc kubenswrapper[4743]: I0122 14:18:45.394535 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt"] Jan 22 14:18:45 crc kubenswrapper[4743]: I0122 14:18:45.428144 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/beb6ea36-21b3-4658-a609-7ecace4d6efc-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt\" (UID: \"beb6ea36-21b3-4658-a609-7ecace4d6efc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt" Jan 22 14:18:45 crc kubenswrapper[4743]: I0122 14:18:45.428185 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/beb6ea36-21b3-4658-a609-7ecace4d6efc-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt\" (UID: \"beb6ea36-21b3-4658-a609-7ecace4d6efc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt" Jan 22 14:18:45 crc kubenswrapper[4743]: I0122 14:18:45.428209 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/beb6ea36-21b3-4658-a609-7ecace4d6efc-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt\" (UID: \"beb6ea36-21b3-4658-a609-7ecace4d6efc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt" Jan 22 14:18:45 crc kubenswrapper[4743]: I0122 14:18:45.428286 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cc99m\" (UniqueName: \"kubernetes.io/projected/beb6ea36-21b3-4658-a609-7ecace4d6efc-kube-api-access-cc99m\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt\" (UID: \"beb6ea36-21b3-4658-a609-7ecace4d6efc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt" Jan 22 14:18:45 crc kubenswrapper[4743]: I0122 14:18:45.428336 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/beb6ea36-21b3-4658-a609-7ecace4d6efc-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt\" (UID: \"beb6ea36-21b3-4658-a609-7ecace4d6efc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt" Jan 22 14:18:45 crc kubenswrapper[4743]: I0122 14:18:45.428479 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/beb6ea36-21b3-4658-a609-7ecace4d6efc-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt\" (UID: \"beb6ea36-21b3-4658-a609-7ecace4d6efc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt" Jan 22 14:18:45 crc kubenswrapper[4743]: I0122 14:18:45.428560 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/beb6ea36-21b3-4658-a609-7ecace4d6efc-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt\" (UID: \"beb6ea36-21b3-4658-a609-7ecace4d6efc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt" Jan 22 14:18:45 crc kubenswrapper[4743]: I0122 14:18:45.428674 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/beb6ea36-21b3-4658-a609-7ecace4d6efc-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt\" (UID: \"beb6ea36-21b3-4658-a609-7ecace4d6efc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt" Jan 22 14:18:45 crc kubenswrapper[4743]: I0122 14:18:45.428727 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/beb6ea36-21b3-4658-a609-7ecace4d6efc-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt\" (UID: \"beb6ea36-21b3-4658-a609-7ecace4d6efc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt" Jan 22 14:18:45 crc kubenswrapper[4743]: I0122 14:18:45.428752 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/beb6ea36-21b3-4658-a609-7ecace4d6efc-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt\" (UID: \"beb6ea36-21b3-4658-a609-7ecace4d6efc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt" Jan 22 14:18:45 crc kubenswrapper[4743]: I0122 14:18:45.428778 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/beb6ea36-21b3-4658-a609-7ecace4d6efc-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt\" (UID: 
\"beb6ea36-21b3-4658-a609-7ecace4d6efc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt" Jan 22 14:18:45 crc kubenswrapper[4743]: I0122 14:18:45.428922 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/beb6ea36-21b3-4658-a609-7ecace4d6efc-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt\" (UID: \"beb6ea36-21b3-4658-a609-7ecace4d6efc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt" Jan 22 14:18:45 crc kubenswrapper[4743]: I0122 14:18:45.428970 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/beb6ea36-21b3-4658-a609-7ecace4d6efc-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt\" (UID: \"beb6ea36-21b3-4658-a609-7ecace4d6efc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt" Jan 22 14:18:45 crc kubenswrapper[4743]: I0122 14:18:45.428994 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/beb6ea36-21b3-4658-a609-7ecace4d6efc-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt\" (UID: \"beb6ea36-21b3-4658-a609-7ecace4d6efc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt" Jan 22 14:18:45 crc kubenswrapper[4743]: I0122 14:18:45.530908 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/beb6ea36-21b3-4658-a609-7ecace4d6efc-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt\" (UID: \"beb6ea36-21b3-4658-a609-7ecace4d6efc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt" Jan 22 14:18:45 crc kubenswrapper[4743]: I0122 14:18:45.530957 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/beb6ea36-21b3-4658-a609-7ecace4d6efc-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt\" (UID: \"beb6ea36-21b3-4658-a609-7ecace4d6efc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt" Jan 22 14:18:45 crc kubenswrapper[4743]: I0122 14:18:45.530977 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/beb6ea36-21b3-4658-a609-7ecace4d6efc-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt\" (UID: \"beb6ea36-21b3-4658-a609-7ecace4d6efc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt" Jan 22 14:18:45 crc kubenswrapper[4743]: I0122 14:18:45.531067 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/beb6ea36-21b3-4658-a609-7ecace4d6efc-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt\" (UID: \"beb6ea36-21b3-4658-a609-7ecace4d6efc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt" Jan 22 14:18:45 crc kubenswrapper[4743]: I0122 
14:18:45.531089 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/beb6ea36-21b3-4658-a609-7ecace4d6efc-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt\" (UID: \"beb6ea36-21b3-4658-a609-7ecace4d6efc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt" Jan 22 14:18:45 crc kubenswrapper[4743]: I0122 14:18:45.531111 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/beb6ea36-21b3-4658-a609-7ecace4d6efc-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt\" (UID: \"beb6ea36-21b3-4658-a609-7ecace4d6efc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt" Jan 22 14:18:45 crc kubenswrapper[4743]: I0122 14:18:45.531135 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cc99m\" (UniqueName: \"kubernetes.io/projected/beb6ea36-21b3-4658-a609-7ecace4d6efc-kube-api-access-cc99m\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt\" (UID: \"beb6ea36-21b3-4658-a609-7ecace4d6efc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt" Jan 22 14:18:45 crc kubenswrapper[4743]: I0122 14:18:45.531168 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/beb6ea36-21b3-4658-a609-7ecace4d6efc-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt\" (UID: \"beb6ea36-21b3-4658-a609-7ecace4d6efc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt" Jan 22 14:18:45 crc kubenswrapper[4743]: I0122 14:18:45.531214 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/beb6ea36-21b3-4658-a609-7ecace4d6efc-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt\" (UID: \"beb6ea36-21b3-4658-a609-7ecace4d6efc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt" Jan 22 14:18:45 crc kubenswrapper[4743]: I0122 14:18:45.531254 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/beb6ea36-21b3-4658-a609-7ecace4d6efc-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt\" (UID: \"beb6ea36-21b3-4658-a609-7ecace4d6efc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt" Jan 22 14:18:45 crc kubenswrapper[4743]: I0122 14:18:45.531312 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/beb6ea36-21b3-4658-a609-7ecace4d6efc-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt\" (UID: \"beb6ea36-21b3-4658-a609-7ecace4d6efc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt" Jan 22 14:18:45 crc kubenswrapper[4743]: I0122 14:18:45.531349 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/beb6ea36-21b3-4658-a609-7ecace4d6efc-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt\" (UID: \"beb6ea36-21b3-4658-a609-7ecace4d6efc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt" Jan 22 14:18:45 crc kubenswrapper[4743]: I0122 14:18:45.531375 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/beb6ea36-21b3-4658-a609-7ecace4d6efc-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt\" (UID: \"beb6ea36-21b3-4658-a609-7ecace4d6efc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt" Jan 22 14:18:45 crc kubenswrapper[4743]: I0122 14:18:45.531399 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/beb6ea36-21b3-4658-a609-7ecace4d6efc-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt\" (UID: \"beb6ea36-21b3-4658-a609-7ecace4d6efc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt" Jan 22 14:18:45 crc kubenswrapper[4743]: I0122 14:18:45.538148 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/beb6ea36-21b3-4658-a609-7ecace4d6efc-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt\" (UID: \"beb6ea36-21b3-4658-a609-7ecace4d6efc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt" Jan 22 14:18:45 crc kubenswrapper[4743]: I0122 14:18:45.538595 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/beb6ea36-21b3-4658-a609-7ecace4d6efc-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt\" (UID: \"beb6ea36-21b3-4658-a609-7ecace4d6efc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt" Jan 22 14:18:45 crc kubenswrapper[4743]: I0122 14:18:45.538710 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/beb6ea36-21b3-4658-a609-7ecace4d6efc-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt\" (UID: \"beb6ea36-21b3-4658-a609-7ecace4d6efc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt" Jan 22 14:18:45 crc kubenswrapper[4743]: I0122 14:18:45.538771 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/beb6ea36-21b3-4658-a609-7ecace4d6efc-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt\" (UID: \"beb6ea36-21b3-4658-a609-7ecace4d6efc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt" Jan 22 14:18:45 crc kubenswrapper[4743]: I0122 14:18:45.539185 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/beb6ea36-21b3-4658-a609-7ecace4d6efc-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt\" (UID: \"beb6ea36-21b3-4658-a609-7ecace4d6efc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt" Jan 22 14:18:45 crc kubenswrapper[4743]: I0122 14:18:45.539197 
4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/beb6ea36-21b3-4658-a609-7ecace4d6efc-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt\" (UID: \"beb6ea36-21b3-4658-a609-7ecace4d6efc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt" Jan 22 14:18:45 crc kubenswrapper[4743]: I0122 14:18:45.539531 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/beb6ea36-21b3-4658-a609-7ecace4d6efc-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt\" (UID: \"beb6ea36-21b3-4658-a609-7ecace4d6efc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt" Jan 22 14:18:45 crc kubenswrapper[4743]: I0122 14:18:45.540861 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/beb6ea36-21b3-4658-a609-7ecace4d6efc-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt\" (UID: \"beb6ea36-21b3-4658-a609-7ecace4d6efc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt" Jan 22 14:18:45 crc kubenswrapper[4743]: I0122 14:18:45.541061 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/beb6ea36-21b3-4658-a609-7ecace4d6efc-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt\" (UID: \"beb6ea36-21b3-4658-a609-7ecace4d6efc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt" Jan 22 14:18:45 crc kubenswrapper[4743]: I0122 14:18:45.541200 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/beb6ea36-21b3-4658-a609-7ecace4d6efc-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt\" (UID: \"beb6ea36-21b3-4658-a609-7ecace4d6efc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt" Jan 22 14:18:45 crc kubenswrapper[4743]: I0122 14:18:45.543261 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/beb6ea36-21b3-4658-a609-7ecace4d6efc-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt\" (UID: \"beb6ea36-21b3-4658-a609-7ecace4d6efc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt" Jan 22 14:18:45 crc kubenswrapper[4743]: I0122 14:18:45.552477 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/beb6ea36-21b3-4658-a609-7ecace4d6efc-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt\" (UID: \"beb6ea36-21b3-4658-a609-7ecace4d6efc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt" Jan 22 14:18:45 crc kubenswrapper[4743]: I0122 14:18:45.553431 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cc99m\" (UniqueName: \"kubernetes.io/projected/beb6ea36-21b3-4658-a609-7ecace4d6efc-kube-api-access-cc99m\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt\" (UID: \"beb6ea36-21b3-4658-a609-7ecace4d6efc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt" Jan 22 14:18:45 crc kubenswrapper[4743]: I0122 14:18:45.554318 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/beb6ea36-21b3-4658-a609-7ecace4d6efc-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt\" (UID: \"beb6ea36-21b3-4658-a609-7ecace4d6efc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt" Jan 22 14:18:45 crc kubenswrapper[4743]: I0122 14:18:45.699547 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt" Jan 22 14:18:46 crc kubenswrapper[4743]: W0122 14:18:46.209478 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbeb6ea36_21b3_4658_a609_7ecace4d6efc.slice/crio-c04337536b137174ab3773d84dd9f7aef450b0583ac1dfaaddc6ceca5e7af749 WatchSource:0}: Error finding container c04337536b137174ab3773d84dd9f7aef450b0583ac1dfaaddc6ceca5e7af749: Status 404 returned error can't find the container with id c04337536b137174ab3773d84dd9f7aef450b0583ac1dfaaddc6ceca5e7af749 Jan 22 14:18:46 crc kubenswrapper[4743]: I0122 14:18:46.213425 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt"] Jan 22 14:18:46 crc kubenswrapper[4743]: I0122 14:18:46.304235 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt" event={"ID":"beb6ea36-21b3-4658-a609-7ecace4d6efc","Type":"ContainerStarted","Data":"c04337536b137174ab3773d84dd9f7aef450b0583ac1dfaaddc6ceca5e7af749"} Jan 22 14:18:47 crc kubenswrapper[4743]: I0122 14:18:47.313843 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt" event={"ID":"beb6ea36-21b3-4658-a609-7ecace4d6efc","Type":"ContainerStarted","Data":"c3e201cde211edede2550bd7d0bf53cb72bdad0009efd947d7bddeb1b3ab885b"} Jan 22 14:18:47 crc kubenswrapper[4743]: I0122 14:18:47.337681 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt" podStartSLOduration=1.846372193 podStartE2EDuration="2.337657841s" podCreationTimestamp="2026-01-22 14:18:45 +0000 UTC" firstStartedPulling="2026-01-22 14:18:46.212039921 +0000 UTC m=+1962.767083084" lastFinishedPulling="2026-01-22 14:18:46.703325569 +0000 UTC m=+1963.258368732" observedRunningTime="2026-01-22 14:18:47.329214573 +0000 UTC m=+1963.884257736" watchObservedRunningTime="2026-01-22 14:18:47.337657841 +0000 UTC m=+1963.892701004" Jan 22 14:19:00 crc kubenswrapper[4743]: I0122 14:19:00.048748 4743 patch_prober.go:28] interesting pod/machine-config-daemon-hqgk7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 14:19:00 crc kubenswrapper[4743]: I0122 14:19:00.049344 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 14:19:00 crc kubenswrapper[4743]: I0122 14:19:00.049406 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" Jan 22 14:19:00 crc kubenswrapper[4743]: I0122 14:19:00.050261 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1af7ecf77f9c044f696a147946738a6ed62f5bd006bf111f8137da5ced5ddcc7"} pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 22 14:19:00 crc kubenswrapper[4743]: I0122 14:19:00.050330 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerName="machine-config-daemon" containerID="cri-o://1af7ecf77f9c044f696a147946738a6ed62f5bd006bf111f8137da5ced5ddcc7" gracePeriod=600 Jan 22 14:19:00 crc kubenswrapper[4743]: I0122 14:19:00.459363 4743 generic.go:334] "Generic (PLEG): container finished" podID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerID="1af7ecf77f9c044f696a147946738a6ed62f5bd006bf111f8137da5ced5ddcc7" exitCode=0 Jan 22 14:19:00 crc kubenswrapper[4743]: I0122 14:19:00.459464 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" event={"ID":"3aeba9ba-3a5a-4885-8540-d295aadb311b","Type":"ContainerDied","Data":"1af7ecf77f9c044f696a147946738a6ed62f5bd006bf111f8137da5ced5ddcc7"} Jan 22 14:19:00 crc kubenswrapper[4743]: I0122 14:19:00.460028 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" event={"ID":"3aeba9ba-3a5a-4885-8540-d295aadb311b","Type":"ContainerStarted","Data":"4e338cfb7048ed5b16851b239153eb46a9d70fc49b34c9091316d3e3baa03fe0"} Jan 22 14:19:00 crc kubenswrapper[4743]: I0122 14:19:00.460066 4743 scope.go:117] "RemoveContainer" containerID="1e51a1e4567f044d1ba62e15ca7baaab6948208e9d675c95afad01e5e442ca14" Jan 22 14:19:25 crc kubenswrapper[4743]: I0122 14:19:25.658507 4743 generic.go:334] "Generic (PLEG): container finished" podID="beb6ea36-21b3-4658-a609-7ecace4d6efc" containerID="c3e201cde211edede2550bd7d0bf53cb72bdad0009efd947d7bddeb1b3ab885b" exitCode=0 Jan 22 14:19:25 crc kubenswrapper[4743]: I0122 14:19:25.658555 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt" event={"ID":"beb6ea36-21b3-4658-a609-7ecace4d6efc","Type":"ContainerDied","Data":"c3e201cde211edede2550bd7d0bf53cb72bdad0009efd947d7bddeb1b3ab885b"} Jan 22 14:19:27 crc kubenswrapper[4743]: I0122 14:19:27.052469 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt" Jan 22 14:19:27 crc kubenswrapper[4743]: I0122 14:19:27.172673 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/beb6ea36-21b3-4658-a609-7ecace4d6efc-neutron-metadata-combined-ca-bundle\") pod \"beb6ea36-21b3-4658-a609-7ecace4d6efc\" (UID: \"beb6ea36-21b3-4658-a609-7ecace4d6efc\") " Jan 22 14:19:27 crc kubenswrapper[4743]: I0122 14:19:27.172715 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/beb6ea36-21b3-4658-a609-7ecace4d6efc-telemetry-combined-ca-bundle\") pod \"beb6ea36-21b3-4658-a609-7ecace4d6efc\" (UID: \"beb6ea36-21b3-4658-a609-7ecace4d6efc\") " Jan 22 14:19:27 crc kubenswrapper[4743]: I0122 14:19:27.172815 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cc99m\" (UniqueName: \"kubernetes.io/projected/beb6ea36-21b3-4658-a609-7ecace4d6efc-kube-api-access-cc99m\") pod \"beb6ea36-21b3-4658-a609-7ecace4d6efc\" (UID: \"beb6ea36-21b3-4658-a609-7ecace4d6efc\") " Jan 22 14:19:27 crc kubenswrapper[4743]: I0122 14:19:27.172885 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/beb6ea36-21b3-4658-a609-7ecace4d6efc-libvirt-combined-ca-bundle\") pod \"beb6ea36-21b3-4658-a609-7ecace4d6efc\" (UID: \"beb6ea36-21b3-4658-a609-7ecace4d6efc\") " Jan 22 14:19:27 crc kubenswrapper[4743]: I0122 14:19:27.172908 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/beb6ea36-21b3-4658-a609-7ecace4d6efc-ovn-combined-ca-bundle\") pod \"beb6ea36-21b3-4658-a609-7ecace4d6efc\" (UID: \"beb6ea36-21b3-4658-a609-7ecace4d6efc\") " Jan 22 14:19:27 crc kubenswrapper[4743]: I0122 14:19:27.172951 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/beb6ea36-21b3-4658-a609-7ecace4d6efc-ssh-key-openstack-edpm-ipam\") pod \"beb6ea36-21b3-4658-a609-7ecace4d6efc\" (UID: \"beb6ea36-21b3-4658-a609-7ecace4d6efc\") " Jan 22 14:19:27 crc kubenswrapper[4743]: I0122 14:19:27.172980 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/beb6ea36-21b3-4658-a609-7ecace4d6efc-openstack-edpm-ipam-ovn-default-certs-0\") pod \"beb6ea36-21b3-4658-a609-7ecace4d6efc\" (UID: \"beb6ea36-21b3-4658-a609-7ecace4d6efc\") " Jan 22 14:19:27 crc kubenswrapper[4743]: I0122 14:19:27.173022 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/beb6ea36-21b3-4658-a609-7ecace4d6efc-repo-setup-combined-ca-bundle\") pod \"beb6ea36-21b3-4658-a609-7ecace4d6efc\" (UID: \"beb6ea36-21b3-4658-a609-7ecace4d6efc\") " Jan 22 14:19:27 crc kubenswrapper[4743]: I0122 14:19:27.173047 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/beb6ea36-21b3-4658-a609-7ecace4d6efc-nova-combined-ca-bundle\") pod \"beb6ea36-21b3-4658-a609-7ecace4d6efc\" (UID: \"beb6ea36-21b3-4658-a609-7ecace4d6efc\") " Jan 22 14:19:27 crc kubenswrapper[4743]: 
I0122 14:19:27.173120 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/beb6ea36-21b3-4658-a609-7ecace4d6efc-inventory\") pod \"beb6ea36-21b3-4658-a609-7ecace4d6efc\" (UID: \"beb6ea36-21b3-4658-a609-7ecace4d6efc\") " Jan 22 14:19:27 crc kubenswrapper[4743]: I0122 14:19:27.173187 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/beb6ea36-21b3-4658-a609-7ecace4d6efc-bootstrap-combined-ca-bundle\") pod \"beb6ea36-21b3-4658-a609-7ecace4d6efc\" (UID: \"beb6ea36-21b3-4658-a609-7ecace4d6efc\") " Jan 22 14:19:27 crc kubenswrapper[4743]: I0122 14:19:27.173234 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/beb6ea36-21b3-4658-a609-7ecace4d6efc-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"beb6ea36-21b3-4658-a609-7ecace4d6efc\" (UID: \"beb6ea36-21b3-4658-a609-7ecace4d6efc\") " Jan 22 14:19:27 crc kubenswrapper[4743]: I0122 14:19:27.173258 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/beb6ea36-21b3-4658-a609-7ecace4d6efc-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"beb6ea36-21b3-4658-a609-7ecace4d6efc\" (UID: \"beb6ea36-21b3-4658-a609-7ecace4d6efc\") " Jan 22 14:19:27 crc kubenswrapper[4743]: I0122 14:19:27.173318 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/beb6ea36-21b3-4658-a609-7ecace4d6efc-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"beb6ea36-21b3-4658-a609-7ecace4d6efc\" (UID: \"beb6ea36-21b3-4658-a609-7ecace4d6efc\") " Jan 22 14:19:27 crc kubenswrapper[4743]: I0122 14:19:27.179899 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/beb6ea36-21b3-4658-a609-7ecace4d6efc-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "beb6ea36-21b3-4658-a609-7ecace4d6efc" (UID: "beb6ea36-21b3-4658-a609-7ecace4d6efc"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:19:27 crc kubenswrapper[4743]: I0122 14:19:27.180000 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/beb6ea36-21b3-4658-a609-7ecace4d6efc-kube-api-access-cc99m" (OuterVolumeSpecName: "kube-api-access-cc99m") pod "beb6ea36-21b3-4658-a609-7ecace4d6efc" (UID: "beb6ea36-21b3-4658-a609-7ecace4d6efc"). InnerVolumeSpecName "kube-api-access-cc99m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:19:27 crc kubenswrapper[4743]: I0122 14:19:27.180524 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/beb6ea36-21b3-4658-a609-7ecace4d6efc-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "beb6ea36-21b3-4658-a609-7ecace4d6efc" (UID: "beb6ea36-21b3-4658-a609-7ecace4d6efc"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:19:27 crc kubenswrapper[4743]: I0122 14:19:27.180678 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/beb6ea36-21b3-4658-a609-7ecace4d6efc-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "beb6ea36-21b3-4658-a609-7ecace4d6efc" (UID: "beb6ea36-21b3-4658-a609-7ecace4d6efc"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:19:27 crc kubenswrapper[4743]: I0122 14:19:27.181509 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/beb6ea36-21b3-4658-a609-7ecace4d6efc-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "beb6ea36-21b3-4658-a609-7ecace4d6efc" (UID: "beb6ea36-21b3-4658-a609-7ecace4d6efc"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:19:27 crc kubenswrapper[4743]: I0122 14:19:27.181521 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/beb6ea36-21b3-4658-a609-7ecace4d6efc-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "beb6ea36-21b3-4658-a609-7ecace4d6efc" (UID: "beb6ea36-21b3-4658-a609-7ecace4d6efc"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:19:27 crc kubenswrapper[4743]: I0122 14:19:27.182267 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/beb6ea36-21b3-4658-a609-7ecace4d6efc-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "beb6ea36-21b3-4658-a609-7ecace4d6efc" (UID: "beb6ea36-21b3-4658-a609-7ecace4d6efc"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:19:27 crc kubenswrapper[4743]: I0122 14:19:27.182453 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/beb6ea36-21b3-4658-a609-7ecace4d6efc-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "beb6ea36-21b3-4658-a609-7ecace4d6efc" (UID: "beb6ea36-21b3-4658-a609-7ecace4d6efc"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:19:27 crc kubenswrapper[4743]: I0122 14:19:27.182702 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/beb6ea36-21b3-4658-a609-7ecace4d6efc-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "beb6ea36-21b3-4658-a609-7ecace4d6efc" (UID: "beb6ea36-21b3-4658-a609-7ecace4d6efc"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:19:27 crc kubenswrapper[4743]: I0122 14:19:27.187238 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/beb6ea36-21b3-4658-a609-7ecace4d6efc-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "beb6ea36-21b3-4658-a609-7ecace4d6efc" (UID: "beb6ea36-21b3-4658-a609-7ecace4d6efc"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:19:27 crc kubenswrapper[4743]: I0122 14:19:27.192004 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/beb6ea36-21b3-4658-a609-7ecace4d6efc-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "beb6ea36-21b3-4658-a609-7ecace4d6efc" (UID: "beb6ea36-21b3-4658-a609-7ecace4d6efc"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:19:27 crc kubenswrapper[4743]: I0122 14:19:27.195001 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/beb6ea36-21b3-4658-a609-7ecace4d6efc-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "beb6ea36-21b3-4658-a609-7ecace4d6efc" (UID: "beb6ea36-21b3-4658-a609-7ecace4d6efc"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:19:27 crc kubenswrapper[4743]: I0122 14:19:27.207169 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/beb6ea36-21b3-4658-a609-7ecace4d6efc-inventory" (OuterVolumeSpecName: "inventory") pod "beb6ea36-21b3-4658-a609-7ecace4d6efc" (UID: "beb6ea36-21b3-4658-a609-7ecace4d6efc"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:19:27 crc kubenswrapper[4743]: I0122 14:19:27.216191 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/beb6ea36-21b3-4658-a609-7ecace4d6efc-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "beb6ea36-21b3-4658-a609-7ecace4d6efc" (UID: "beb6ea36-21b3-4658-a609-7ecace4d6efc"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:19:27 crc kubenswrapper[4743]: I0122 14:19:27.277395 4743 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/beb6ea36-21b3-4658-a609-7ecace4d6efc-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 14:19:27 crc kubenswrapper[4743]: I0122 14:19:27.277738 4743 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/beb6ea36-21b3-4658-a609-7ecace4d6efc-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 22 14:19:27 crc kubenswrapper[4743]: I0122 14:19:27.277757 4743 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/beb6ea36-21b3-4658-a609-7ecace4d6efc-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 22 14:19:27 crc kubenswrapper[4743]: I0122 14:19:27.277775 4743 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/beb6ea36-21b3-4658-a609-7ecace4d6efc-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 22 14:19:27 crc kubenswrapper[4743]: I0122 14:19:27.277809 4743 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/beb6ea36-21b3-4658-a609-7ecace4d6efc-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 14:19:27 crc kubenswrapper[4743]: I0122 14:19:27.277824 4743 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/beb6ea36-21b3-4658-a609-7ecace4d6efc-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 14:19:27 crc kubenswrapper[4743]: I0122 14:19:27.277837 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cc99m\" (UniqueName: \"kubernetes.io/projected/beb6ea36-21b3-4658-a609-7ecace4d6efc-kube-api-access-cc99m\") on node \"crc\" DevicePath \"\"" Jan 22 14:19:27 crc kubenswrapper[4743]: I0122 14:19:27.277851 4743 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/beb6ea36-21b3-4658-a609-7ecace4d6efc-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 14:19:27 crc kubenswrapper[4743]: I0122 14:19:27.277864 4743 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/beb6ea36-21b3-4658-a609-7ecace4d6efc-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 14:19:27 crc kubenswrapper[4743]: I0122 14:19:27.277876 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/beb6ea36-21b3-4658-a609-7ecace4d6efc-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 22 14:19:27 crc kubenswrapper[4743]: I0122 14:19:27.277889 4743 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/beb6ea36-21b3-4658-a609-7ecace4d6efc-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 22 14:19:27 crc kubenswrapper[4743]: I0122 14:19:27.277900 4743 reconciler_common.go:293] "Volume detached for volume 
\"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/beb6ea36-21b3-4658-a609-7ecace4d6efc-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 14:19:27 crc kubenswrapper[4743]: I0122 14:19:27.277913 4743 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/beb6ea36-21b3-4658-a609-7ecace4d6efc-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 14:19:27 crc kubenswrapper[4743]: I0122 14:19:27.277927 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/beb6ea36-21b3-4658-a609-7ecace4d6efc-inventory\") on node \"crc\" DevicePath \"\"" Jan 22 14:19:27 crc kubenswrapper[4743]: I0122 14:19:27.679084 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt" event={"ID":"beb6ea36-21b3-4658-a609-7ecace4d6efc","Type":"ContainerDied","Data":"c04337536b137174ab3773d84dd9f7aef450b0583ac1dfaaddc6ceca5e7af749"} Jan 22 14:19:27 crc kubenswrapper[4743]: I0122 14:19:27.679132 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c04337536b137174ab3773d84dd9f7aef450b0583ac1dfaaddc6ceca5e7af749" Jan 22 14:19:27 crc kubenswrapper[4743]: I0122 14:19:27.679177 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt" Jan 22 14:19:27 crc kubenswrapper[4743]: I0122 14:19:27.826235 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-nm84h"] Jan 22 14:19:27 crc kubenswrapper[4743]: E0122 14:19:27.826690 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="beb6ea36-21b3-4658-a609-7ecace4d6efc" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Jan 22 14:19:27 crc kubenswrapper[4743]: I0122 14:19:27.826718 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="beb6ea36-21b3-4658-a609-7ecace4d6efc" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Jan 22 14:19:27 crc kubenswrapper[4743]: I0122 14:19:27.826970 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="beb6ea36-21b3-4658-a609-7ecace4d6efc" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Jan 22 14:19:27 crc kubenswrapper[4743]: I0122 14:19:27.827668 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nm84h" Jan 22 14:19:27 crc kubenswrapper[4743]: I0122 14:19:27.838377 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-nm84h"] Jan 22 14:19:27 crc kubenswrapper[4743]: I0122 14:19:27.862467 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 22 14:19:27 crc kubenswrapper[4743]: I0122 14:19:27.862478 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 22 14:19:27 crc kubenswrapper[4743]: I0122 14:19:27.862652 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Jan 22 14:19:27 crc kubenswrapper[4743]: I0122 14:19:27.862887 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 22 14:19:27 crc kubenswrapper[4743]: I0122 14:19:27.866685 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8mcmm" Jan 22 14:19:27 crc kubenswrapper[4743]: I0122 14:19:27.887851 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99d2edf2-043a-4066-9d64-36be28d2197d-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nm84h\" (UID: \"99d2edf2-043a-4066-9d64-36be28d2197d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nm84h" Jan 22 14:19:27 crc kubenswrapper[4743]: I0122 14:19:27.887922 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/99d2edf2-043a-4066-9d64-36be28d2197d-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nm84h\" (UID: \"99d2edf2-043a-4066-9d64-36be28d2197d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nm84h" Jan 22 14:19:27 crc kubenswrapper[4743]: I0122 14:19:27.887954 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/99d2edf2-043a-4066-9d64-36be28d2197d-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nm84h\" (UID: \"99d2edf2-043a-4066-9d64-36be28d2197d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nm84h" Jan 22 14:19:27 crc kubenswrapper[4743]: I0122 14:19:27.888240 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qsqb\" (UniqueName: \"kubernetes.io/projected/99d2edf2-043a-4066-9d64-36be28d2197d-kube-api-access-7qsqb\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nm84h\" (UID: \"99d2edf2-043a-4066-9d64-36be28d2197d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nm84h" Jan 22 14:19:27 crc kubenswrapper[4743]: I0122 14:19:27.888318 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/99d2edf2-043a-4066-9d64-36be28d2197d-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nm84h\" (UID: \"99d2edf2-043a-4066-9d64-36be28d2197d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nm84h" Jan 22 14:19:27 crc kubenswrapper[4743]: I0122 14:19:27.990658 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-7qsqb\" (UniqueName: \"kubernetes.io/projected/99d2edf2-043a-4066-9d64-36be28d2197d-kube-api-access-7qsqb\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nm84h\" (UID: \"99d2edf2-043a-4066-9d64-36be28d2197d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nm84h" Jan 22 14:19:27 crc kubenswrapper[4743]: I0122 14:19:27.990746 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/99d2edf2-043a-4066-9d64-36be28d2197d-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nm84h\" (UID: \"99d2edf2-043a-4066-9d64-36be28d2197d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nm84h" Jan 22 14:19:27 crc kubenswrapper[4743]: I0122 14:19:27.990844 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99d2edf2-043a-4066-9d64-36be28d2197d-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nm84h\" (UID: \"99d2edf2-043a-4066-9d64-36be28d2197d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nm84h" Jan 22 14:19:27 crc kubenswrapper[4743]: I0122 14:19:27.990897 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/99d2edf2-043a-4066-9d64-36be28d2197d-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nm84h\" (UID: \"99d2edf2-043a-4066-9d64-36be28d2197d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nm84h" Jan 22 14:19:27 crc kubenswrapper[4743]: I0122 14:19:27.990924 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/99d2edf2-043a-4066-9d64-36be28d2197d-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nm84h\" (UID: \"99d2edf2-043a-4066-9d64-36be28d2197d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nm84h" Jan 22 14:19:27 crc kubenswrapper[4743]: I0122 14:19:27.994348 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/99d2edf2-043a-4066-9d64-36be28d2197d-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nm84h\" (UID: \"99d2edf2-043a-4066-9d64-36be28d2197d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nm84h" Jan 22 14:19:27 crc kubenswrapper[4743]: I0122 14:19:27.996690 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/99d2edf2-043a-4066-9d64-36be28d2197d-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nm84h\" (UID: \"99d2edf2-043a-4066-9d64-36be28d2197d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nm84h" Jan 22 14:19:27 crc kubenswrapper[4743]: I0122 14:19:27.999431 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99d2edf2-043a-4066-9d64-36be28d2197d-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nm84h\" (UID: \"99d2edf2-043a-4066-9d64-36be28d2197d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nm84h" Jan 22 14:19:28 crc kubenswrapper[4743]: I0122 14:19:28.002982 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/99d2edf2-043a-4066-9d64-36be28d2197d-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nm84h\" (UID: \"99d2edf2-043a-4066-9d64-36be28d2197d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nm84h" Jan 22 14:19:28 crc kubenswrapper[4743]: I0122 14:19:28.015691 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qsqb\" (UniqueName: \"kubernetes.io/projected/99d2edf2-043a-4066-9d64-36be28d2197d-kube-api-access-7qsqb\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nm84h\" (UID: \"99d2edf2-043a-4066-9d64-36be28d2197d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nm84h" Jan 22 14:19:28 crc kubenswrapper[4743]: I0122 14:19:28.179384 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nm84h" Jan 22 14:19:28 crc kubenswrapper[4743]: I0122 14:19:28.726313 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-nm84h"] Jan 22 14:19:29 crc kubenswrapper[4743]: I0122 14:19:29.707908 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nm84h" event={"ID":"99d2edf2-043a-4066-9d64-36be28d2197d","Type":"ContainerStarted","Data":"b134528a0d1cab07b9c5be8f88b7f8d14360d7a55b166264f7a32380a6af008d"} Jan 22 14:19:29 crc kubenswrapper[4743]: I0122 14:19:29.709274 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nm84h" event={"ID":"99d2edf2-043a-4066-9d64-36be28d2197d","Type":"ContainerStarted","Data":"b0b857725171c356b1397901de18b467b4033493e87d95c2f22e93098ad522cd"} Jan 22 14:19:29 crc kubenswrapper[4743]: I0122 14:19:29.737656 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nm84h" podStartSLOduration=2.288286738 podStartE2EDuration="2.737631206s" podCreationTimestamp="2026-01-22 14:19:27 +0000 UTC" firstStartedPulling="2026-01-22 14:19:28.732012529 +0000 UTC m=+2005.287055692" lastFinishedPulling="2026-01-22 14:19:29.181356997 +0000 UTC m=+2005.736400160" observedRunningTime="2026-01-22 14:19:29.728175822 +0000 UTC m=+2006.283218995" watchObservedRunningTime="2026-01-22 14:19:29.737631206 +0000 UTC m=+2006.292674369" Jan 22 14:20:33 crc kubenswrapper[4743]: I0122 14:20:33.243715 4743 generic.go:334] "Generic (PLEG): container finished" podID="99d2edf2-043a-4066-9d64-36be28d2197d" containerID="b134528a0d1cab07b9c5be8f88b7f8d14360d7a55b166264f7a32380a6af008d" exitCode=0 Jan 22 14:20:33 crc kubenswrapper[4743]: I0122 14:20:33.243815 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nm84h" event={"ID":"99d2edf2-043a-4066-9d64-36be28d2197d","Type":"ContainerDied","Data":"b134528a0d1cab07b9c5be8f88b7f8d14360d7a55b166264f7a32380a6af008d"} Jan 22 14:20:34 crc kubenswrapper[4743]: I0122 14:20:34.715829 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nm84h" Jan 22 14:20:34 crc kubenswrapper[4743]: I0122 14:20:34.753756 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qsqb\" (UniqueName: \"kubernetes.io/projected/99d2edf2-043a-4066-9d64-36be28d2197d-kube-api-access-7qsqb\") pod \"99d2edf2-043a-4066-9d64-36be28d2197d\" (UID: \"99d2edf2-043a-4066-9d64-36be28d2197d\") " Jan 22 14:20:34 crc kubenswrapper[4743]: I0122 14:20:34.754042 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/99d2edf2-043a-4066-9d64-36be28d2197d-ovncontroller-config-0\") pod \"99d2edf2-043a-4066-9d64-36be28d2197d\" (UID: \"99d2edf2-043a-4066-9d64-36be28d2197d\") " Jan 22 14:20:34 crc kubenswrapper[4743]: I0122 14:20:34.754152 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/99d2edf2-043a-4066-9d64-36be28d2197d-inventory\") pod \"99d2edf2-043a-4066-9d64-36be28d2197d\" (UID: \"99d2edf2-043a-4066-9d64-36be28d2197d\") " Jan 22 14:20:34 crc kubenswrapper[4743]: I0122 14:20:34.754673 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99d2edf2-043a-4066-9d64-36be28d2197d-ovn-combined-ca-bundle\") pod \"99d2edf2-043a-4066-9d64-36be28d2197d\" (UID: \"99d2edf2-043a-4066-9d64-36be28d2197d\") " Jan 22 14:20:34 crc kubenswrapper[4743]: I0122 14:20:34.754770 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/99d2edf2-043a-4066-9d64-36be28d2197d-ssh-key-openstack-edpm-ipam\") pod \"99d2edf2-043a-4066-9d64-36be28d2197d\" (UID: \"99d2edf2-043a-4066-9d64-36be28d2197d\") " Jan 22 14:20:34 crc kubenswrapper[4743]: I0122 14:20:34.765062 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99d2edf2-043a-4066-9d64-36be28d2197d-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "99d2edf2-043a-4066-9d64-36be28d2197d" (UID: "99d2edf2-043a-4066-9d64-36be28d2197d"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:20:34 crc kubenswrapper[4743]: I0122 14:20:34.772155 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99d2edf2-043a-4066-9d64-36be28d2197d-kube-api-access-7qsqb" (OuterVolumeSpecName: "kube-api-access-7qsqb") pod "99d2edf2-043a-4066-9d64-36be28d2197d" (UID: "99d2edf2-043a-4066-9d64-36be28d2197d"). InnerVolumeSpecName "kube-api-access-7qsqb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:20:34 crc kubenswrapper[4743]: I0122 14:20:34.790980 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99d2edf2-043a-4066-9d64-36be28d2197d-inventory" (OuterVolumeSpecName: "inventory") pod "99d2edf2-043a-4066-9d64-36be28d2197d" (UID: "99d2edf2-043a-4066-9d64-36be28d2197d"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:20:34 crc kubenswrapper[4743]: I0122 14:20:34.800586 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99d2edf2-043a-4066-9d64-36be28d2197d-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "99d2edf2-043a-4066-9d64-36be28d2197d" (UID: "99d2edf2-043a-4066-9d64-36be28d2197d"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:20:34 crc kubenswrapper[4743]: I0122 14:20:34.807776 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99d2edf2-043a-4066-9d64-36be28d2197d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "99d2edf2-043a-4066-9d64-36be28d2197d" (UID: "99d2edf2-043a-4066-9d64-36be28d2197d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:20:34 crc kubenswrapper[4743]: I0122 14:20:34.857055 4743 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/99d2edf2-043a-4066-9d64-36be28d2197d-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Jan 22 14:20:34 crc kubenswrapper[4743]: I0122 14:20:34.857301 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/99d2edf2-043a-4066-9d64-36be28d2197d-inventory\") on node \"crc\" DevicePath \"\"" Jan 22 14:20:34 crc kubenswrapper[4743]: I0122 14:20:34.857410 4743 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99d2edf2-043a-4066-9d64-36be28d2197d-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 14:20:34 crc kubenswrapper[4743]: I0122 14:20:34.857516 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/99d2edf2-043a-4066-9d64-36be28d2197d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 22 14:20:34 crc kubenswrapper[4743]: I0122 14:20:34.857653 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qsqb\" (UniqueName: \"kubernetes.io/projected/99d2edf2-043a-4066-9d64-36be28d2197d-kube-api-access-7qsqb\") on node \"crc\" DevicePath \"\"" Jan 22 14:20:35 crc kubenswrapper[4743]: I0122 14:20:35.260134 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nm84h" event={"ID":"99d2edf2-043a-4066-9d64-36be28d2197d","Type":"ContainerDied","Data":"b0b857725171c356b1397901de18b467b4033493e87d95c2f22e93098ad522cd"} Jan 22 14:20:35 crc kubenswrapper[4743]: I0122 14:20:35.260446 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0b857725171c356b1397901de18b467b4033493e87d95c2f22e93098ad522cd" Jan 22 14:20:35 crc kubenswrapper[4743]: I0122 14:20:35.260211 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nm84h" Jan 22 14:20:35 crc kubenswrapper[4743]: I0122 14:20:35.358293 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-j94s9"] Jan 22 14:20:35 crc kubenswrapper[4743]: E0122 14:20:35.358708 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99d2edf2-043a-4066-9d64-36be28d2197d" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Jan 22 14:20:35 crc kubenswrapper[4743]: I0122 14:20:35.358733 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="99d2edf2-043a-4066-9d64-36be28d2197d" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Jan 22 14:20:35 crc kubenswrapper[4743]: I0122 14:20:35.359027 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="99d2edf2-043a-4066-9d64-36be28d2197d" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Jan 22 14:20:35 crc kubenswrapper[4743]: I0122 14:20:35.359811 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-j94s9" Jan 22 14:20:35 crc kubenswrapper[4743]: I0122 14:20:35.362192 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 22 14:20:35 crc kubenswrapper[4743]: I0122 14:20:35.362194 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Jan 22 14:20:35 crc kubenswrapper[4743]: I0122 14:20:35.363596 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Jan 22 14:20:35 crc kubenswrapper[4743]: I0122 14:20:35.364651 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 22 14:20:35 crc kubenswrapper[4743]: I0122 14:20:35.364885 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 22 14:20:35 crc kubenswrapper[4743]: I0122 14:20:35.365050 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8mcmm" Jan 22 14:20:35 crc kubenswrapper[4743]: I0122 14:20:35.377181 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-j94s9"] Jan 22 14:20:35 crc kubenswrapper[4743]: I0122 14:20:35.467833 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eec02bb6-2380-4910-8e5a-1fe3196760a4-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-j94s9\" (UID: \"eec02bb6-2380-4910-8e5a-1fe3196760a4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-j94s9" Jan 22 14:20:35 crc kubenswrapper[4743]: I0122 14:20:35.467968 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eec02bb6-2380-4910-8e5a-1fe3196760a4-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-j94s9\" (UID: \"eec02bb6-2380-4910-8e5a-1fe3196760a4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-j94s9" Jan 22 14:20:35 crc kubenswrapper[4743]: I0122 14:20:35.468004 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eec02bb6-2380-4910-8e5a-1fe3196760a4-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-j94s9\" (UID: \"eec02bb6-2380-4910-8e5a-1fe3196760a4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-j94s9" Jan 22 14:20:35 crc kubenswrapper[4743]: I0122 14:20:35.468034 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/eec02bb6-2380-4910-8e5a-1fe3196760a4-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-j94s9\" (UID: \"eec02bb6-2380-4910-8e5a-1fe3196760a4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-j94s9" Jan 22 14:20:35 crc kubenswrapper[4743]: I0122 14:20:35.468050 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4w9gt\" (UniqueName: \"kubernetes.io/projected/eec02bb6-2380-4910-8e5a-1fe3196760a4-kube-api-access-4w9gt\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-j94s9\" (UID: \"eec02bb6-2380-4910-8e5a-1fe3196760a4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-j94s9" Jan 22 14:20:35 crc kubenswrapper[4743]: I0122 14:20:35.468072 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/eec02bb6-2380-4910-8e5a-1fe3196760a4-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-j94s9\" (UID: \"eec02bb6-2380-4910-8e5a-1fe3196760a4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-j94s9" Jan 22 14:20:35 crc kubenswrapper[4743]: I0122 14:20:35.570071 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eec02bb6-2380-4910-8e5a-1fe3196760a4-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-j94s9\" (UID: \"eec02bb6-2380-4910-8e5a-1fe3196760a4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-j94s9" Jan 22 14:20:35 crc kubenswrapper[4743]: I0122 14:20:35.570386 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eec02bb6-2380-4910-8e5a-1fe3196760a4-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-j94s9\" (UID: \"eec02bb6-2380-4910-8e5a-1fe3196760a4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-j94s9" Jan 22 14:20:35 crc kubenswrapper[4743]: I0122 14:20:35.570526 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/eec02bb6-2380-4910-8e5a-1fe3196760a4-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-j94s9\" (UID: \"eec02bb6-2380-4910-8e5a-1fe3196760a4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-j94s9" Jan 22 14:20:35 crc kubenswrapper[4743]: I0122 14:20:35.570649 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4w9gt\" (UniqueName: \"kubernetes.io/projected/eec02bb6-2380-4910-8e5a-1fe3196760a4-kube-api-access-4w9gt\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-j94s9\" (UID: \"eec02bb6-2380-4910-8e5a-1fe3196760a4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-j94s9" Jan 22 14:20:35 crc kubenswrapper[4743]: I0122 14:20:35.570766 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/eec02bb6-2380-4910-8e5a-1fe3196760a4-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-j94s9\" (UID: \"eec02bb6-2380-4910-8e5a-1fe3196760a4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-j94s9" Jan 22 14:20:35 crc kubenswrapper[4743]: I0122 14:20:35.570925 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eec02bb6-2380-4910-8e5a-1fe3196760a4-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-j94s9\" (UID: \"eec02bb6-2380-4910-8e5a-1fe3196760a4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-j94s9" Jan 22 14:20:35 crc kubenswrapper[4743]: I0122 14:20:35.574130 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eec02bb6-2380-4910-8e5a-1fe3196760a4-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-j94s9\" (UID: \"eec02bb6-2380-4910-8e5a-1fe3196760a4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-j94s9" Jan 22 14:20:35 crc kubenswrapper[4743]: I0122 14:20:35.574157 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eec02bb6-2380-4910-8e5a-1fe3196760a4-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-j94s9\" (UID: \"eec02bb6-2380-4910-8e5a-1fe3196760a4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-j94s9" Jan 22 14:20:35 crc kubenswrapper[4743]: I0122 14:20:35.574329 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eec02bb6-2380-4910-8e5a-1fe3196760a4-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-j94s9\" (UID: \"eec02bb6-2380-4910-8e5a-1fe3196760a4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-j94s9" Jan 22 14:20:35 crc kubenswrapper[4743]: I0122 14:20:35.574371 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/eec02bb6-2380-4910-8e5a-1fe3196760a4-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-j94s9\" (UID: \"eec02bb6-2380-4910-8e5a-1fe3196760a4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-j94s9" Jan 22 14:20:35 crc kubenswrapper[4743]: I0122 14:20:35.575349 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/eec02bb6-2380-4910-8e5a-1fe3196760a4-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-j94s9\" (UID: \"eec02bb6-2380-4910-8e5a-1fe3196760a4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-j94s9" Jan 22 14:20:35 crc kubenswrapper[4743]: I0122 14:20:35.590023 4743 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4w9gt\" (UniqueName: \"kubernetes.io/projected/eec02bb6-2380-4910-8e5a-1fe3196760a4-kube-api-access-4w9gt\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-j94s9\" (UID: \"eec02bb6-2380-4910-8e5a-1fe3196760a4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-j94s9" Jan 22 14:20:35 crc kubenswrapper[4743]: I0122 14:20:35.683880 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-j94s9" Jan 22 14:20:36 crc kubenswrapper[4743]: I0122 14:20:36.234186 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-j94s9"] Jan 22 14:20:36 crc kubenswrapper[4743]: I0122 14:20:36.269429 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-j94s9" event={"ID":"eec02bb6-2380-4910-8e5a-1fe3196760a4","Type":"ContainerStarted","Data":"2fdf6ace0ae28f5dcd254f8254651369b4f6a9263191f2f841d5a92e294fbf22"} Jan 22 14:20:37 crc kubenswrapper[4743]: I0122 14:20:37.278089 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-j94s9" event={"ID":"eec02bb6-2380-4910-8e5a-1fe3196760a4","Type":"ContainerStarted","Data":"cd01003ad8f580cf879e7079f978d77b1c917d474dc50c1d3902b5cb5f859ca8"} Jan 22 14:20:37 crc kubenswrapper[4743]: I0122 14:20:37.298853 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-j94s9" podStartSLOduration=1.7991077789999999 podStartE2EDuration="2.298835705s" podCreationTimestamp="2026-01-22 14:20:35 +0000 UTC" firstStartedPulling="2026-01-22 14:20:36.234193657 +0000 UTC m=+2072.789236820" lastFinishedPulling="2026-01-22 14:20:36.733921583 +0000 UTC m=+2073.288964746" observedRunningTime="2026-01-22 14:20:37.298118736 +0000 UTC m=+2073.853161899" watchObservedRunningTime="2026-01-22 14:20:37.298835705 +0000 UTC m=+2073.853878858" Jan 22 14:21:00 crc kubenswrapper[4743]: I0122 14:21:00.050298 4743 patch_prober.go:28] interesting pod/machine-config-daemon-hqgk7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 14:21:00 crc kubenswrapper[4743]: I0122 14:21:00.050891 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 14:21:24 crc kubenswrapper[4743]: I0122 14:21:24.653475 4743 generic.go:334] "Generic (PLEG): container finished" podID="eec02bb6-2380-4910-8e5a-1fe3196760a4" containerID="cd01003ad8f580cf879e7079f978d77b1c917d474dc50c1d3902b5cb5f859ca8" exitCode=0 Jan 22 14:21:24 crc kubenswrapper[4743]: I0122 14:21:24.653584 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-j94s9" event={"ID":"eec02bb6-2380-4910-8e5a-1fe3196760a4","Type":"ContainerDied","Data":"cd01003ad8f580cf879e7079f978d77b1c917d474dc50c1d3902b5cb5f859ca8"} Jan 22 14:21:26 crc kubenswrapper[4743]: I0122 
14:21:26.068578 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-j94s9" Jan 22 14:21:26 crc kubenswrapper[4743]: I0122 14:21:26.293379 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/eec02bb6-2380-4910-8e5a-1fe3196760a4-neutron-ovn-metadata-agent-neutron-config-0\") pod \"eec02bb6-2380-4910-8e5a-1fe3196760a4\" (UID: \"eec02bb6-2380-4910-8e5a-1fe3196760a4\") " Jan 22 14:21:26 crc kubenswrapper[4743]: I0122 14:21:26.293606 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eec02bb6-2380-4910-8e5a-1fe3196760a4-inventory\") pod \"eec02bb6-2380-4910-8e5a-1fe3196760a4\" (UID: \"eec02bb6-2380-4910-8e5a-1fe3196760a4\") " Jan 22 14:21:26 crc kubenswrapper[4743]: I0122 14:21:26.294439 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/eec02bb6-2380-4910-8e5a-1fe3196760a4-nova-metadata-neutron-config-0\") pod \"eec02bb6-2380-4910-8e5a-1fe3196760a4\" (UID: \"eec02bb6-2380-4910-8e5a-1fe3196760a4\") " Jan 22 14:21:26 crc kubenswrapper[4743]: I0122 14:21:26.294480 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eec02bb6-2380-4910-8e5a-1fe3196760a4-ssh-key-openstack-edpm-ipam\") pod \"eec02bb6-2380-4910-8e5a-1fe3196760a4\" (UID: \"eec02bb6-2380-4910-8e5a-1fe3196760a4\") " Jan 22 14:21:26 crc kubenswrapper[4743]: I0122 14:21:26.294506 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4w9gt\" (UniqueName: \"kubernetes.io/projected/eec02bb6-2380-4910-8e5a-1fe3196760a4-kube-api-access-4w9gt\") pod \"eec02bb6-2380-4910-8e5a-1fe3196760a4\" (UID: \"eec02bb6-2380-4910-8e5a-1fe3196760a4\") " Jan 22 14:21:26 crc kubenswrapper[4743]: I0122 14:21:26.294554 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eec02bb6-2380-4910-8e5a-1fe3196760a4-neutron-metadata-combined-ca-bundle\") pod \"eec02bb6-2380-4910-8e5a-1fe3196760a4\" (UID: \"eec02bb6-2380-4910-8e5a-1fe3196760a4\") " Jan 22 14:21:26 crc kubenswrapper[4743]: I0122 14:21:26.299959 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eec02bb6-2380-4910-8e5a-1fe3196760a4-kube-api-access-4w9gt" (OuterVolumeSpecName: "kube-api-access-4w9gt") pod "eec02bb6-2380-4910-8e5a-1fe3196760a4" (UID: "eec02bb6-2380-4910-8e5a-1fe3196760a4"). InnerVolumeSpecName "kube-api-access-4w9gt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:21:26 crc kubenswrapper[4743]: I0122 14:21:26.299957 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eec02bb6-2380-4910-8e5a-1fe3196760a4-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "eec02bb6-2380-4910-8e5a-1fe3196760a4" (UID: "eec02bb6-2380-4910-8e5a-1fe3196760a4"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:21:26 crc kubenswrapper[4743]: I0122 14:21:26.323736 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eec02bb6-2380-4910-8e5a-1fe3196760a4-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "eec02bb6-2380-4910-8e5a-1fe3196760a4" (UID: "eec02bb6-2380-4910-8e5a-1fe3196760a4"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:21:26 crc kubenswrapper[4743]: I0122 14:21:26.324447 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eec02bb6-2380-4910-8e5a-1fe3196760a4-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "eec02bb6-2380-4910-8e5a-1fe3196760a4" (UID: "eec02bb6-2380-4910-8e5a-1fe3196760a4"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:21:26 crc kubenswrapper[4743]: I0122 14:21:26.327040 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eec02bb6-2380-4910-8e5a-1fe3196760a4-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "eec02bb6-2380-4910-8e5a-1fe3196760a4" (UID: "eec02bb6-2380-4910-8e5a-1fe3196760a4"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:21:26 crc kubenswrapper[4743]: I0122 14:21:26.328325 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eec02bb6-2380-4910-8e5a-1fe3196760a4-inventory" (OuterVolumeSpecName: "inventory") pod "eec02bb6-2380-4910-8e5a-1fe3196760a4" (UID: "eec02bb6-2380-4910-8e5a-1fe3196760a4"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:21:26 crc kubenswrapper[4743]: I0122 14:21:26.396037 4743 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eec02bb6-2380-4910-8e5a-1fe3196760a4-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 14:21:26 crc kubenswrapper[4743]: I0122 14:21:26.396075 4743 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/eec02bb6-2380-4910-8e5a-1fe3196760a4-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 22 14:21:26 crc kubenswrapper[4743]: I0122 14:21:26.396092 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eec02bb6-2380-4910-8e5a-1fe3196760a4-inventory\") on node \"crc\" DevicePath \"\"" Jan 22 14:21:26 crc kubenswrapper[4743]: I0122 14:21:26.396104 4743 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/eec02bb6-2380-4910-8e5a-1fe3196760a4-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 22 14:21:26 crc kubenswrapper[4743]: I0122 14:21:26.396117 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eec02bb6-2380-4910-8e5a-1fe3196760a4-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 22 14:21:26 crc kubenswrapper[4743]: I0122 14:21:26.396129 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4w9gt\" (UniqueName: \"kubernetes.io/projected/eec02bb6-2380-4910-8e5a-1fe3196760a4-kube-api-access-4w9gt\") on node \"crc\" DevicePath \"\"" Jan 22 14:21:26 crc kubenswrapper[4743]: I0122 14:21:26.672110 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-j94s9" event={"ID":"eec02bb6-2380-4910-8e5a-1fe3196760a4","Type":"ContainerDied","Data":"2fdf6ace0ae28f5dcd254f8254651369b4f6a9263191f2f841d5a92e294fbf22"} Jan 22 14:21:26 crc kubenswrapper[4743]: I0122 14:21:26.672160 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-j94s9" Jan 22 14:21:26 crc kubenswrapper[4743]: I0122 14:21:26.672161 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2fdf6ace0ae28f5dcd254f8254651369b4f6a9263191f2f841d5a92e294fbf22" Jan 22 14:21:26 crc kubenswrapper[4743]: I0122 14:21:26.760408 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cvxq2"] Jan 22 14:21:26 crc kubenswrapper[4743]: E0122 14:21:26.760975 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eec02bb6-2380-4910-8e5a-1fe3196760a4" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Jan 22 14:21:26 crc kubenswrapper[4743]: I0122 14:21:26.760997 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="eec02bb6-2380-4910-8e5a-1fe3196760a4" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Jan 22 14:21:26 crc kubenswrapper[4743]: I0122 14:21:26.761215 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="eec02bb6-2380-4910-8e5a-1fe3196760a4" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Jan 22 14:21:26 crc kubenswrapper[4743]: I0122 14:21:26.762275 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cvxq2" Jan 22 14:21:26 crc kubenswrapper[4743]: I0122 14:21:26.765203 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Jan 22 14:21:26 crc kubenswrapper[4743]: I0122 14:21:26.765203 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 22 14:21:26 crc kubenswrapper[4743]: I0122 14:21:26.765383 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8mcmm" Jan 22 14:21:26 crc kubenswrapper[4743]: I0122 14:21:26.765220 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 22 14:21:26 crc kubenswrapper[4743]: I0122 14:21:26.765272 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 22 14:21:26 crc kubenswrapper[4743]: I0122 14:21:26.772130 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cvxq2"] Jan 22 14:21:26 crc kubenswrapper[4743]: I0122 14:21:26.802178 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dca488a-cb84-4610-bf38-0f4c65c8b94a-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cvxq2\" (UID: \"5dca488a-cb84-4610-bf38-0f4c65c8b94a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cvxq2" Jan 22 14:21:26 crc kubenswrapper[4743]: I0122 14:21:26.802273 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5dca488a-cb84-4610-bf38-0f4c65c8b94a-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cvxq2\" (UID: \"5dca488a-cb84-4610-bf38-0f4c65c8b94a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cvxq2" Jan 22 14:21:26 crc kubenswrapper[4743]: I0122 14:21:26.802358 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/5dca488a-cb84-4610-bf38-0f4c65c8b94a-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cvxq2\" (UID: \"5dca488a-cb84-4610-bf38-0f4c65c8b94a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cvxq2" Jan 22 14:21:26 crc kubenswrapper[4743]: I0122 14:21:26.802507 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5dca488a-cb84-4610-bf38-0f4c65c8b94a-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cvxq2\" (UID: \"5dca488a-cb84-4610-bf38-0f4c65c8b94a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cvxq2" Jan 22 14:21:26 crc kubenswrapper[4743]: I0122 14:21:26.802634 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snfll\" (UniqueName: \"kubernetes.io/projected/5dca488a-cb84-4610-bf38-0f4c65c8b94a-kube-api-access-snfll\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cvxq2\" (UID: \"5dca488a-cb84-4610-bf38-0f4c65c8b94a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cvxq2" Jan 22 14:21:26 crc kubenswrapper[4743]: I0122 14:21:26.903996 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/5dca488a-cb84-4610-bf38-0f4c65c8b94a-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cvxq2\" (UID: \"5dca488a-cb84-4610-bf38-0f4c65c8b94a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cvxq2" Jan 22 14:21:26 crc kubenswrapper[4743]: I0122 14:21:26.904087 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5dca488a-cb84-4610-bf38-0f4c65c8b94a-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cvxq2\" (UID: \"5dca488a-cb84-4610-bf38-0f4c65c8b94a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cvxq2" Jan 22 14:21:26 crc kubenswrapper[4743]: I0122 14:21:26.904178 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snfll\" (UniqueName: \"kubernetes.io/projected/5dca488a-cb84-4610-bf38-0f4c65c8b94a-kube-api-access-snfll\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cvxq2\" (UID: \"5dca488a-cb84-4610-bf38-0f4c65c8b94a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cvxq2" Jan 22 14:21:26 crc kubenswrapper[4743]: I0122 14:21:26.904224 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dca488a-cb84-4610-bf38-0f4c65c8b94a-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cvxq2\" (UID: \"5dca488a-cb84-4610-bf38-0f4c65c8b94a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cvxq2" Jan 22 14:21:26 crc kubenswrapper[4743]: I0122 14:21:26.904290 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5dca488a-cb84-4610-bf38-0f4c65c8b94a-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cvxq2\" (UID: \"5dca488a-cb84-4610-bf38-0f4c65c8b94a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cvxq2" Jan 22 14:21:26 crc kubenswrapper[4743]: I0122 14:21:26.907861 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5dca488a-cb84-4610-bf38-0f4c65c8b94a-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cvxq2\" (UID: \"5dca488a-cb84-4610-bf38-0f4c65c8b94a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cvxq2" Jan 22 14:21:26 crc kubenswrapper[4743]: I0122 14:21:26.907870 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5dca488a-cb84-4610-bf38-0f4c65c8b94a-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cvxq2\" (UID: \"5dca488a-cb84-4610-bf38-0f4c65c8b94a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cvxq2" Jan 22 14:21:26 crc kubenswrapper[4743]: I0122 14:21:26.909204 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dca488a-cb84-4610-bf38-0f4c65c8b94a-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cvxq2\" (UID: \"5dca488a-cb84-4610-bf38-0f4c65c8b94a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cvxq2" Jan 22 14:21:26 crc kubenswrapper[4743]: I0122 14:21:26.911781 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/5dca488a-cb84-4610-bf38-0f4c65c8b94a-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cvxq2\" (UID: \"5dca488a-cb84-4610-bf38-0f4c65c8b94a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cvxq2" Jan 22 14:21:26 crc kubenswrapper[4743]: I0122 14:21:26.923059 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snfll\" (UniqueName: \"kubernetes.io/projected/5dca488a-cb84-4610-bf38-0f4c65c8b94a-kube-api-access-snfll\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cvxq2\" (UID: \"5dca488a-cb84-4610-bf38-0f4c65c8b94a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cvxq2" Jan 22 14:21:27 crc kubenswrapper[4743]: I0122 14:21:27.087286 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cvxq2" Jan 22 14:21:27 crc kubenswrapper[4743]: I0122 14:21:27.625875 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cvxq2"] Jan 22 14:21:27 crc kubenswrapper[4743]: I0122 14:21:27.682743 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cvxq2" event={"ID":"5dca488a-cb84-4610-bf38-0f4c65c8b94a","Type":"ContainerStarted","Data":"0f54885718f93b43630eea9996966d0126aa2be7e233f2609ffb71a50619132e"} Jan 22 14:21:30 crc kubenswrapper[4743]: I0122 14:21:30.049988 4743 patch_prober.go:28] interesting pod/machine-config-daemon-hqgk7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 14:21:30 crc kubenswrapper[4743]: I0122 14:21:30.050613 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 14:21:30 crc kubenswrapper[4743]: I0122 14:21:30.707745 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cvxq2" event={"ID":"5dca488a-cb84-4610-bf38-0f4c65c8b94a","Type":"ContainerStarted","Data":"d16199c83335fa387fbe17092549028627c56453a30de888775cce2588853a44"} Jan 22 14:21:30 crc kubenswrapper[4743]: I0122 14:21:30.732940 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cvxq2" podStartSLOduration=2.78994628 podStartE2EDuration="4.732921745s" podCreationTimestamp="2026-01-22 14:21:26 +0000 UTC" firstStartedPulling="2026-01-22 14:21:27.630835487 +0000 UTC m=+2124.185878650" lastFinishedPulling="2026-01-22 14:21:29.573810952 +0000 UTC m=+2126.128854115" observedRunningTime="2026-01-22 14:21:30.727930831 +0000 UTC m=+2127.282974004" watchObservedRunningTime="2026-01-22 14:21:30.732921745 +0000 UTC m=+2127.287964908" Jan 22 14:21:52 crc kubenswrapper[4743]: I0122 14:21:52.123924 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-krmwt"] Jan 22 14:21:52 crc kubenswrapper[4743]: I0122 14:21:52.126820 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-krmwt" Jan 22 14:21:52 crc kubenswrapper[4743]: I0122 14:21:52.144070 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-krmwt"] Jan 22 14:21:52 crc kubenswrapper[4743]: I0122 14:21:52.226954 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9x58m\" (UniqueName: \"kubernetes.io/projected/e794a161-07d9-4611-a5c8-16920f433da5-kube-api-access-9x58m\") pod \"redhat-marketplace-krmwt\" (UID: \"e794a161-07d9-4611-a5c8-16920f433da5\") " pod="openshift-marketplace/redhat-marketplace-krmwt" Jan 22 14:21:52 crc kubenswrapper[4743]: I0122 14:21:52.227100 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e794a161-07d9-4611-a5c8-16920f433da5-catalog-content\") pod \"redhat-marketplace-krmwt\" (UID: \"e794a161-07d9-4611-a5c8-16920f433da5\") " pod="openshift-marketplace/redhat-marketplace-krmwt" Jan 22 14:21:52 crc kubenswrapper[4743]: I0122 14:21:52.227277 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e794a161-07d9-4611-a5c8-16920f433da5-utilities\") pod \"redhat-marketplace-krmwt\" (UID: \"e794a161-07d9-4611-a5c8-16920f433da5\") " pod="openshift-marketplace/redhat-marketplace-krmwt" Jan 22 14:21:52 crc kubenswrapper[4743]: I0122 14:21:52.329254 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e794a161-07d9-4611-a5c8-16920f433da5-utilities\") pod \"redhat-marketplace-krmwt\" (UID: \"e794a161-07d9-4611-a5c8-16920f433da5\") " pod="openshift-marketplace/redhat-marketplace-krmwt" Jan 22 14:21:52 crc kubenswrapper[4743]: I0122 14:21:52.329425 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9x58m\" (UniqueName: \"kubernetes.io/projected/e794a161-07d9-4611-a5c8-16920f433da5-kube-api-access-9x58m\") pod \"redhat-marketplace-krmwt\" (UID: \"e794a161-07d9-4611-a5c8-16920f433da5\") " pod="openshift-marketplace/redhat-marketplace-krmwt" Jan 22 14:21:52 crc kubenswrapper[4743]: I0122 14:21:52.329911 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e794a161-07d9-4611-a5c8-16920f433da5-catalog-content\") pod \"redhat-marketplace-krmwt\" (UID: \"e794a161-07d9-4611-a5c8-16920f433da5\") " pod="openshift-marketplace/redhat-marketplace-krmwt" Jan 22 14:21:52 crc kubenswrapper[4743]: I0122 14:21:52.330195 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e794a161-07d9-4611-a5c8-16920f433da5-utilities\") pod \"redhat-marketplace-krmwt\" (UID: \"e794a161-07d9-4611-a5c8-16920f433da5\") " pod="openshift-marketplace/redhat-marketplace-krmwt" Jan 22 14:21:52 crc kubenswrapper[4743]: I0122 14:21:52.330242 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e794a161-07d9-4611-a5c8-16920f433da5-catalog-content\") pod \"redhat-marketplace-krmwt\" (UID: \"e794a161-07d9-4611-a5c8-16920f433da5\") " pod="openshift-marketplace/redhat-marketplace-krmwt" Jan 22 14:21:52 crc kubenswrapper[4743]: I0122 14:21:52.350055 4743 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-9x58m\" (UniqueName: \"kubernetes.io/projected/e794a161-07d9-4611-a5c8-16920f433da5-kube-api-access-9x58m\") pod \"redhat-marketplace-krmwt\" (UID: \"e794a161-07d9-4611-a5c8-16920f433da5\") " pod="openshift-marketplace/redhat-marketplace-krmwt" Jan 22 14:21:52 crc kubenswrapper[4743]: I0122 14:21:52.461249 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-krmwt" Jan 22 14:21:52 crc kubenswrapper[4743]: I0122 14:21:52.750601 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-krmwt"] Jan 22 14:21:52 crc kubenswrapper[4743]: I0122 14:21:52.896421 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-krmwt" event={"ID":"e794a161-07d9-4611-a5c8-16920f433da5","Type":"ContainerStarted","Data":"7e420617eb6e00797c7b95cbe027e4a6783eb06d082255ee5832591e899de81a"} Jan 22 14:21:53 crc kubenswrapper[4743]: I0122 14:21:53.905309 4743 generic.go:334] "Generic (PLEG): container finished" podID="e794a161-07d9-4611-a5c8-16920f433da5" containerID="b6d4cfd78665131d6118595113cf3dbcd37dc8442c5f62fe0184575b6347874f" exitCode=0 Jan 22 14:21:53 crc kubenswrapper[4743]: I0122 14:21:53.905614 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-krmwt" event={"ID":"e794a161-07d9-4611-a5c8-16920f433da5","Type":"ContainerDied","Data":"b6d4cfd78665131d6118595113cf3dbcd37dc8442c5f62fe0184575b6347874f"} Jan 22 14:21:54 crc kubenswrapper[4743]: I0122 14:21:54.916170 4743 generic.go:334] "Generic (PLEG): container finished" podID="e794a161-07d9-4611-a5c8-16920f433da5" containerID="e124f4c8c4948dcc12996d91cde414e046d668ab18da0832aa396d3cf54c660f" exitCode=0 Jan 22 14:21:54 crc kubenswrapper[4743]: I0122 14:21:54.916273 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-krmwt" event={"ID":"e794a161-07d9-4611-a5c8-16920f433da5","Type":"ContainerDied","Data":"e124f4c8c4948dcc12996d91cde414e046d668ab18da0832aa396d3cf54c660f"} Jan 22 14:21:56 crc kubenswrapper[4743]: I0122 14:21:56.936029 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-krmwt" event={"ID":"e794a161-07d9-4611-a5c8-16920f433da5","Type":"ContainerStarted","Data":"4891cff7c7123a0d4396a8b32cfd60a348175375fa984bbfa96fe73eca3586eb"} Jan 22 14:21:56 crc kubenswrapper[4743]: I0122 14:21:56.955981 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-krmwt" podStartSLOduration=3.023546864 podStartE2EDuration="4.955953432s" podCreationTimestamp="2026-01-22 14:21:52 +0000 UTC" firstStartedPulling="2026-01-22 14:21:53.907538122 +0000 UTC m=+2150.462581285" lastFinishedPulling="2026-01-22 14:21:55.83994469 +0000 UTC m=+2152.394987853" observedRunningTime="2026-01-22 14:21:56.954070671 +0000 UTC m=+2153.509113844" watchObservedRunningTime="2026-01-22 14:21:56.955953432 +0000 UTC m=+2153.510996615" Jan 22 14:22:00 crc kubenswrapper[4743]: I0122 14:22:00.049247 4743 patch_prober.go:28] interesting pod/machine-config-daemon-hqgk7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 14:22:00 crc kubenswrapper[4743]: I0122 14:22:00.049801 4743 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 14:22:00 crc kubenswrapper[4743]: I0122 14:22:00.049851 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" Jan 22 14:22:00 crc kubenswrapper[4743]: I0122 14:22:00.050556 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4e338cfb7048ed5b16851b239153eb46a9d70fc49b34c9091316d3e3baa03fe0"} pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 22 14:22:00 crc kubenswrapper[4743]: I0122 14:22:00.050613 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerName="machine-config-daemon" containerID="cri-o://4e338cfb7048ed5b16851b239153eb46a9d70fc49b34c9091316d3e3baa03fe0" gracePeriod=600 Jan 22 14:22:00 crc kubenswrapper[4743]: E0122 14:22:00.682591 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:22:00 crc kubenswrapper[4743]: I0122 14:22:00.966165 4743 generic.go:334] "Generic (PLEG): container finished" podID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerID="4e338cfb7048ed5b16851b239153eb46a9d70fc49b34c9091316d3e3baa03fe0" exitCode=0 Jan 22 14:22:00 crc kubenswrapper[4743]: I0122 14:22:00.966223 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" event={"ID":"3aeba9ba-3a5a-4885-8540-d295aadb311b","Type":"ContainerDied","Data":"4e338cfb7048ed5b16851b239153eb46a9d70fc49b34c9091316d3e3baa03fe0"} Jan 22 14:22:00 crc kubenswrapper[4743]: I0122 14:22:00.966319 4743 scope.go:117] "RemoveContainer" containerID="1af7ecf77f9c044f696a147946738a6ed62f5bd006bf111f8137da5ced5ddcc7" Jan 22 14:22:00 crc kubenswrapper[4743]: I0122 14:22:00.968661 4743 scope.go:117] "RemoveContainer" containerID="4e338cfb7048ed5b16851b239153eb46a9d70fc49b34c9091316d3e3baa03fe0" Jan 22 14:22:00 crc kubenswrapper[4743]: E0122 14:22:00.969088 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:22:02 crc kubenswrapper[4743]: I0122 14:22:02.462967 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-krmwt" Jan 22 14:22:02 crc kubenswrapper[4743]: I0122 14:22:02.463423 4743 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-krmwt" Jan 22 14:22:02 crc kubenswrapper[4743]: I0122 14:22:02.519035 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-krmwt" Jan 22 14:22:03 crc kubenswrapper[4743]: I0122 14:22:03.025743 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-krmwt" Jan 22 14:22:03 crc kubenswrapper[4743]: I0122 14:22:03.077287 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-krmwt"] Jan 22 14:22:04 crc kubenswrapper[4743]: I0122 14:22:04.998608 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-krmwt" podUID="e794a161-07d9-4611-a5c8-16920f433da5" containerName="registry-server" containerID="cri-o://4891cff7c7123a0d4396a8b32cfd60a348175375fa984bbfa96fe73eca3586eb" gracePeriod=2 Jan 22 14:22:05 crc kubenswrapper[4743]: I0122 14:22:05.939235 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-krmwt" Jan 22 14:22:05 crc kubenswrapper[4743]: I0122 14:22:05.991681 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9x58m\" (UniqueName: \"kubernetes.io/projected/e794a161-07d9-4611-a5c8-16920f433da5-kube-api-access-9x58m\") pod \"e794a161-07d9-4611-a5c8-16920f433da5\" (UID: \"e794a161-07d9-4611-a5c8-16920f433da5\") " Jan 22 14:22:05 crc kubenswrapper[4743]: I0122 14:22:05.991779 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e794a161-07d9-4611-a5c8-16920f433da5-catalog-content\") pod \"e794a161-07d9-4611-a5c8-16920f433da5\" (UID: \"e794a161-07d9-4611-a5c8-16920f433da5\") " Jan 22 14:22:05 crc kubenswrapper[4743]: I0122 14:22:05.991884 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e794a161-07d9-4611-a5c8-16920f433da5-utilities\") pod \"e794a161-07d9-4611-a5c8-16920f433da5\" (UID: \"e794a161-07d9-4611-a5c8-16920f433da5\") " Jan 22 14:22:05 crc kubenswrapper[4743]: I0122 14:22:05.992759 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e794a161-07d9-4611-a5c8-16920f433da5-utilities" (OuterVolumeSpecName: "utilities") pod "e794a161-07d9-4611-a5c8-16920f433da5" (UID: "e794a161-07d9-4611-a5c8-16920f433da5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:22:05 crc kubenswrapper[4743]: I0122 14:22:05.998420 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e794a161-07d9-4611-a5c8-16920f433da5-kube-api-access-9x58m" (OuterVolumeSpecName: "kube-api-access-9x58m") pod "e794a161-07d9-4611-a5c8-16920f433da5" (UID: "e794a161-07d9-4611-a5c8-16920f433da5"). InnerVolumeSpecName "kube-api-access-9x58m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:22:06 crc kubenswrapper[4743]: I0122 14:22:06.012012 4743 generic.go:334] "Generic (PLEG): container finished" podID="e794a161-07d9-4611-a5c8-16920f433da5" containerID="4891cff7c7123a0d4396a8b32cfd60a348175375fa984bbfa96fe73eca3586eb" exitCode=0 Jan 22 14:22:06 crc kubenswrapper[4743]: I0122 14:22:06.012068 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-krmwt" event={"ID":"e794a161-07d9-4611-a5c8-16920f433da5","Type":"ContainerDied","Data":"4891cff7c7123a0d4396a8b32cfd60a348175375fa984bbfa96fe73eca3586eb"} Jan 22 14:22:06 crc kubenswrapper[4743]: I0122 14:22:06.012134 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-krmwt" event={"ID":"e794a161-07d9-4611-a5c8-16920f433da5","Type":"ContainerDied","Data":"7e420617eb6e00797c7b95cbe027e4a6783eb06d082255ee5832591e899de81a"} Jan 22 14:22:06 crc kubenswrapper[4743]: I0122 14:22:06.012143 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-krmwt" Jan 22 14:22:06 crc kubenswrapper[4743]: I0122 14:22:06.012159 4743 scope.go:117] "RemoveContainer" containerID="4891cff7c7123a0d4396a8b32cfd60a348175375fa984bbfa96fe73eca3586eb" Jan 22 14:22:06 crc kubenswrapper[4743]: I0122 14:22:06.018935 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e794a161-07d9-4611-a5c8-16920f433da5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e794a161-07d9-4611-a5c8-16920f433da5" (UID: "e794a161-07d9-4611-a5c8-16920f433da5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:22:06 crc kubenswrapper[4743]: I0122 14:22:06.066948 4743 scope.go:117] "RemoveContainer" containerID="e124f4c8c4948dcc12996d91cde414e046d668ab18da0832aa396d3cf54c660f" Jan 22 14:22:06 crc kubenswrapper[4743]: I0122 14:22:06.087644 4743 scope.go:117] "RemoveContainer" containerID="b6d4cfd78665131d6118595113cf3dbcd37dc8442c5f62fe0184575b6347874f" Jan 22 14:22:06 crc kubenswrapper[4743]: I0122 14:22:06.094720 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9x58m\" (UniqueName: \"kubernetes.io/projected/e794a161-07d9-4611-a5c8-16920f433da5-kube-api-access-9x58m\") on node \"crc\" DevicePath \"\"" Jan 22 14:22:06 crc kubenswrapper[4743]: I0122 14:22:06.094810 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e794a161-07d9-4611-a5c8-16920f433da5-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 14:22:06 crc kubenswrapper[4743]: I0122 14:22:06.094828 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e794a161-07d9-4611-a5c8-16920f433da5-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 14:22:06 crc kubenswrapper[4743]: I0122 14:22:06.129947 4743 scope.go:117] "RemoveContainer" containerID="4891cff7c7123a0d4396a8b32cfd60a348175375fa984bbfa96fe73eca3586eb" Jan 22 14:22:06 crc kubenswrapper[4743]: E0122 14:22:06.130516 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4891cff7c7123a0d4396a8b32cfd60a348175375fa984bbfa96fe73eca3586eb\": container with ID starting with 4891cff7c7123a0d4396a8b32cfd60a348175375fa984bbfa96fe73eca3586eb not found: ID does not exist" 
containerID="4891cff7c7123a0d4396a8b32cfd60a348175375fa984bbfa96fe73eca3586eb" Jan 22 14:22:06 crc kubenswrapper[4743]: I0122 14:22:06.130565 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4891cff7c7123a0d4396a8b32cfd60a348175375fa984bbfa96fe73eca3586eb"} err="failed to get container status \"4891cff7c7123a0d4396a8b32cfd60a348175375fa984bbfa96fe73eca3586eb\": rpc error: code = NotFound desc = could not find container \"4891cff7c7123a0d4396a8b32cfd60a348175375fa984bbfa96fe73eca3586eb\": container with ID starting with 4891cff7c7123a0d4396a8b32cfd60a348175375fa984bbfa96fe73eca3586eb not found: ID does not exist" Jan 22 14:22:06 crc kubenswrapper[4743]: I0122 14:22:06.130593 4743 scope.go:117] "RemoveContainer" containerID="e124f4c8c4948dcc12996d91cde414e046d668ab18da0832aa396d3cf54c660f" Jan 22 14:22:06 crc kubenswrapper[4743]: E0122 14:22:06.132454 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e124f4c8c4948dcc12996d91cde414e046d668ab18da0832aa396d3cf54c660f\": container with ID starting with e124f4c8c4948dcc12996d91cde414e046d668ab18da0832aa396d3cf54c660f not found: ID does not exist" containerID="e124f4c8c4948dcc12996d91cde414e046d668ab18da0832aa396d3cf54c660f" Jan 22 14:22:06 crc kubenswrapper[4743]: I0122 14:22:06.132494 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e124f4c8c4948dcc12996d91cde414e046d668ab18da0832aa396d3cf54c660f"} err="failed to get container status \"e124f4c8c4948dcc12996d91cde414e046d668ab18da0832aa396d3cf54c660f\": rpc error: code = NotFound desc = could not find container \"e124f4c8c4948dcc12996d91cde414e046d668ab18da0832aa396d3cf54c660f\": container with ID starting with e124f4c8c4948dcc12996d91cde414e046d668ab18da0832aa396d3cf54c660f not found: ID does not exist" Jan 22 14:22:06 crc kubenswrapper[4743]: I0122 14:22:06.132526 4743 scope.go:117] "RemoveContainer" containerID="b6d4cfd78665131d6118595113cf3dbcd37dc8442c5f62fe0184575b6347874f" Jan 22 14:22:06 crc kubenswrapper[4743]: E0122 14:22:06.133070 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6d4cfd78665131d6118595113cf3dbcd37dc8442c5f62fe0184575b6347874f\": container with ID starting with b6d4cfd78665131d6118595113cf3dbcd37dc8442c5f62fe0184575b6347874f not found: ID does not exist" containerID="b6d4cfd78665131d6118595113cf3dbcd37dc8442c5f62fe0184575b6347874f" Jan 22 14:22:06 crc kubenswrapper[4743]: I0122 14:22:06.133122 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6d4cfd78665131d6118595113cf3dbcd37dc8442c5f62fe0184575b6347874f"} err="failed to get container status \"b6d4cfd78665131d6118595113cf3dbcd37dc8442c5f62fe0184575b6347874f\": rpc error: code = NotFound desc = could not find container \"b6d4cfd78665131d6118595113cf3dbcd37dc8442c5f62fe0184575b6347874f\": container with ID starting with b6d4cfd78665131d6118595113cf3dbcd37dc8442c5f62fe0184575b6347874f not found: ID does not exist" Jan 22 14:22:06 crc kubenswrapper[4743]: I0122 14:22:06.348339 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-krmwt"] Jan 22 14:22:06 crc kubenswrapper[4743]: I0122 14:22:06.355350 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-krmwt"] Jan 22 14:22:07 crc kubenswrapper[4743]: I0122 14:22:07.760103 
4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e794a161-07d9-4611-a5c8-16920f433da5" path="/var/lib/kubelet/pods/e794a161-07d9-4611-a5c8-16920f433da5/volumes" Jan 22 14:22:14 crc kubenswrapper[4743]: I0122 14:22:14.747253 4743 scope.go:117] "RemoveContainer" containerID="4e338cfb7048ed5b16851b239153eb46a9d70fc49b34c9091316d3e3baa03fe0" Jan 22 14:22:14 crc kubenswrapper[4743]: E0122 14:22:14.748034 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:22:26 crc kubenswrapper[4743]: I0122 14:22:26.747642 4743 scope.go:117] "RemoveContainer" containerID="4e338cfb7048ed5b16851b239153eb46a9d70fc49b34c9091316d3e3baa03fe0" Jan 22 14:22:26 crc kubenswrapper[4743]: E0122 14:22:26.748504 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:22:29 crc kubenswrapper[4743]: I0122 14:22:29.983315 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8cnxn"] Jan 22 14:22:29 crc kubenswrapper[4743]: E0122 14:22:29.984046 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e794a161-07d9-4611-a5c8-16920f433da5" containerName="extract-utilities" Jan 22 14:22:29 crc kubenswrapper[4743]: I0122 14:22:29.984061 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e794a161-07d9-4611-a5c8-16920f433da5" containerName="extract-utilities" Jan 22 14:22:29 crc kubenswrapper[4743]: E0122 14:22:29.984294 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e794a161-07d9-4611-a5c8-16920f433da5" containerName="extract-content" Jan 22 14:22:29 crc kubenswrapper[4743]: I0122 14:22:29.984300 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e794a161-07d9-4611-a5c8-16920f433da5" containerName="extract-content" Jan 22 14:22:29 crc kubenswrapper[4743]: E0122 14:22:29.984312 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e794a161-07d9-4611-a5c8-16920f433da5" containerName="registry-server" Jan 22 14:22:29 crc kubenswrapper[4743]: I0122 14:22:29.984318 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e794a161-07d9-4611-a5c8-16920f433da5" containerName="registry-server" Jan 22 14:22:29 crc kubenswrapper[4743]: I0122 14:22:29.984504 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="e794a161-07d9-4611-a5c8-16920f433da5" containerName="registry-server" Jan 22 14:22:29 crc kubenswrapper[4743]: I0122 14:22:29.985781 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8cnxn" Jan 22 14:22:29 crc kubenswrapper[4743]: I0122 14:22:29.996733 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8cnxn"] Jan 22 14:22:30 crc kubenswrapper[4743]: I0122 14:22:30.146552 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d21f5d73-0060-448f-971b-db1da8e12f37-catalog-content\") pod \"community-operators-8cnxn\" (UID: \"d21f5d73-0060-448f-971b-db1da8e12f37\") " pod="openshift-marketplace/community-operators-8cnxn" Jan 22 14:22:30 crc kubenswrapper[4743]: I0122 14:22:30.146627 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d21f5d73-0060-448f-971b-db1da8e12f37-utilities\") pod \"community-operators-8cnxn\" (UID: \"d21f5d73-0060-448f-971b-db1da8e12f37\") " pod="openshift-marketplace/community-operators-8cnxn" Jan 22 14:22:30 crc kubenswrapper[4743]: I0122 14:22:30.147280 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8w2k\" (UniqueName: \"kubernetes.io/projected/d21f5d73-0060-448f-971b-db1da8e12f37-kube-api-access-l8w2k\") pod \"community-operators-8cnxn\" (UID: \"d21f5d73-0060-448f-971b-db1da8e12f37\") " pod="openshift-marketplace/community-operators-8cnxn" Jan 22 14:22:30 crc kubenswrapper[4743]: I0122 14:22:30.249599 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d21f5d73-0060-448f-971b-db1da8e12f37-catalog-content\") pod \"community-operators-8cnxn\" (UID: \"d21f5d73-0060-448f-971b-db1da8e12f37\") " pod="openshift-marketplace/community-operators-8cnxn" Jan 22 14:22:30 crc kubenswrapper[4743]: I0122 14:22:30.249665 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d21f5d73-0060-448f-971b-db1da8e12f37-utilities\") pod \"community-operators-8cnxn\" (UID: \"d21f5d73-0060-448f-971b-db1da8e12f37\") " pod="openshift-marketplace/community-operators-8cnxn" Jan 22 14:22:30 crc kubenswrapper[4743]: I0122 14:22:30.249688 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8w2k\" (UniqueName: \"kubernetes.io/projected/d21f5d73-0060-448f-971b-db1da8e12f37-kube-api-access-l8w2k\") pod \"community-operators-8cnxn\" (UID: \"d21f5d73-0060-448f-971b-db1da8e12f37\") " pod="openshift-marketplace/community-operators-8cnxn" Jan 22 14:22:30 crc kubenswrapper[4743]: I0122 14:22:30.250487 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d21f5d73-0060-448f-971b-db1da8e12f37-catalog-content\") pod \"community-operators-8cnxn\" (UID: \"d21f5d73-0060-448f-971b-db1da8e12f37\") " pod="openshift-marketplace/community-operators-8cnxn" Jan 22 14:22:30 crc kubenswrapper[4743]: I0122 14:22:30.250600 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d21f5d73-0060-448f-971b-db1da8e12f37-utilities\") pod \"community-operators-8cnxn\" (UID: \"d21f5d73-0060-448f-971b-db1da8e12f37\") " pod="openshift-marketplace/community-operators-8cnxn" Jan 22 14:22:30 crc kubenswrapper[4743]: I0122 14:22:30.284168 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-l8w2k\" (UniqueName: \"kubernetes.io/projected/d21f5d73-0060-448f-971b-db1da8e12f37-kube-api-access-l8w2k\") pod \"community-operators-8cnxn\" (UID: \"d21f5d73-0060-448f-971b-db1da8e12f37\") " pod="openshift-marketplace/community-operators-8cnxn" Jan 22 14:22:30 crc kubenswrapper[4743]: I0122 14:22:30.307712 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8cnxn" Jan 22 14:22:30 crc kubenswrapper[4743]: I0122 14:22:30.833564 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8cnxn"] Jan 22 14:22:31 crc kubenswrapper[4743]: I0122 14:22:31.245613 4743 generic.go:334] "Generic (PLEG): container finished" podID="d21f5d73-0060-448f-971b-db1da8e12f37" containerID="484f8225ff534325f0ea121477ec3cbb0b00d9dc60403f037be013bef3ccc623" exitCode=0 Jan 22 14:22:31 crc kubenswrapper[4743]: I0122 14:22:31.245683 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8cnxn" event={"ID":"d21f5d73-0060-448f-971b-db1da8e12f37","Type":"ContainerDied","Data":"484f8225ff534325f0ea121477ec3cbb0b00d9dc60403f037be013bef3ccc623"} Jan 22 14:22:31 crc kubenswrapper[4743]: I0122 14:22:31.245977 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8cnxn" event={"ID":"d21f5d73-0060-448f-971b-db1da8e12f37","Type":"ContainerStarted","Data":"c09987452f81510b883287d75114688b2b11eb4df8028110c6f6d60aa359c601"} Jan 22 14:22:32 crc kubenswrapper[4743]: I0122 14:22:32.255665 4743 generic.go:334] "Generic (PLEG): container finished" podID="d21f5d73-0060-448f-971b-db1da8e12f37" containerID="dfd15a0608bd1f70126230c71ca25ae3c1a08977d5f7c9695777ec6dcbe3f7a3" exitCode=0 Jan 22 14:22:32 crc kubenswrapper[4743]: I0122 14:22:32.256004 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8cnxn" event={"ID":"d21f5d73-0060-448f-971b-db1da8e12f37","Type":"ContainerDied","Data":"dfd15a0608bd1f70126230c71ca25ae3c1a08977d5f7c9695777ec6dcbe3f7a3"} Jan 22 14:22:35 crc kubenswrapper[4743]: I0122 14:22:35.295618 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8cnxn" event={"ID":"d21f5d73-0060-448f-971b-db1da8e12f37","Type":"ContainerStarted","Data":"0504aed01eaa185622eda95d662c830010176d616c83b0dcce3bd5312669e5f1"} Jan 22 14:22:35 crc kubenswrapper[4743]: I0122 14:22:35.313217 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8cnxn" podStartSLOduration=3.401011339 podStartE2EDuration="6.313197699s" podCreationTimestamp="2026-01-22 14:22:29 +0000 UTC" firstStartedPulling="2026-01-22 14:22:31.247714622 +0000 UTC m=+2187.802757785" lastFinishedPulling="2026-01-22 14:22:34.159900982 +0000 UTC m=+2190.714944145" observedRunningTime="2026-01-22 14:22:35.312403478 +0000 UTC m=+2191.867446651" watchObservedRunningTime="2026-01-22 14:22:35.313197699 +0000 UTC m=+2191.868240862" Jan 22 14:22:37 crc kubenswrapper[4743]: I0122 14:22:37.748130 4743 scope.go:117] "RemoveContainer" containerID="4e338cfb7048ed5b16851b239153eb46a9d70fc49b34c9091316d3e3baa03fe0" Jan 22 14:22:37 crc kubenswrapper[4743]: E0122 14:22:37.749558 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:22:40 crc kubenswrapper[4743]: I0122 14:22:40.308031 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8cnxn" Jan 22 14:22:40 crc kubenswrapper[4743]: I0122 14:22:40.308416 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8cnxn" Jan 22 14:22:40 crc kubenswrapper[4743]: I0122 14:22:40.366172 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8cnxn" Jan 22 14:22:40 crc kubenswrapper[4743]: I0122 14:22:40.421471 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8cnxn" Jan 22 14:22:40 crc kubenswrapper[4743]: I0122 14:22:40.610075 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8cnxn"] Jan 22 14:22:42 crc kubenswrapper[4743]: I0122 14:22:42.372109 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8cnxn" podUID="d21f5d73-0060-448f-971b-db1da8e12f37" containerName="registry-server" containerID="cri-o://0504aed01eaa185622eda95d662c830010176d616c83b0dcce3bd5312669e5f1" gracePeriod=2 Jan 22 14:22:43 crc kubenswrapper[4743]: I0122 14:22:43.383981 4743 generic.go:334] "Generic (PLEG): container finished" podID="d21f5d73-0060-448f-971b-db1da8e12f37" containerID="0504aed01eaa185622eda95d662c830010176d616c83b0dcce3bd5312669e5f1" exitCode=0 Jan 22 14:22:43 crc kubenswrapper[4743]: I0122 14:22:43.384052 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8cnxn" event={"ID":"d21f5d73-0060-448f-971b-db1da8e12f37","Type":"ContainerDied","Data":"0504aed01eaa185622eda95d662c830010176d616c83b0dcce3bd5312669e5f1"} Jan 22 14:22:43 crc kubenswrapper[4743]: I0122 14:22:43.893656 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8cnxn" Jan 22 14:22:44 crc kubenswrapper[4743]: I0122 14:22:44.035777 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8w2k\" (UniqueName: \"kubernetes.io/projected/d21f5d73-0060-448f-971b-db1da8e12f37-kube-api-access-l8w2k\") pod \"d21f5d73-0060-448f-971b-db1da8e12f37\" (UID: \"d21f5d73-0060-448f-971b-db1da8e12f37\") " Jan 22 14:22:44 crc kubenswrapper[4743]: I0122 14:22:44.035899 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d21f5d73-0060-448f-971b-db1da8e12f37-catalog-content\") pod \"d21f5d73-0060-448f-971b-db1da8e12f37\" (UID: \"d21f5d73-0060-448f-971b-db1da8e12f37\") " Jan 22 14:22:44 crc kubenswrapper[4743]: I0122 14:22:44.035921 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d21f5d73-0060-448f-971b-db1da8e12f37-utilities\") pod \"d21f5d73-0060-448f-971b-db1da8e12f37\" (UID: \"d21f5d73-0060-448f-971b-db1da8e12f37\") " Jan 22 14:22:44 crc kubenswrapper[4743]: I0122 14:22:44.037166 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d21f5d73-0060-448f-971b-db1da8e12f37-utilities" (OuterVolumeSpecName: "utilities") pod "d21f5d73-0060-448f-971b-db1da8e12f37" (UID: "d21f5d73-0060-448f-971b-db1da8e12f37"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:22:44 crc kubenswrapper[4743]: I0122 14:22:44.039197 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d21f5d73-0060-448f-971b-db1da8e12f37-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 14:22:44 crc kubenswrapper[4743]: I0122 14:22:44.045365 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d21f5d73-0060-448f-971b-db1da8e12f37-kube-api-access-l8w2k" (OuterVolumeSpecName: "kube-api-access-l8w2k") pod "d21f5d73-0060-448f-971b-db1da8e12f37" (UID: "d21f5d73-0060-448f-971b-db1da8e12f37"). InnerVolumeSpecName "kube-api-access-l8w2k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:22:44 crc kubenswrapper[4743]: I0122 14:22:44.087192 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d21f5d73-0060-448f-971b-db1da8e12f37-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d21f5d73-0060-448f-971b-db1da8e12f37" (UID: "d21f5d73-0060-448f-971b-db1da8e12f37"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:22:44 crc kubenswrapper[4743]: I0122 14:22:44.142080 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8w2k\" (UniqueName: \"kubernetes.io/projected/d21f5d73-0060-448f-971b-db1da8e12f37-kube-api-access-l8w2k\") on node \"crc\" DevicePath \"\"" Jan 22 14:22:44 crc kubenswrapper[4743]: I0122 14:22:44.142134 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d21f5d73-0060-448f-971b-db1da8e12f37-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 14:22:44 crc kubenswrapper[4743]: I0122 14:22:44.396302 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8cnxn" event={"ID":"d21f5d73-0060-448f-971b-db1da8e12f37","Type":"ContainerDied","Data":"c09987452f81510b883287d75114688b2b11eb4df8028110c6f6d60aa359c601"} Jan 22 14:22:44 crc kubenswrapper[4743]: I0122 14:22:44.396343 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8cnxn" Jan 22 14:22:44 crc kubenswrapper[4743]: I0122 14:22:44.396365 4743 scope.go:117] "RemoveContainer" containerID="0504aed01eaa185622eda95d662c830010176d616c83b0dcce3bd5312669e5f1" Jan 22 14:22:44 crc kubenswrapper[4743]: I0122 14:22:44.417318 4743 scope.go:117] "RemoveContainer" containerID="dfd15a0608bd1f70126230c71ca25ae3c1a08977d5f7c9695777ec6dcbe3f7a3" Jan 22 14:22:44 crc kubenswrapper[4743]: I0122 14:22:44.438540 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8cnxn"] Jan 22 14:22:44 crc kubenswrapper[4743]: I0122 14:22:44.447151 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8cnxn"] Jan 22 14:22:44 crc kubenswrapper[4743]: I0122 14:22:44.452282 4743 scope.go:117] "RemoveContainer" containerID="484f8225ff534325f0ea121477ec3cbb0b00d9dc60403f037be013bef3ccc623" Jan 22 14:22:45 crc kubenswrapper[4743]: I0122 14:22:45.760228 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d21f5d73-0060-448f-971b-db1da8e12f37" path="/var/lib/kubelet/pods/d21f5d73-0060-448f-971b-db1da8e12f37/volumes" Jan 22 14:22:52 crc kubenswrapper[4743]: I0122 14:22:52.748012 4743 scope.go:117] "RemoveContainer" containerID="4e338cfb7048ed5b16851b239153eb46a9d70fc49b34c9091316d3e3baa03fe0" Jan 22 14:22:52 crc kubenswrapper[4743]: E0122 14:22:52.748686 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:23:07 crc kubenswrapper[4743]: I0122 14:23:07.747067 4743 scope.go:117] "RemoveContainer" containerID="4e338cfb7048ed5b16851b239153eb46a9d70fc49b34c9091316d3e3baa03fe0" Jan 22 14:23:07 crc kubenswrapper[4743]: E0122 14:23:07.748853 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:23:22 crc kubenswrapper[4743]: I0122 14:23:22.747741 4743 scope.go:117] "RemoveContainer" containerID="4e338cfb7048ed5b16851b239153eb46a9d70fc49b34c9091316d3e3baa03fe0" Jan 22 14:23:22 crc kubenswrapper[4743]: E0122 14:23:22.748522 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:23:34 crc kubenswrapper[4743]: I0122 14:23:34.748119 4743 scope.go:117] "RemoveContainer" containerID="4e338cfb7048ed5b16851b239153eb46a9d70fc49b34c9091316d3e3baa03fe0" Jan 22 14:23:34 crc kubenswrapper[4743]: E0122 14:23:34.749025 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:23:46 crc kubenswrapper[4743]: I0122 14:23:46.748619 4743 scope.go:117] "RemoveContainer" containerID="4e338cfb7048ed5b16851b239153eb46a9d70fc49b34c9091316d3e3baa03fe0" Jan 22 14:23:46 crc kubenswrapper[4743]: E0122 14:23:46.749862 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:23:58 crc kubenswrapper[4743]: I0122 14:23:58.747757 4743 scope.go:117] "RemoveContainer" containerID="4e338cfb7048ed5b16851b239153eb46a9d70fc49b34c9091316d3e3baa03fe0" Jan 22 14:23:58 crc kubenswrapper[4743]: E0122 14:23:58.749538 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:24:10 crc kubenswrapper[4743]: I0122 14:24:10.746981 4743 scope.go:117] "RemoveContainer" containerID="4e338cfb7048ed5b16851b239153eb46a9d70fc49b34c9091316d3e3baa03fe0" Jan 22 14:24:10 crc kubenswrapper[4743]: E0122 14:24:10.747674 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:24:21 crc kubenswrapper[4743]: I0122 14:24:21.747891 4743 
scope.go:117] "RemoveContainer" containerID="4e338cfb7048ed5b16851b239153eb46a9d70fc49b34c9091316d3e3baa03fe0" Jan 22 14:24:21 crc kubenswrapper[4743]: E0122 14:24:21.749074 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:24:35 crc kubenswrapper[4743]: I0122 14:24:35.747876 4743 scope.go:117] "RemoveContainer" containerID="4e338cfb7048ed5b16851b239153eb46a9d70fc49b34c9091316d3e3baa03fe0" Jan 22 14:24:35 crc kubenswrapper[4743]: E0122 14:24:35.748820 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:24:47 crc kubenswrapper[4743]: I0122 14:24:47.747436 4743 scope.go:117] "RemoveContainer" containerID="4e338cfb7048ed5b16851b239153eb46a9d70fc49b34c9091316d3e3baa03fe0" Jan 22 14:24:47 crc kubenswrapper[4743]: E0122 14:24:47.748927 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:25:01 crc kubenswrapper[4743]: I0122 14:25:01.750291 4743 scope.go:117] "RemoveContainer" containerID="4e338cfb7048ed5b16851b239153eb46a9d70fc49b34c9091316d3e3baa03fe0" Jan 22 14:25:01 crc kubenswrapper[4743]: E0122 14:25:01.751267 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:25:13 crc kubenswrapper[4743]: I0122 14:25:13.753466 4743 scope.go:117] "RemoveContainer" containerID="4e338cfb7048ed5b16851b239153eb46a9d70fc49b34c9091316d3e3baa03fe0" Jan 22 14:25:13 crc kubenswrapper[4743]: E0122 14:25:13.754218 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:25:16 crc kubenswrapper[4743]: I0122 14:25:16.375121 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-sghr5"] Jan 22 14:25:16 crc kubenswrapper[4743]: E0122 14:25:16.375806 4743 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="d21f5d73-0060-448f-971b-db1da8e12f37" containerName="registry-server" Jan 22 14:25:16 crc kubenswrapper[4743]: I0122 14:25:16.375819 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d21f5d73-0060-448f-971b-db1da8e12f37" containerName="registry-server" Jan 22 14:25:16 crc kubenswrapper[4743]: E0122 14:25:16.375833 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d21f5d73-0060-448f-971b-db1da8e12f37" containerName="extract-content" Jan 22 14:25:16 crc kubenswrapper[4743]: I0122 14:25:16.375839 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d21f5d73-0060-448f-971b-db1da8e12f37" containerName="extract-content" Jan 22 14:25:16 crc kubenswrapper[4743]: E0122 14:25:16.375883 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d21f5d73-0060-448f-971b-db1da8e12f37" containerName="extract-utilities" Jan 22 14:25:16 crc kubenswrapper[4743]: I0122 14:25:16.375890 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d21f5d73-0060-448f-971b-db1da8e12f37" containerName="extract-utilities" Jan 22 14:25:16 crc kubenswrapper[4743]: I0122 14:25:16.376243 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d21f5d73-0060-448f-971b-db1da8e12f37" containerName="registry-server" Jan 22 14:25:16 crc kubenswrapper[4743]: I0122 14:25:16.377484 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sghr5" Jan 22 14:25:16 crc kubenswrapper[4743]: I0122 14:25:16.391253 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sghr5"] Jan 22 14:25:16 crc kubenswrapper[4743]: I0122 14:25:16.494175 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d89e3f5-7a04-413e-ad3d-3604f2a44f88-catalog-content\") pod \"certified-operators-sghr5\" (UID: \"1d89e3f5-7a04-413e-ad3d-3604f2a44f88\") " pod="openshift-marketplace/certified-operators-sghr5" Jan 22 14:25:16 crc kubenswrapper[4743]: I0122 14:25:16.494229 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4gbg\" (UniqueName: \"kubernetes.io/projected/1d89e3f5-7a04-413e-ad3d-3604f2a44f88-kube-api-access-z4gbg\") pod \"certified-operators-sghr5\" (UID: \"1d89e3f5-7a04-413e-ad3d-3604f2a44f88\") " pod="openshift-marketplace/certified-operators-sghr5" Jan 22 14:25:16 crc kubenswrapper[4743]: I0122 14:25:16.494306 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d89e3f5-7a04-413e-ad3d-3604f2a44f88-utilities\") pod \"certified-operators-sghr5\" (UID: \"1d89e3f5-7a04-413e-ad3d-3604f2a44f88\") " pod="openshift-marketplace/certified-operators-sghr5" Jan 22 14:25:16 crc kubenswrapper[4743]: I0122 14:25:16.595903 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d89e3f5-7a04-413e-ad3d-3604f2a44f88-catalog-content\") pod \"certified-operators-sghr5\" (UID: \"1d89e3f5-7a04-413e-ad3d-3604f2a44f88\") " pod="openshift-marketplace/certified-operators-sghr5" Jan 22 14:25:16 crc kubenswrapper[4743]: I0122 14:25:16.595972 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4gbg\" (UniqueName: 
\"kubernetes.io/projected/1d89e3f5-7a04-413e-ad3d-3604f2a44f88-kube-api-access-z4gbg\") pod \"certified-operators-sghr5\" (UID: \"1d89e3f5-7a04-413e-ad3d-3604f2a44f88\") " pod="openshift-marketplace/certified-operators-sghr5" Jan 22 14:25:16 crc kubenswrapper[4743]: I0122 14:25:16.596115 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d89e3f5-7a04-413e-ad3d-3604f2a44f88-utilities\") pod \"certified-operators-sghr5\" (UID: \"1d89e3f5-7a04-413e-ad3d-3604f2a44f88\") " pod="openshift-marketplace/certified-operators-sghr5" Jan 22 14:25:16 crc kubenswrapper[4743]: I0122 14:25:16.596430 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d89e3f5-7a04-413e-ad3d-3604f2a44f88-catalog-content\") pod \"certified-operators-sghr5\" (UID: \"1d89e3f5-7a04-413e-ad3d-3604f2a44f88\") " pod="openshift-marketplace/certified-operators-sghr5" Jan 22 14:25:16 crc kubenswrapper[4743]: I0122 14:25:16.596595 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d89e3f5-7a04-413e-ad3d-3604f2a44f88-utilities\") pod \"certified-operators-sghr5\" (UID: \"1d89e3f5-7a04-413e-ad3d-3604f2a44f88\") " pod="openshift-marketplace/certified-operators-sghr5" Jan 22 14:25:16 crc kubenswrapper[4743]: I0122 14:25:16.617708 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4gbg\" (UniqueName: \"kubernetes.io/projected/1d89e3f5-7a04-413e-ad3d-3604f2a44f88-kube-api-access-z4gbg\") pod \"certified-operators-sghr5\" (UID: \"1d89e3f5-7a04-413e-ad3d-3604f2a44f88\") " pod="openshift-marketplace/certified-operators-sghr5" Jan 22 14:25:16 crc kubenswrapper[4743]: I0122 14:25:16.700430 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sghr5" Jan 22 14:25:17 crc kubenswrapper[4743]: I0122 14:25:17.228651 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sghr5"] Jan 22 14:25:17 crc kubenswrapper[4743]: I0122 14:25:17.722115 4743 generic.go:334] "Generic (PLEG): container finished" podID="1d89e3f5-7a04-413e-ad3d-3604f2a44f88" containerID="7fd20b580f7519fc53724dcd7cc3ffd74594a54bc83914942d3fd18016518920" exitCode=0 Jan 22 14:25:17 crc kubenswrapper[4743]: I0122 14:25:17.722211 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sghr5" event={"ID":"1d89e3f5-7a04-413e-ad3d-3604f2a44f88","Type":"ContainerDied","Data":"7fd20b580f7519fc53724dcd7cc3ffd74594a54bc83914942d3fd18016518920"} Jan 22 14:25:17 crc kubenswrapper[4743]: I0122 14:25:17.722651 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sghr5" event={"ID":"1d89e3f5-7a04-413e-ad3d-3604f2a44f88","Type":"ContainerStarted","Data":"21831e54250423fc24c03636ee61d39a48022a66fa0eea3f14ce190fa40bf793"} Jan 22 14:25:17 crc kubenswrapper[4743]: I0122 14:25:17.724468 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 22 14:25:18 crc kubenswrapper[4743]: I0122 14:25:18.731144 4743 generic.go:334] "Generic (PLEG): container finished" podID="1d89e3f5-7a04-413e-ad3d-3604f2a44f88" containerID="b6815a4968f9b4493dfd58e5246a847e19d3985bacccd3309fe60f911d65610a" exitCode=0 Jan 22 14:25:18 crc kubenswrapper[4743]: I0122 14:25:18.731205 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sghr5" event={"ID":"1d89e3f5-7a04-413e-ad3d-3604f2a44f88","Type":"ContainerDied","Data":"b6815a4968f9b4493dfd58e5246a847e19d3985bacccd3309fe60f911d65610a"} Jan 22 14:25:19 crc kubenswrapper[4743]: I0122 14:25:19.767240 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sghr5" event={"ID":"1d89e3f5-7a04-413e-ad3d-3604f2a44f88","Type":"ContainerStarted","Data":"103ce90820261d24bc3c67be80be5639487dab0116d70e580d9fc8518f295063"} Jan 22 14:25:19 crc kubenswrapper[4743]: I0122 14:25:19.792680 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-sghr5" podStartSLOduration=2.283944035 podStartE2EDuration="3.792656983s" podCreationTimestamp="2026-01-22 14:25:16 +0000 UTC" firstStartedPulling="2026-01-22 14:25:17.724153231 +0000 UTC m=+2354.279196404" lastFinishedPulling="2026-01-22 14:25:19.232866189 +0000 UTC m=+2355.787909352" observedRunningTime="2026-01-22 14:25:19.779043057 +0000 UTC m=+2356.334086280" watchObservedRunningTime="2026-01-22 14:25:19.792656983 +0000 UTC m=+2356.347700146" Jan 22 14:25:26 crc kubenswrapper[4743]: I0122 14:25:26.701117 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-sghr5" Jan 22 14:25:26 crc kubenswrapper[4743]: I0122 14:25:26.701551 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sghr5" Jan 22 14:25:26 crc kubenswrapper[4743]: I0122 14:25:26.746354 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-sghr5" Jan 22 14:25:26 crc kubenswrapper[4743]: I0122 14:25:26.747979 4743 scope.go:117] "RemoveContainer" 
containerID="4e338cfb7048ed5b16851b239153eb46a9d70fc49b34c9091316d3e3baa03fe0" Jan 22 14:25:26 crc kubenswrapper[4743]: E0122 14:25:26.748313 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:25:26 crc kubenswrapper[4743]: I0122 14:25:26.871541 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sghr5" Jan 22 14:25:26 crc kubenswrapper[4743]: I0122 14:25:26.980139 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sghr5"] Jan 22 14:25:28 crc kubenswrapper[4743]: I0122 14:25:28.832245 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-sghr5" podUID="1d89e3f5-7a04-413e-ad3d-3604f2a44f88" containerName="registry-server" containerID="cri-o://103ce90820261d24bc3c67be80be5639487dab0116d70e580d9fc8518f295063" gracePeriod=2 Jan 22 14:25:29 crc kubenswrapper[4743]: I0122 14:25:29.793267 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sghr5" Jan 22 14:25:29 crc kubenswrapper[4743]: I0122 14:25:29.842841 4743 generic.go:334] "Generic (PLEG): container finished" podID="1d89e3f5-7a04-413e-ad3d-3604f2a44f88" containerID="103ce90820261d24bc3c67be80be5639487dab0116d70e580d9fc8518f295063" exitCode=0 Jan 22 14:25:29 crc kubenswrapper[4743]: I0122 14:25:29.842895 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sghr5" Jan 22 14:25:29 crc kubenswrapper[4743]: I0122 14:25:29.842916 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sghr5" event={"ID":"1d89e3f5-7a04-413e-ad3d-3604f2a44f88","Type":"ContainerDied","Data":"103ce90820261d24bc3c67be80be5639487dab0116d70e580d9fc8518f295063"} Jan 22 14:25:29 crc kubenswrapper[4743]: I0122 14:25:29.843293 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sghr5" event={"ID":"1d89e3f5-7a04-413e-ad3d-3604f2a44f88","Type":"ContainerDied","Data":"21831e54250423fc24c03636ee61d39a48022a66fa0eea3f14ce190fa40bf793"} Jan 22 14:25:29 crc kubenswrapper[4743]: I0122 14:25:29.843310 4743 scope.go:117] "RemoveContainer" containerID="103ce90820261d24bc3c67be80be5639487dab0116d70e580d9fc8518f295063" Jan 22 14:25:29 crc kubenswrapper[4743]: I0122 14:25:29.864249 4743 scope.go:117] "RemoveContainer" containerID="b6815a4968f9b4493dfd58e5246a847e19d3985bacccd3309fe60f911d65610a" Jan 22 14:25:29 crc kubenswrapper[4743]: I0122 14:25:29.883626 4743 scope.go:117] "RemoveContainer" containerID="7fd20b580f7519fc53724dcd7cc3ffd74594a54bc83914942d3fd18016518920" Jan 22 14:25:29 crc kubenswrapper[4743]: I0122 14:25:29.929759 4743 scope.go:117] "RemoveContainer" containerID="103ce90820261d24bc3c67be80be5639487dab0116d70e580d9fc8518f295063" Jan 22 14:25:29 crc kubenswrapper[4743]: E0122 14:25:29.930153 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"103ce90820261d24bc3c67be80be5639487dab0116d70e580d9fc8518f295063\": container with ID starting with 103ce90820261d24bc3c67be80be5639487dab0116d70e580d9fc8518f295063 not found: ID does not exist" containerID="103ce90820261d24bc3c67be80be5639487dab0116d70e580d9fc8518f295063" Jan 22 14:25:29 crc kubenswrapper[4743]: I0122 14:25:29.930190 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"103ce90820261d24bc3c67be80be5639487dab0116d70e580d9fc8518f295063"} err="failed to get container status \"103ce90820261d24bc3c67be80be5639487dab0116d70e580d9fc8518f295063\": rpc error: code = NotFound desc = could not find container \"103ce90820261d24bc3c67be80be5639487dab0116d70e580d9fc8518f295063\": container with ID starting with 103ce90820261d24bc3c67be80be5639487dab0116d70e580d9fc8518f295063 not found: ID does not exist" Jan 22 14:25:29 crc kubenswrapper[4743]: I0122 14:25:29.930217 4743 scope.go:117] "RemoveContainer" containerID="b6815a4968f9b4493dfd58e5246a847e19d3985bacccd3309fe60f911d65610a" Jan 22 14:25:29 crc kubenswrapper[4743]: E0122 14:25:29.930476 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6815a4968f9b4493dfd58e5246a847e19d3985bacccd3309fe60f911d65610a\": container with ID starting with b6815a4968f9b4493dfd58e5246a847e19d3985bacccd3309fe60f911d65610a not found: ID does not exist" containerID="b6815a4968f9b4493dfd58e5246a847e19d3985bacccd3309fe60f911d65610a" Jan 22 14:25:29 crc kubenswrapper[4743]: I0122 14:25:29.930502 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6815a4968f9b4493dfd58e5246a847e19d3985bacccd3309fe60f911d65610a"} err="failed to get container status \"b6815a4968f9b4493dfd58e5246a847e19d3985bacccd3309fe60f911d65610a\": rpc error: code = NotFound desc = could not find container 
\"b6815a4968f9b4493dfd58e5246a847e19d3985bacccd3309fe60f911d65610a\": container with ID starting with b6815a4968f9b4493dfd58e5246a847e19d3985bacccd3309fe60f911d65610a not found: ID does not exist" Jan 22 14:25:29 crc kubenswrapper[4743]: I0122 14:25:29.930516 4743 scope.go:117] "RemoveContainer" containerID="7fd20b580f7519fc53724dcd7cc3ffd74594a54bc83914942d3fd18016518920" Jan 22 14:25:29 crc kubenswrapper[4743]: E0122 14:25:29.930727 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fd20b580f7519fc53724dcd7cc3ffd74594a54bc83914942d3fd18016518920\": container with ID starting with 7fd20b580f7519fc53724dcd7cc3ffd74594a54bc83914942d3fd18016518920 not found: ID does not exist" containerID="7fd20b580f7519fc53724dcd7cc3ffd74594a54bc83914942d3fd18016518920" Jan 22 14:25:29 crc kubenswrapper[4743]: I0122 14:25:29.930750 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fd20b580f7519fc53724dcd7cc3ffd74594a54bc83914942d3fd18016518920"} err="failed to get container status \"7fd20b580f7519fc53724dcd7cc3ffd74594a54bc83914942d3fd18016518920\": rpc error: code = NotFound desc = could not find container \"7fd20b580f7519fc53724dcd7cc3ffd74594a54bc83914942d3fd18016518920\": container with ID starting with 7fd20b580f7519fc53724dcd7cc3ffd74594a54bc83914942d3fd18016518920 not found: ID does not exist" Jan 22 14:25:29 crc kubenswrapper[4743]: I0122 14:25:29.964710 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4gbg\" (UniqueName: \"kubernetes.io/projected/1d89e3f5-7a04-413e-ad3d-3604f2a44f88-kube-api-access-z4gbg\") pod \"1d89e3f5-7a04-413e-ad3d-3604f2a44f88\" (UID: \"1d89e3f5-7a04-413e-ad3d-3604f2a44f88\") " Jan 22 14:25:29 crc kubenswrapper[4743]: I0122 14:25:29.964821 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d89e3f5-7a04-413e-ad3d-3604f2a44f88-utilities\") pod \"1d89e3f5-7a04-413e-ad3d-3604f2a44f88\" (UID: \"1d89e3f5-7a04-413e-ad3d-3604f2a44f88\") " Jan 22 14:25:29 crc kubenswrapper[4743]: I0122 14:25:29.964977 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d89e3f5-7a04-413e-ad3d-3604f2a44f88-catalog-content\") pod \"1d89e3f5-7a04-413e-ad3d-3604f2a44f88\" (UID: \"1d89e3f5-7a04-413e-ad3d-3604f2a44f88\") " Jan 22 14:25:29 crc kubenswrapper[4743]: I0122 14:25:29.965859 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d89e3f5-7a04-413e-ad3d-3604f2a44f88-utilities" (OuterVolumeSpecName: "utilities") pod "1d89e3f5-7a04-413e-ad3d-3604f2a44f88" (UID: "1d89e3f5-7a04-413e-ad3d-3604f2a44f88"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:25:29 crc kubenswrapper[4743]: I0122 14:25:29.972006 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d89e3f5-7a04-413e-ad3d-3604f2a44f88-kube-api-access-z4gbg" (OuterVolumeSpecName: "kube-api-access-z4gbg") pod "1d89e3f5-7a04-413e-ad3d-3604f2a44f88" (UID: "1d89e3f5-7a04-413e-ad3d-3604f2a44f88"). InnerVolumeSpecName "kube-api-access-z4gbg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:25:30 crc kubenswrapper[4743]: I0122 14:25:30.013551 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d89e3f5-7a04-413e-ad3d-3604f2a44f88-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d89e3f5-7a04-413e-ad3d-3604f2a44f88" (UID: "1d89e3f5-7a04-413e-ad3d-3604f2a44f88"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:25:30 crc kubenswrapper[4743]: I0122 14:25:30.067611 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4gbg\" (UniqueName: \"kubernetes.io/projected/1d89e3f5-7a04-413e-ad3d-3604f2a44f88-kube-api-access-z4gbg\") on node \"crc\" DevicePath \"\"" Jan 22 14:25:30 crc kubenswrapper[4743]: I0122 14:25:30.067657 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d89e3f5-7a04-413e-ad3d-3604f2a44f88-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 14:25:30 crc kubenswrapper[4743]: I0122 14:25:30.067670 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d89e3f5-7a04-413e-ad3d-3604f2a44f88-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 14:25:30 crc kubenswrapper[4743]: I0122 14:25:30.180851 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sghr5"] Jan 22 14:25:30 crc kubenswrapper[4743]: I0122 14:25:30.192945 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-sghr5"] Jan 22 14:25:31 crc kubenswrapper[4743]: I0122 14:25:31.758650 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d89e3f5-7a04-413e-ad3d-3604f2a44f88" path="/var/lib/kubelet/pods/1d89e3f5-7a04-413e-ad3d-3604f2a44f88/volumes" Jan 22 14:25:41 crc kubenswrapper[4743]: I0122 14:25:41.748428 4743 scope.go:117] "RemoveContainer" containerID="4e338cfb7048ed5b16851b239153eb46a9d70fc49b34c9091316d3e3baa03fe0" Jan 22 14:25:41 crc kubenswrapper[4743]: E0122 14:25:41.749104 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:25:53 crc kubenswrapper[4743]: I0122 14:25:53.755266 4743 scope.go:117] "RemoveContainer" containerID="4e338cfb7048ed5b16851b239153eb46a9d70fc49b34c9091316d3e3baa03fe0" Jan 22 14:25:53 crc kubenswrapper[4743]: E0122 14:25:53.756574 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:25:57 crc kubenswrapper[4743]: I0122 14:25:57.099652 4743 generic.go:334] "Generic (PLEG): container finished" podID="5dca488a-cb84-4610-bf38-0f4c65c8b94a" containerID="d16199c83335fa387fbe17092549028627c56453a30de888775cce2588853a44" exitCode=0 Jan 22 14:25:57 crc kubenswrapper[4743]: I0122 
14:25:57.099747 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cvxq2" event={"ID":"5dca488a-cb84-4610-bf38-0f4c65c8b94a","Type":"ContainerDied","Data":"d16199c83335fa387fbe17092549028627c56453a30de888775cce2588853a44"} Jan 22 14:25:58 crc kubenswrapper[4743]: I0122 14:25:58.509458 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cvxq2" Jan 22 14:25:58 crc kubenswrapper[4743]: I0122 14:25:58.630437 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/5dca488a-cb84-4610-bf38-0f4c65c8b94a-libvirt-secret-0\") pod \"5dca488a-cb84-4610-bf38-0f4c65c8b94a\" (UID: \"5dca488a-cb84-4610-bf38-0f4c65c8b94a\") " Jan 22 14:25:58 crc kubenswrapper[4743]: I0122 14:25:58.630499 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5dca488a-cb84-4610-bf38-0f4c65c8b94a-inventory\") pod \"5dca488a-cb84-4610-bf38-0f4c65c8b94a\" (UID: \"5dca488a-cb84-4610-bf38-0f4c65c8b94a\") " Jan 22 14:25:58 crc kubenswrapper[4743]: I0122 14:25:58.630580 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dca488a-cb84-4610-bf38-0f4c65c8b94a-libvirt-combined-ca-bundle\") pod \"5dca488a-cb84-4610-bf38-0f4c65c8b94a\" (UID: \"5dca488a-cb84-4610-bf38-0f4c65c8b94a\") " Jan 22 14:25:58 crc kubenswrapper[4743]: I0122 14:25:58.630632 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5dca488a-cb84-4610-bf38-0f4c65c8b94a-ssh-key-openstack-edpm-ipam\") pod \"5dca488a-cb84-4610-bf38-0f4c65c8b94a\" (UID: \"5dca488a-cb84-4610-bf38-0f4c65c8b94a\") " Jan 22 14:25:58 crc kubenswrapper[4743]: I0122 14:25:58.630650 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snfll\" (UniqueName: \"kubernetes.io/projected/5dca488a-cb84-4610-bf38-0f4c65c8b94a-kube-api-access-snfll\") pod \"5dca488a-cb84-4610-bf38-0f4c65c8b94a\" (UID: \"5dca488a-cb84-4610-bf38-0f4c65c8b94a\") " Jan 22 14:25:58 crc kubenswrapper[4743]: I0122 14:25:58.640094 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dca488a-cb84-4610-bf38-0f4c65c8b94a-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "5dca488a-cb84-4610-bf38-0f4c65c8b94a" (UID: "5dca488a-cb84-4610-bf38-0f4c65c8b94a"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:25:58 crc kubenswrapper[4743]: I0122 14:25:58.652358 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dca488a-cb84-4610-bf38-0f4c65c8b94a-kube-api-access-snfll" (OuterVolumeSpecName: "kube-api-access-snfll") pod "5dca488a-cb84-4610-bf38-0f4c65c8b94a" (UID: "5dca488a-cb84-4610-bf38-0f4c65c8b94a"). InnerVolumeSpecName "kube-api-access-snfll". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:25:58 crc kubenswrapper[4743]: I0122 14:25:58.656971 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dca488a-cb84-4610-bf38-0f4c65c8b94a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "5dca488a-cb84-4610-bf38-0f4c65c8b94a" (UID: "5dca488a-cb84-4610-bf38-0f4c65c8b94a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:25:58 crc kubenswrapper[4743]: I0122 14:25:58.660339 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dca488a-cb84-4610-bf38-0f4c65c8b94a-inventory" (OuterVolumeSpecName: "inventory") pod "5dca488a-cb84-4610-bf38-0f4c65c8b94a" (UID: "5dca488a-cb84-4610-bf38-0f4c65c8b94a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:25:58 crc kubenswrapper[4743]: I0122 14:25:58.663314 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dca488a-cb84-4610-bf38-0f4c65c8b94a-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "5dca488a-cb84-4610-bf38-0f4c65c8b94a" (UID: "5dca488a-cb84-4610-bf38-0f4c65c8b94a"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:25:58 crc kubenswrapper[4743]: I0122 14:25:58.732760 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5dca488a-cb84-4610-bf38-0f4c65c8b94a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 22 14:25:58 crc kubenswrapper[4743]: I0122 14:25:58.732803 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snfll\" (UniqueName: \"kubernetes.io/projected/5dca488a-cb84-4610-bf38-0f4c65c8b94a-kube-api-access-snfll\") on node \"crc\" DevicePath \"\"" Jan 22 14:25:58 crc kubenswrapper[4743]: I0122 14:25:58.732813 4743 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/5dca488a-cb84-4610-bf38-0f4c65c8b94a-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Jan 22 14:25:58 crc kubenswrapper[4743]: I0122 14:25:58.732822 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5dca488a-cb84-4610-bf38-0f4c65c8b94a-inventory\") on node \"crc\" DevicePath \"\"" Jan 22 14:25:58 crc kubenswrapper[4743]: I0122 14:25:58.732831 4743 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dca488a-cb84-4610-bf38-0f4c65c8b94a-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 14:25:59 crc kubenswrapper[4743]: I0122 14:25:59.116485 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cvxq2" event={"ID":"5dca488a-cb84-4610-bf38-0f4c65c8b94a","Type":"ContainerDied","Data":"0f54885718f93b43630eea9996966d0126aa2be7e233f2609ffb71a50619132e"} Jan 22 14:25:59 crc kubenswrapper[4743]: I0122 14:25:59.116532 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f54885718f93b43630eea9996966d0126aa2be7e233f2609ffb71a50619132e" Jan 22 14:25:59 crc kubenswrapper[4743]: I0122 14:25:59.116605 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cvxq2" Jan 22 14:25:59 crc kubenswrapper[4743]: I0122 14:25:59.212478 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-jjllf"] Jan 22 14:25:59 crc kubenswrapper[4743]: E0122 14:25:59.212932 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d89e3f5-7a04-413e-ad3d-3604f2a44f88" containerName="registry-server" Jan 22 14:25:59 crc kubenswrapper[4743]: I0122 14:25:59.212953 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d89e3f5-7a04-413e-ad3d-3604f2a44f88" containerName="registry-server" Jan 22 14:25:59 crc kubenswrapper[4743]: E0122 14:25:59.212976 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d89e3f5-7a04-413e-ad3d-3604f2a44f88" containerName="extract-utilities" Jan 22 14:25:59 crc kubenswrapper[4743]: I0122 14:25:59.212985 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d89e3f5-7a04-413e-ad3d-3604f2a44f88" containerName="extract-utilities" Jan 22 14:25:59 crc kubenswrapper[4743]: E0122 14:25:59.212999 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dca488a-cb84-4610-bf38-0f4c65c8b94a" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Jan 22 14:25:59 crc kubenswrapper[4743]: I0122 14:25:59.213008 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dca488a-cb84-4610-bf38-0f4c65c8b94a" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Jan 22 14:25:59 crc kubenswrapper[4743]: E0122 14:25:59.213038 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d89e3f5-7a04-413e-ad3d-3604f2a44f88" containerName="extract-content" Jan 22 14:25:59 crc kubenswrapper[4743]: I0122 14:25:59.213045 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d89e3f5-7a04-413e-ad3d-3604f2a44f88" containerName="extract-content" Jan 22 14:25:59 crc kubenswrapper[4743]: I0122 14:25:59.213232 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d89e3f5-7a04-413e-ad3d-3604f2a44f88" containerName="registry-server" Jan 22 14:25:59 crc kubenswrapper[4743]: I0122 14:25:59.213258 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dca488a-cb84-4610-bf38-0f4c65c8b94a" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Jan 22 14:25:59 crc kubenswrapper[4743]: I0122 14:25:59.214656 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jjllf" Jan 22 14:25:59 crc kubenswrapper[4743]: I0122 14:25:59.216526 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 22 14:25:59 crc kubenswrapper[4743]: I0122 14:25:59.217093 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 22 14:25:59 crc kubenswrapper[4743]: I0122 14:25:59.217230 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 22 14:25:59 crc kubenswrapper[4743]: I0122 14:25:59.218721 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Jan 22 14:25:59 crc kubenswrapper[4743]: I0122 14:25:59.218766 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Jan 22 14:25:59 crc kubenswrapper[4743]: I0122 14:25:59.219158 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8mcmm" Jan 22 14:25:59 crc kubenswrapper[4743]: I0122 14:25:59.222415 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Jan 22 14:25:59 crc kubenswrapper[4743]: I0122 14:25:59.230489 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-jjllf"] Jan 22 14:25:59 crc kubenswrapper[4743]: I0122 14:25:59.342903 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/a546459d-e713-453e-adbd-c3b9f8c7b961-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jjllf\" (UID: \"a546459d-e713-453e-adbd-c3b9f8c7b961\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jjllf" Jan 22 14:25:59 crc kubenswrapper[4743]: I0122 14:25:59.342949 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a546459d-e713-453e-adbd-c3b9f8c7b961-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jjllf\" (UID: \"a546459d-e713-453e-adbd-c3b9f8c7b961\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jjllf" Jan 22 14:25:59 crc kubenswrapper[4743]: I0122 14:25:59.343000 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/a546459d-e713-453e-adbd-c3b9f8c7b961-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jjllf\" (UID: \"a546459d-e713-453e-adbd-c3b9f8c7b961\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jjllf" Jan 22 14:25:59 crc kubenswrapper[4743]: I0122 14:25:59.343019 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/a546459d-e713-453e-adbd-c3b9f8c7b961-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jjllf\" (UID: \"a546459d-e713-453e-adbd-c3b9f8c7b961\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jjllf" Jan 22 14:25:59 crc kubenswrapper[4743]: I0122 14:25:59.343044 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: 
\"kubernetes.io/secret/a546459d-e713-453e-adbd-c3b9f8c7b961-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jjllf\" (UID: \"a546459d-e713-453e-adbd-c3b9f8c7b961\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jjllf" Jan 22 14:25:59 crc kubenswrapper[4743]: I0122 14:25:59.343062 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a546459d-e713-453e-adbd-c3b9f8c7b961-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jjllf\" (UID: \"a546459d-e713-453e-adbd-c3b9f8c7b961\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jjllf" Jan 22 14:25:59 crc kubenswrapper[4743]: I0122 14:25:59.343091 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbhzh\" (UniqueName: \"kubernetes.io/projected/a546459d-e713-453e-adbd-c3b9f8c7b961-kube-api-access-rbhzh\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jjllf\" (UID: \"a546459d-e713-453e-adbd-c3b9f8c7b961\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jjllf" Jan 22 14:25:59 crc kubenswrapper[4743]: I0122 14:25:59.343132 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a546459d-e713-453e-adbd-c3b9f8c7b961-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jjllf\" (UID: \"a546459d-e713-453e-adbd-c3b9f8c7b961\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jjllf" Jan 22 14:25:59 crc kubenswrapper[4743]: I0122 14:25:59.343149 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/a546459d-e713-453e-adbd-c3b9f8c7b961-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jjllf\" (UID: \"a546459d-e713-453e-adbd-c3b9f8c7b961\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jjllf" Jan 22 14:25:59 crc kubenswrapper[4743]: I0122 14:25:59.444603 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/a546459d-e713-453e-adbd-c3b9f8c7b961-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jjllf\" (UID: \"a546459d-e713-453e-adbd-c3b9f8c7b961\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jjllf" Jan 22 14:25:59 crc kubenswrapper[4743]: I0122 14:25:59.445248 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/a546459d-e713-453e-adbd-c3b9f8c7b961-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jjllf\" (UID: \"a546459d-e713-453e-adbd-c3b9f8c7b961\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jjllf" Jan 22 14:25:59 crc kubenswrapper[4743]: I0122 14:25:59.445378 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a546459d-e713-453e-adbd-c3b9f8c7b961-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jjllf\" (UID: \"a546459d-e713-453e-adbd-c3b9f8c7b961\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jjllf" Jan 22 14:25:59 crc kubenswrapper[4743]: I0122 14:25:59.445534 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbhzh\" 
(UniqueName: \"kubernetes.io/projected/a546459d-e713-453e-adbd-c3b9f8c7b961-kube-api-access-rbhzh\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jjllf\" (UID: \"a546459d-e713-453e-adbd-c3b9f8c7b961\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jjllf" Jan 22 14:25:59 crc kubenswrapper[4743]: I0122 14:25:59.445748 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a546459d-e713-453e-adbd-c3b9f8c7b961-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jjllf\" (UID: \"a546459d-e713-453e-adbd-c3b9f8c7b961\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jjllf" Jan 22 14:25:59 crc kubenswrapper[4743]: I0122 14:25:59.445890 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/a546459d-e713-453e-adbd-c3b9f8c7b961-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jjllf\" (UID: \"a546459d-e713-453e-adbd-c3b9f8c7b961\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jjllf" Jan 22 14:25:59 crc kubenswrapper[4743]: I0122 14:25:59.446221 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/a546459d-e713-453e-adbd-c3b9f8c7b961-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jjllf\" (UID: \"a546459d-e713-453e-adbd-c3b9f8c7b961\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jjllf" Jan 22 14:25:59 crc kubenswrapper[4743]: I0122 14:25:59.446405 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a546459d-e713-453e-adbd-c3b9f8c7b961-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jjllf\" (UID: \"a546459d-e713-453e-adbd-c3b9f8c7b961\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jjllf" Jan 22 14:25:59 crc kubenswrapper[4743]: I0122 14:25:59.446615 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/a546459d-e713-453e-adbd-c3b9f8c7b961-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jjllf\" (UID: \"a546459d-e713-453e-adbd-c3b9f8c7b961\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jjllf" Jan 22 14:25:59 crc kubenswrapper[4743]: I0122 14:25:59.447727 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/a546459d-e713-453e-adbd-c3b9f8c7b961-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jjllf\" (UID: \"a546459d-e713-453e-adbd-c3b9f8c7b961\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jjllf" Jan 22 14:25:59 crc kubenswrapper[4743]: I0122 14:25:59.449870 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a546459d-e713-453e-adbd-c3b9f8c7b961-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jjllf\" (UID: \"a546459d-e713-453e-adbd-c3b9f8c7b961\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jjllf" Jan 22 14:25:59 crc kubenswrapper[4743]: I0122 14:25:59.450102 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: 
\"kubernetes.io/secret/a546459d-e713-453e-adbd-c3b9f8c7b961-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jjllf\" (UID: \"a546459d-e713-453e-adbd-c3b9f8c7b961\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jjllf" Jan 22 14:25:59 crc kubenswrapper[4743]: I0122 14:25:59.450304 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a546459d-e713-453e-adbd-c3b9f8c7b961-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jjllf\" (UID: \"a546459d-e713-453e-adbd-c3b9f8c7b961\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jjllf" Jan 22 14:25:59 crc kubenswrapper[4743]: I0122 14:25:59.450588 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/a546459d-e713-453e-adbd-c3b9f8c7b961-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jjllf\" (UID: \"a546459d-e713-453e-adbd-c3b9f8c7b961\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jjllf" Jan 22 14:25:59 crc kubenswrapper[4743]: I0122 14:25:59.451574 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/a546459d-e713-453e-adbd-c3b9f8c7b961-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jjllf\" (UID: \"a546459d-e713-453e-adbd-c3b9f8c7b961\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jjllf" Jan 22 14:25:59 crc kubenswrapper[4743]: I0122 14:25:59.451747 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a546459d-e713-453e-adbd-c3b9f8c7b961-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jjllf\" (UID: \"a546459d-e713-453e-adbd-c3b9f8c7b961\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jjllf" Jan 22 14:25:59 crc kubenswrapper[4743]: I0122 14:25:59.453155 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/a546459d-e713-453e-adbd-c3b9f8c7b961-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jjllf\" (UID: \"a546459d-e713-453e-adbd-c3b9f8c7b961\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jjllf" Jan 22 14:25:59 crc kubenswrapper[4743]: I0122 14:25:59.462374 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbhzh\" (UniqueName: \"kubernetes.io/projected/a546459d-e713-453e-adbd-c3b9f8c7b961-kube-api-access-rbhzh\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jjllf\" (UID: \"a546459d-e713-453e-adbd-c3b9f8c7b961\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jjllf" Jan 22 14:25:59 crc kubenswrapper[4743]: I0122 14:25:59.533853 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jjllf" Jan 22 14:26:00 crc kubenswrapper[4743]: I0122 14:26:00.056513 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-jjllf"] Jan 22 14:26:00 crc kubenswrapper[4743]: I0122 14:26:00.127295 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jjllf" event={"ID":"a546459d-e713-453e-adbd-c3b9f8c7b961","Type":"ContainerStarted","Data":"43da8fff582fa0645d1e871fe036fb5923146ec3d915ba8bb58c4597bed17a88"} Jan 22 14:26:01 crc kubenswrapper[4743]: I0122 14:26:01.165840 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jjllf" event={"ID":"a546459d-e713-453e-adbd-c3b9f8c7b961","Type":"ContainerStarted","Data":"234882eec363494e87446d1b157999cf3a71b2de2e6a2f01b3cc427489142407"} Jan 22 14:26:01 crc kubenswrapper[4743]: I0122 14:26:01.187817 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jjllf" podStartSLOduration=1.7098684259999999 podStartE2EDuration="2.187782992s" podCreationTimestamp="2026-01-22 14:25:59 +0000 UTC" firstStartedPulling="2026-01-22 14:26:00.055833631 +0000 UTC m=+2396.610876794" lastFinishedPulling="2026-01-22 14:26:00.533748187 +0000 UTC m=+2397.088791360" observedRunningTime="2026-01-22 14:26:01.186238161 +0000 UTC m=+2397.741281324" watchObservedRunningTime="2026-01-22 14:26:01.187782992 +0000 UTC m=+2397.742826155" Jan 22 14:26:05 crc kubenswrapper[4743]: I0122 14:26:05.747218 4743 scope.go:117] "RemoveContainer" containerID="4e338cfb7048ed5b16851b239153eb46a9d70fc49b34c9091316d3e3baa03fe0" Jan 22 14:26:05 crc kubenswrapper[4743]: E0122 14:26:05.748115 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:26:18 crc kubenswrapper[4743]: I0122 14:26:18.747888 4743 scope.go:117] "RemoveContainer" containerID="4e338cfb7048ed5b16851b239153eb46a9d70fc49b34c9091316d3e3baa03fe0" Jan 22 14:26:18 crc kubenswrapper[4743]: E0122 14:26:18.748701 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:26:29 crc kubenswrapper[4743]: I0122 14:26:29.747420 4743 scope.go:117] "RemoveContainer" containerID="4e338cfb7048ed5b16851b239153eb46a9d70fc49b34c9091316d3e3baa03fe0" Jan 22 14:26:29 crc kubenswrapper[4743]: E0122 14:26:29.748210 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:26:40 crc kubenswrapper[4743]: I0122 14:26:40.747994 4743 scope.go:117] "RemoveContainer" containerID="4e338cfb7048ed5b16851b239153eb46a9d70fc49b34c9091316d3e3baa03fe0" Jan 22 14:26:40 crc kubenswrapper[4743]: E0122 14:26:40.749361 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:26:51 crc kubenswrapper[4743]: I0122 14:26:51.748224 4743 scope.go:117] "RemoveContainer" containerID="4e338cfb7048ed5b16851b239153eb46a9d70fc49b34c9091316d3e3baa03fe0" Jan 22 14:26:51 crc kubenswrapper[4743]: E0122 14:26:51.749222 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:27:03 crc kubenswrapper[4743]: I0122 14:27:03.754585 4743 scope.go:117] "RemoveContainer" containerID="4e338cfb7048ed5b16851b239153eb46a9d70fc49b34c9091316d3e3baa03fe0" Jan 22 14:27:04 crc kubenswrapper[4743]: I0122 14:27:04.711361 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" event={"ID":"3aeba9ba-3a5a-4885-8540-d295aadb311b","Type":"ContainerStarted","Data":"61bf6edec840dc0b1f7bf3e3135ac897251d422ba0aaf1a740277a160551f688"} Jan 22 14:28:28 crc kubenswrapper[4743]: I0122 14:28:28.542343 4743 generic.go:334] "Generic (PLEG): container finished" podID="a546459d-e713-453e-adbd-c3b9f8c7b961" containerID="234882eec363494e87446d1b157999cf3a71b2de2e6a2f01b3cc427489142407" exitCode=0 Jan 22 14:28:28 crc kubenswrapper[4743]: I0122 14:28:28.542482 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jjllf" event={"ID":"a546459d-e713-453e-adbd-c3b9f8c7b961","Type":"ContainerDied","Data":"234882eec363494e87446d1b157999cf3a71b2de2e6a2f01b3cc427489142407"} Jan 22 14:28:29 crc kubenswrapper[4743]: I0122 14:28:29.939311 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jjllf" Jan 22 14:28:30 crc kubenswrapper[4743]: I0122 14:28:30.048317 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a546459d-e713-453e-adbd-c3b9f8c7b961-inventory\") pod \"a546459d-e713-453e-adbd-c3b9f8c7b961\" (UID: \"a546459d-e713-453e-adbd-c3b9f8c7b961\") " Jan 22 14:28:30 crc kubenswrapper[4743]: I0122 14:28:30.048609 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/a546459d-e713-453e-adbd-c3b9f8c7b961-nova-extra-config-0\") pod \"a546459d-e713-453e-adbd-c3b9f8c7b961\" (UID: \"a546459d-e713-453e-adbd-c3b9f8c7b961\") " Jan 22 14:28:30 crc kubenswrapper[4743]: I0122 14:28:30.048657 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/a546459d-e713-453e-adbd-c3b9f8c7b961-nova-migration-ssh-key-1\") pod \"a546459d-e713-453e-adbd-c3b9f8c7b961\" (UID: \"a546459d-e713-453e-adbd-c3b9f8c7b961\") " Jan 22 14:28:30 crc kubenswrapper[4743]: I0122 14:28:30.048689 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/a546459d-e713-453e-adbd-c3b9f8c7b961-nova-migration-ssh-key-0\") pod \"a546459d-e713-453e-adbd-c3b9f8c7b961\" (UID: \"a546459d-e713-453e-adbd-c3b9f8c7b961\") " Jan 22 14:28:30 crc kubenswrapper[4743]: I0122 14:28:30.048775 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/a546459d-e713-453e-adbd-c3b9f8c7b961-nova-cell1-compute-config-1\") pod \"a546459d-e713-453e-adbd-c3b9f8c7b961\" (UID: \"a546459d-e713-453e-adbd-c3b9f8c7b961\") " Jan 22 14:28:30 crc kubenswrapper[4743]: I0122 14:28:30.048904 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/a546459d-e713-453e-adbd-c3b9f8c7b961-nova-cell1-compute-config-0\") pod \"a546459d-e713-453e-adbd-c3b9f8c7b961\" (UID: \"a546459d-e713-453e-adbd-c3b9f8c7b961\") " Jan 22 14:28:30 crc kubenswrapper[4743]: I0122 14:28:30.049032 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a546459d-e713-453e-adbd-c3b9f8c7b961-ssh-key-openstack-edpm-ipam\") pod \"a546459d-e713-453e-adbd-c3b9f8c7b961\" (UID: \"a546459d-e713-453e-adbd-c3b9f8c7b961\") " Jan 22 14:28:30 crc kubenswrapper[4743]: I0122 14:28:30.049071 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a546459d-e713-453e-adbd-c3b9f8c7b961-nova-combined-ca-bundle\") pod \"a546459d-e713-453e-adbd-c3b9f8c7b961\" (UID: \"a546459d-e713-453e-adbd-c3b9f8c7b961\") " Jan 22 14:28:30 crc kubenswrapper[4743]: I0122 14:28:30.049115 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbhzh\" (UniqueName: \"kubernetes.io/projected/a546459d-e713-453e-adbd-c3b9f8c7b961-kube-api-access-rbhzh\") pod \"a546459d-e713-453e-adbd-c3b9f8c7b961\" (UID: \"a546459d-e713-453e-adbd-c3b9f8c7b961\") " Jan 22 14:28:30 crc kubenswrapper[4743]: I0122 14:28:30.056038 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/a546459d-e713-453e-adbd-c3b9f8c7b961-kube-api-access-rbhzh" (OuterVolumeSpecName: "kube-api-access-rbhzh") pod "a546459d-e713-453e-adbd-c3b9f8c7b961" (UID: "a546459d-e713-453e-adbd-c3b9f8c7b961"). InnerVolumeSpecName "kube-api-access-rbhzh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:28:30 crc kubenswrapper[4743]: I0122 14:28:30.066184 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a546459d-e713-453e-adbd-c3b9f8c7b961-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "a546459d-e713-453e-adbd-c3b9f8c7b961" (UID: "a546459d-e713-453e-adbd-c3b9f8c7b961"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:28:30 crc kubenswrapper[4743]: I0122 14:28:30.080170 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a546459d-e713-453e-adbd-c3b9f8c7b961-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "a546459d-e713-453e-adbd-c3b9f8c7b961" (UID: "a546459d-e713-453e-adbd-c3b9f8c7b961"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:28:30 crc kubenswrapper[4743]: I0122 14:28:30.080315 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a546459d-e713-453e-adbd-c3b9f8c7b961-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "a546459d-e713-453e-adbd-c3b9f8c7b961" (UID: "a546459d-e713-453e-adbd-c3b9f8c7b961"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:28:30 crc kubenswrapper[4743]: I0122 14:28:30.082763 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a546459d-e713-453e-adbd-c3b9f8c7b961-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "a546459d-e713-453e-adbd-c3b9f8c7b961" (UID: "a546459d-e713-453e-adbd-c3b9f8c7b961"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:28:30 crc kubenswrapper[4743]: I0122 14:28:30.086775 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a546459d-e713-453e-adbd-c3b9f8c7b961-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "a546459d-e713-453e-adbd-c3b9f8c7b961" (UID: "a546459d-e713-453e-adbd-c3b9f8c7b961"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:28:30 crc kubenswrapper[4743]: I0122 14:28:30.087002 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a546459d-e713-453e-adbd-c3b9f8c7b961-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a546459d-e713-453e-adbd-c3b9f8c7b961" (UID: "a546459d-e713-453e-adbd-c3b9f8c7b961"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:28:30 crc kubenswrapper[4743]: I0122 14:28:30.090570 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a546459d-e713-453e-adbd-c3b9f8c7b961-inventory" (OuterVolumeSpecName: "inventory") pod "a546459d-e713-453e-adbd-c3b9f8c7b961" (UID: "a546459d-e713-453e-adbd-c3b9f8c7b961"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:28:30 crc kubenswrapper[4743]: I0122 14:28:30.092651 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a546459d-e713-453e-adbd-c3b9f8c7b961-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "a546459d-e713-453e-adbd-c3b9f8c7b961" (UID: "a546459d-e713-453e-adbd-c3b9f8c7b961"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:28:30 crc kubenswrapper[4743]: I0122 14:28:30.150852 4743 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/a546459d-e713-453e-adbd-c3b9f8c7b961-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Jan 22 14:28:30 crc kubenswrapper[4743]: I0122 14:28:30.150890 4743 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/a546459d-e713-453e-adbd-c3b9f8c7b961-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Jan 22 14:28:30 crc kubenswrapper[4743]: I0122 14:28:30.150902 4743 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/a546459d-e713-453e-adbd-c3b9f8c7b961-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Jan 22 14:28:30 crc kubenswrapper[4743]: I0122 14:28:30.150914 4743 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/a546459d-e713-453e-adbd-c3b9f8c7b961-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Jan 22 14:28:30 crc kubenswrapper[4743]: I0122 14:28:30.150925 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a546459d-e713-453e-adbd-c3b9f8c7b961-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 22 14:28:30 crc kubenswrapper[4743]: I0122 14:28:30.150935 4743 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a546459d-e713-453e-adbd-c3b9f8c7b961-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 14:28:30 crc kubenswrapper[4743]: I0122 14:28:30.150947 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbhzh\" (UniqueName: \"kubernetes.io/projected/a546459d-e713-453e-adbd-c3b9f8c7b961-kube-api-access-rbhzh\") on node \"crc\" DevicePath \"\"" Jan 22 14:28:30 crc kubenswrapper[4743]: I0122 14:28:30.150959 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a546459d-e713-453e-adbd-c3b9f8c7b961-inventory\") on node \"crc\" DevicePath \"\"" Jan 22 14:28:30 crc kubenswrapper[4743]: I0122 14:28:30.150971 4743 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/a546459d-e713-453e-adbd-c3b9f8c7b961-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Jan 22 14:28:30 crc kubenswrapper[4743]: I0122 14:28:30.571596 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jjllf" event={"ID":"a546459d-e713-453e-adbd-c3b9f8c7b961","Type":"ContainerDied","Data":"43da8fff582fa0645d1e871fe036fb5923146ec3d915ba8bb58c4597bed17a88"} Jan 22 14:28:30 crc kubenswrapper[4743]: I0122 14:28:30.571635 4743 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="43da8fff582fa0645d1e871fe036fb5923146ec3d915ba8bb58c4597bed17a88" Jan 22 14:28:30 crc kubenswrapper[4743]: I0122 14:28:30.571682 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jjllf" Jan 22 14:28:30 crc kubenswrapper[4743]: I0122 14:28:30.662421 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5v5f6"] Jan 22 14:28:30 crc kubenswrapper[4743]: E0122 14:28:30.662878 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a546459d-e713-453e-adbd-c3b9f8c7b961" containerName="nova-edpm-deployment-openstack-edpm-ipam" Jan 22 14:28:30 crc kubenswrapper[4743]: I0122 14:28:30.662900 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="a546459d-e713-453e-adbd-c3b9f8c7b961" containerName="nova-edpm-deployment-openstack-edpm-ipam" Jan 22 14:28:30 crc kubenswrapper[4743]: I0122 14:28:30.663121 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="a546459d-e713-453e-adbd-c3b9f8c7b961" containerName="nova-edpm-deployment-openstack-edpm-ipam" Jan 22 14:28:30 crc kubenswrapper[4743]: I0122 14:28:30.663843 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5v5f6" Jan 22 14:28:30 crc kubenswrapper[4743]: I0122 14:28:30.665640 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8mcmm" Jan 22 14:28:30 crc kubenswrapper[4743]: I0122 14:28:30.665867 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Jan 22 14:28:30 crc kubenswrapper[4743]: I0122 14:28:30.666175 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 22 14:28:30 crc kubenswrapper[4743]: I0122 14:28:30.666337 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 22 14:28:30 crc kubenswrapper[4743]: I0122 14:28:30.666499 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 22 14:28:30 crc kubenswrapper[4743]: I0122 14:28:30.679324 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5v5f6"] Jan 22 14:28:30 crc kubenswrapper[4743]: I0122 14:28:30.762994 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/65113c72-73df-4a17-b923-60f9da824feb-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5v5f6\" (UID: \"65113c72-73df-4a17-b923-60f9da824feb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5v5f6" Jan 22 14:28:30 crc kubenswrapper[4743]: I0122 14:28:30.763080 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/65113c72-73df-4a17-b923-60f9da824feb-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5v5f6\" (UID: \"65113c72-73df-4a17-b923-60f9da824feb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5v5f6" Jan 22 14:28:30 crc kubenswrapper[4743]: I0122 14:28:30.763224 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-s6jn2\" (UniqueName: \"kubernetes.io/projected/65113c72-73df-4a17-b923-60f9da824feb-kube-api-access-s6jn2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5v5f6\" (UID: \"65113c72-73df-4a17-b923-60f9da824feb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5v5f6" Jan 22 14:28:30 crc kubenswrapper[4743]: I0122 14:28:30.763395 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/65113c72-73df-4a17-b923-60f9da824feb-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5v5f6\" (UID: \"65113c72-73df-4a17-b923-60f9da824feb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5v5f6" Jan 22 14:28:30 crc kubenswrapper[4743]: I0122 14:28:30.763486 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/65113c72-73df-4a17-b923-60f9da824feb-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5v5f6\" (UID: \"65113c72-73df-4a17-b923-60f9da824feb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5v5f6" Jan 22 14:28:30 crc kubenswrapper[4743]: I0122 14:28:30.763558 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65113c72-73df-4a17-b923-60f9da824feb-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5v5f6\" (UID: \"65113c72-73df-4a17-b923-60f9da824feb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5v5f6" Jan 22 14:28:30 crc kubenswrapper[4743]: I0122 14:28:30.763621 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/65113c72-73df-4a17-b923-60f9da824feb-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5v5f6\" (UID: \"65113c72-73df-4a17-b923-60f9da824feb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5v5f6" Jan 22 14:28:30 crc kubenswrapper[4743]: I0122 14:28:30.865706 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/65113c72-73df-4a17-b923-60f9da824feb-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5v5f6\" (UID: \"65113c72-73df-4a17-b923-60f9da824feb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5v5f6" Jan 22 14:28:30 crc kubenswrapper[4743]: I0122 14:28:30.866062 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6jn2\" (UniqueName: \"kubernetes.io/projected/65113c72-73df-4a17-b923-60f9da824feb-kube-api-access-s6jn2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5v5f6\" (UID: \"65113c72-73df-4a17-b923-60f9da824feb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5v5f6" Jan 22 14:28:30 crc kubenswrapper[4743]: I0122 14:28:30.866218 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/65113c72-73df-4a17-b923-60f9da824feb-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5v5f6\" (UID: \"65113c72-73df-4a17-b923-60f9da824feb\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5v5f6" Jan 22 14:28:30 crc kubenswrapper[4743]: I0122 14:28:30.866307 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/65113c72-73df-4a17-b923-60f9da824feb-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5v5f6\" (UID: \"65113c72-73df-4a17-b923-60f9da824feb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5v5f6" Jan 22 14:28:30 crc kubenswrapper[4743]: I0122 14:28:30.866480 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65113c72-73df-4a17-b923-60f9da824feb-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5v5f6\" (UID: \"65113c72-73df-4a17-b923-60f9da824feb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5v5f6" Jan 22 14:28:30 crc kubenswrapper[4743]: I0122 14:28:30.866948 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/65113c72-73df-4a17-b923-60f9da824feb-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5v5f6\" (UID: \"65113c72-73df-4a17-b923-60f9da824feb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5v5f6" Jan 22 14:28:30 crc kubenswrapper[4743]: I0122 14:28:30.867125 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/65113c72-73df-4a17-b923-60f9da824feb-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5v5f6\" (UID: \"65113c72-73df-4a17-b923-60f9da824feb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5v5f6" Jan 22 14:28:30 crc kubenswrapper[4743]: I0122 14:28:30.872449 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/65113c72-73df-4a17-b923-60f9da824feb-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5v5f6\" (UID: \"65113c72-73df-4a17-b923-60f9da824feb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5v5f6" Jan 22 14:28:30 crc kubenswrapper[4743]: I0122 14:28:30.872781 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65113c72-73df-4a17-b923-60f9da824feb-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5v5f6\" (UID: \"65113c72-73df-4a17-b923-60f9da824feb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5v5f6" Jan 22 14:28:30 crc kubenswrapper[4743]: I0122 14:28:30.873193 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/65113c72-73df-4a17-b923-60f9da824feb-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5v5f6\" (UID: \"65113c72-73df-4a17-b923-60f9da824feb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5v5f6" Jan 22 14:28:30 crc kubenswrapper[4743]: I0122 14:28:30.873189 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/65113c72-73df-4a17-b923-60f9da824feb-ceilometer-compute-config-data-0\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-5v5f6\" (UID: \"65113c72-73df-4a17-b923-60f9da824feb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5v5f6" Jan 22 14:28:30 crc kubenswrapper[4743]: I0122 14:28:30.874096 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/65113c72-73df-4a17-b923-60f9da824feb-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5v5f6\" (UID: \"65113c72-73df-4a17-b923-60f9da824feb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5v5f6" Jan 22 14:28:30 crc kubenswrapper[4743]: I0122 14:28:30.874150 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/65113c72-73df-4a17-b923-60f9da824feb-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5v5f6\" (UID: \"65113c72-73df-4a17-b923-60f9da824feb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5v5f6" Jan 22 14:28:30 crc kubenswrapper[4743]: I0122 14:28:30.890447 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6jn2\" (UniqueName: \"kubernetes.io/projected/65113c72-73df-4a17-b923-60f9da824feb-kube-api-access-s6jn2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5v5f6\" (UID: \"65113c72-73df-4a17-b923-60f9da824feb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5v5f6" Jan 22 14:28:30 crc kubenswrapper[4743]: I0122 14:28:30.985677 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5v5f6" Jan 22 14:28:31 crc kubenswrapper[4743]: I0122 14:28:31.497027 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5v5f6"] Jan 22 14:28:31 crc kubenswrapper[4743]: I0122 14:28:31.580241 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5v5f6" event={"ID":"65113c72-73df-4a17-b923-60f9da824feb","Type":"ContainerStarted","Data":"bc2e1fe42add0def2ec7b24caeb12fde41e9c7f3a6856b28f64df6e9bb8de379"} Jan 22 14:28:32 crc kubenswrapper[4743]: I0122 14:28:32.605599 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5v5f6" event={"ID":"65113c72-73df-4a17-b923-60f9da824feb","Type":"ContainerStarted","Data":"adb380472614e282521eaee24c9b41f1ee7dee6faefac9e57f67c26efbde9fd2"} Jan 22 14:28:32 crc kubenswrapper[4743]: I0122 14:28:32.628087 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5v5f6" podStartSLOduration=2.203841789 podStartE2EDuration="2.628065992s" podCreationTimestamp="2026-01-22 14:28:30 +0000 UTC" firstStartedPulling="2026-01-22 14:28:31.501065563 +0000 UTC m=+2548.056108716" lastFinishedPulling="2026-01-22 14:28:31.925289756 +0000 UTC m=+2548.480332919" observedRunningTime="2026-01-22 14:28:32.623542501 +0000 UTC m=+2549.178585684" watchObservedRunningTime="2026-01-22 14:28:32.628065992 +0000 UTC m=+2549.183109155" Jan 22 14:29:11 crc kubenswrapper[4743]: E0122 14:29:11.004920 4743 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/system.slice/rpm-ostreed.service\": RecentStats: unable to find data in memory cache]" Jan 22 14:29:21 crc kubenswrapper[4743]: I0122 14:29:21.696624 4743 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tcfdt"] Jan 22 14:29:21 crc kubenswrapper[4743]: I0122 14:29:21.699010 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tcfdt" Jan 22 14:29:21 crc kubenswrapper[4743]: I0122 14:29:21.711876 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tcfdt"] Jan 22 14:29:21 crc kubenswrapper[4743]: I0122 14:29:21.741384 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77440e96-59f2-42e1-a73e-6a94bef2c67a-catalog-content\") pod \"redhat-operators-tcfdt\" (UID: \"77440e96-59f2-42e1-a73e-6a94bef2c67a\") " pod="openshift-marketplace/redhat-operators-tcfdt" Jan 22 14:29:21 crc kubenswrapper[4743]: I0122 14:29:21.741482 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77440e96-59f2-42e1-a73e-6a94bef2c67a-utilities\") pod \"redhat-operators-tcfdt\" (UID: \"77440e96-59f2-42e1-a73e-6a94bef2c67a\") " pod="openshift-marketplace/redhat-operators-tcfdt" Jan 22 14:29:21 crc kubenswrapper[4743]: I0122 14:29:21.741556 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s24dt\" (UniqueName: \"kubernetes.io/projected/77440e96-59f2-42e1-a73e-6a94bef2c67a-kube-api-access-s24dt\") pod \"redhat-operators-tcfdt\" (UID: \"77440e96-59f2-42e1-a73e-6a94bef2c67a\") " pod="openshift-marketplace/redhat-operators-tcfdt" Jan 22 14:29:21 crc kubenswrapper[4743]: I0122 14:29:21.843498 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77440e96-59f2-42e1-a73e-6a94bef2c67a-utilities\") pod \"redhat-operators-tcfdt\" (UID: \"77440e96-59f2-42e1-a73e-6a94bef2c67a\") " pod="openshift-marketplace/redhat-operators-tcfdt" Jan 22 14:29:21 crc kubenswrapper[4743]: I0122 14:29:21.843596 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s24dt\" (UniqueName: \"kubernetes.io/projected/77440e96-59f2-42e1-a73e-6a94bef2c67a-kube-api-access-s24dt\") pod \"redhat-operators-tcfdt\" (UID: \"77440e96-59f2-42e1-a73e-6a94bef2c67a\") " pod="openshift-marketplace/redhat-operators-tcfdt" Jan 22 14:29:21 crc kubenswrapper[4743]: I0122 14:29:21.843761 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77440e96-59f2-42e1-a73e-6a94bef2c67a-catalog-content\") pod \"redhat-operators-tcfdt\" (UID: \"77440e96-59f2-42e1-a73e-6a94bef2c67a\") " pod="openshift-marketplace/redhat-operators-tcfdt" Jan 22 14:29:21 crc kubenswrapper[4743]: I0122 14:29:21.844088 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77440e96-59f2-42e1-a73e-6a94bef2c67a-utilities\") pod \"redhat-operators-tcfdt\" (UID: \"77440e96-59f2-42e1-a73e-6a94bef2c67a\") " pod="openshift-marketplace/redhat-operators-tcfdt" Jan 22 14:29:21 crc kubenswrapper[4743]: I0122 14:29:21.844186 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77440e96-59f2-42e1-a73e-6a94bef2c67a-catalog-content\") pod \"redhat-operators-tcfdt\" (UID: 
\"77440e96-59f2-42e1-a73e-6a94bef2c67a\") " pod="openshift-marketplace/redhat-operators-tcfdt" Jan 22 14:29:21 crc kubenswrapper[4743]: I0122 14:29:21.873498 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s24dt\" (UniqueName: \"kubernetes.io/projected/77440e96-59f2-42e1-a73e-6a94bef2c67a-kube-api-access-s24dt\") pod \"redhat-operators-tcfdt\" (UID: \"77440e96-59f2-42e1-a73e-6a94bef2c67a\") " pod="openshift-marketplace/redhat-operators-tcfdt" Jan 22 14:29:22 crc kubenswrapper[4743]: I0122 14:29:22.033032 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tcfdt" Jan 22 14:29:22 crc kubenswrapper[4743]: I0122 14:29:22.519443 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tcfdt"] Jan 22 14:29:23 crc kubenswrapper[4743]: I0122 14:29:23.032175 4743 generic.go:334] "Generic (PLEG): container finished" podID="77440e96-59f2-42e1-a73e-6a94bef2c67a" containerID="c849b02997f8aad1cf5b4c113b35b6ecd9e118f25aeef8226a72eef7147e9f62" exitCode=0 Jan 22 14:29:23 crc kubenswrapper[4743]: I0122 14:29:23.032489 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tcfdt" event={"ID":"77440e96-59f2-42e1-a73e-6a94bef2c67a","Type":"ContainerDied","Data":"c849b02997f8aad1cf5b4c113b35b6ecd9e118f25aeef8226a72eef7147e9f62"} Jan 22 14:29:23 crc kubenswrapper[4743]: I0122 14:29:23.032523 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tcfdt" event={"ID":"77440e96-59f2-42e1-a73e-6a94bef2c67a","Type":"ContainerStarted","Data":"ef3441a7ea4fcf27e2c3baace78008a4e13d2bd212ca7e32bd684fa47207465a"} Jan 22 14:29:25 crc kubenswrapper[4743]: I0122 14:29:25.048242 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tcfdt" event={"ID":"77440e96-59f2-42e1-a73e-6a94bef2c67a","Type":"ContainerStarted","Data":"e3be79dbc220ce5492e565cbaa1c29d252370b06d8a39ae62e3c636c8b81e51d"} Jan 22 14:29:27 crc kubenswrapper[4743]: I0122 14:29:27.082246 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tcfdt" event={"ID":"77440e96-59f2-42e1-a73e-6a94bef2c67a","Type":"ContainerDied","Data":"e3be79dbc220ce5492e565cbaa1c29d252370b06d8a39ae62e3c636c8b81e51d"} Jan 22 14:29:27 crc kubenswrapper[4743]: I0122 14:29:27.082343 4743 generic.go:334] "Generic (PLEG): container finished" podID="77440e96-59f2-42e1-a73e-6a94bef2c67a" containerID="e3be79dbc220ce5492e565cbaa1c29d252370b06d8a39ae62e3c636c8b81e51d" exitCode=0 Jan 22 14:29:29 crc kubenswrapper[4743]: I0122 14:29:29.105580 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tcfdt" event={"ID":"77440e96-59f2-42e1-a73e-6a94bef2c67a","Type":"ContainerStarted","Data":"29b593f9e754b203e0208b16bd2bf263b7f76130bf0230130350582c5ea91a1e"} Jan 22 14:29:29 crc kubenswrapper[4743]: I0122 14:29:29.157960 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tcfdt" podStartSLOduration=2.850450852 podStartE2EDuration="8.157927284s" podCreationTimestamp="2026-01-22 14:29:21 +0000 UTC" firstStartedPulling="2026-01-22 14:29:23.033840403 +0000 UTC m=+2599.588883556" lastFinishedPulling="2026-01-22 14:29:28.341316835 +0000 UTC m=+2604.896359988" observedRunningTime="2026-01-22 14:29:29.134769211 +0000 UTC m=+2605.689812454" watchObservedRunningTime="2026-01-22 
14:29:29.157927284 +0000 UTC m=+2605.712970447" Jan 22 14:29:30 crc kubenswrapper[4743]: I0122 14:29:30.049676 4743 patch_prober.go:28] interesting pod/machine-config-daemon-hqgk7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 14:29:30 crc kubenswrapper[4743]: I0122 14:29:30.050297 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 14:29:32 crc kubenswrapper[4743]: I0122 14:29:32.034245 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tcfdt" Jan 22 14:29:32 crc kubenswrapper[4743]: I0122 14:29:32.034553 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tcfdt" Jan 22 14:29:33 crc kubenswrapper[4743]: I0122 14:29:33.083125 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tcfdt" podUID="77440e96-59f2-42e1-a73e-6a94bef2c67a" containerName="registry-server" probeResult="failure" output=< Jan 22 14:29:33 crc kubenswrapper[4743]: timeout: failed to connect service ":50051" within 1s Jan 22 14:29:33 crc kubenswrapper[4743]: > Jan 22 14:29:42 crc kubenswrapper[4743]: I0122 14:29:42.088713 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tcfdt" Jan 22 14:29:42 crc kubenswrapper[4743]: I0122 14:29:42.135629 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tcfdt" Jan 22 14:29:42 crc kubenswrapper[4743]: I0122 14:29:42.324387 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tcfdt"] Jan 22 14:29:43 crc kubenswrapper[4743]: I0122 14:29:43.282699 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tcfdt" podUID="77440e96-59f2-42e1-a73e-6a94bef2c67a" containerName="registry-server" containerID="cri-o://29b593f9e754b203e0208b16bd2bf263b7f76130bf0230130350582c5ea91a1e" gracePeriod=2 Jan 22 14:29:43 crc kubenswrapper[4743]: I0122 14:29:43.732404 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tcfdt" Jan 22 14:29:43 crc kubenswrapper[4743]: I0122 14:29:43.886004 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77440e96-59f2-42e1-a73e-6a94bef2c67a-catalog-content\") pod \"77440e96-59f2-42e1-a73e-6a94bef2c67a\" (UID: \"77440e96-59f2-42e1-a73e-6a94bef2c67a\") " Jan 22 14:29:43 crc kubenswrapper[4743]: I0122 14:29:43.886140 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77440e96-59f2-42e1-a73e-6a94bef2c67a-utilities\") pod \"77440e96-59f2-42e1-a73e-6a94bef2c67a\" (UID: \"77440e96-59f2-42e1-a73e-6a94bef2c67a\") " Jan 22 14:29:43 crc kubenswrapper[4743]: I0122 14:29:43.886231 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s24dt\" (UniqueName: \"kubernetes.io/projected/77440e96-59f2-42e1-a73e-6a94bef2c67a-kube-api-access-s24dt\") pod \"77440e96-59f2-42e1-a73e-6a94bef2c67a\" (UID: \"77440e96-59f2-42e1-a73e-6a94bef2c67a\") " Jan 22 14:29:43 crc kubenswrapper[4743]: I0122 14:29:43.887320 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77440e96-59f2-42e1-a73e-6a94bef2c67a-utilities" (OuterVolumeSpecName: "utilities") pod "77440e96-59f2-42e1-a73e-6a94bef2c67a" (UID: "77440e96-59f2-42e1-a73e-6a94bef2c67a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:29:43 crc kubenswrapper[4743]: I0122 14:29:43.896072 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77440e96-59f2-42e1-a73e-6a94bef2c67a-kube-api-access-s24dt" (OuterVolumeSpecName: "kube-api-access-s24dt") pod "77440e96-59f2-42e1-a73e-6a94bef2c67a" (UID: "77440e96-59f2-42e1-a73e-6a94bef2c67a"). InnerVolumeSpecName "kube-api-access-s24dt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:29:43 crc kubenswrapper[4743]: I0122 14:29:43.988656 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77440e96-59f2-42e1-a73e-6a94bef2c67a-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 14:29:43 crc kubenswrapper[4743]: I0122 14:29:43.988959 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s24dt\" (UniqueName: \"kubernetes.io/projected/77440e96-59f2-42e1-a73e-6a94bef2c67a-kube-api-access-s24dt\") on node \"crc\" DevicePath \"\"" Jan 22 14:29:44 crc kubenswrapper[4743]: I0122 14:29:44.000099 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77440e96-59f2-42e1-a73e-6a94bef2c67a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "77440e96-59f2-42e1-a73e-6a94bef2c67a" (UID: "77440e96-59f2-42e1-a73e-6a94bef2c67a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:29:44 crc kubenswrapper[4743]: I0122 14:29:44.091399 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77440e96-59f2-42e1-a73e-6a94bef2c67a-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 14:29:44 crc kubenswrapper[4743]: I0122 14:29:44.293604 4743 generic.go:334] "Generic (PLEG): container finished" podID="77440e96-59f2-42e1-a73e-6a94bef2c67a" containerID="29b593f9e754b203e0208b16bd2bf263b7f76130bf0230130350582c5ea91a1e" exitCode=0 Jan 22 14:29:44 crc kubenswrapper[4743]: I0122 14:29:44.293633 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tcfdt" Jan 22 14:29:44 crc kubenswrapper[4743]: I0122 14:29:44.293652 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tcfdt" event={"ID":"77440e96-59f2-42e1-a73e-6a94bef2c67a","Type":"ContainerDied","Data":"29b593f9e754b203e0208b16bd2bf263b7f76130bf0230130350582c5ea91a1e"} Jan 22 14:29:44 crc kubenswrapper[4743]: I0122 14:29:44.294168 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tcfdt" event={"ID":"77440e96-59f2-42e1-a73e-6a94bef2c67a","Type":"ContainerDied","Data":"ef3441a7ea4fcf27e2c3baace78008a4e13d2bd212ca7e32bd684fa47207465a"} Jan 22 14:29:44 crc kubenswrapper[4743]: I0122 14:29:44.294193 4743 scope.go:117] "RemoveContainer" containerID="29b593f9e754b203e0208b16bd2bf263b7f76130bf0230130350582c5ea91a1e" Jan 22 14:29:44 crc kubenswrapper[4743]: I0122 14:29:44.335926 4743 scope.go:117] "RemoveContainer" containerID="e3be79dbc220ce5492e565cbaa1c29d252370b06d8a39ae62e3c636c8b81e51d" Jan 22 14:29:44 crc kubenswrapper[4743]: I0122 14:29:44.337052 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tcfdt"] Jan 22 14:29:44 crc kubenswrapper[4743]: I0122 14:29:44.356478 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tcfdt"] Jan 22 14:29:44 crc kubenswrapper[4743]: I0122 14:29:44.356643 4743 scope.go:117] "RemoveContainer" containerID="c849b02997f8aad1cf5b4c113b35b6ecd9e118f25aeef8226a72eef7147e9f62" Jan 22 14:29:44 crc kubenswrapper[4743]: I0122 14:29:44.397937 4743 scope.go:117] "RemoveContainer" containerID="29b593f9e754b203e0208b16bd2bf263b7f76130bf0230130350582c5ea91a1e" Jan 22 14:29:44 crc kubenswrapper[4743]: E0122 14:29:44.398401 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29b593f9e754b203e0208b16bd2bf263b7f76130bf0230130350582c5ea91a1e\": container with ID starting with 29b593f9e754b203e0208b16bd2bf263b7f76130bf0230130350582c5ea91a1e not found: ID does not exist" containerID="29b593f9e754b203e0208b16bd2bf263b7f76130bf0230130350582c5ea91a1e" Jan 22 14:29:44 crc kubenswrapper[4743]: I0122 14:29:44.398444 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29b593f9e754b203e0208b16bd2bf263b7f76130bf0230130350582c5ea91a1e"} err="failed to get container status \"29b593f9e754b203e0208b16bd2bf263b7f76130bf0230130350582c5ea91a1e\": rpc error: code = NotFound desc = could not find container \"29b593f9e754b203e0208b16bd2bf263b7f76130bf0230130350582c5ea91a1e\": container with ID starting with 29b593f9e754b203e0208b16bd2bf263b7f76130bf0230130350582c5ea91a1e not found: ID does not exist" Jan 22 14:29:44 crc 
kubenswrapper[4743]: I0122 14:29:44.398470 4743 scope.go:117] "RemoveContainer" containerID="e3be79dbc220ce5492e565cbaa1c29d252370b06d8a39ae62e3c636c8b81e51d" Jan 22 14:29:44 crc kubenswrapper[4743]: E0122 14:29:44.398853 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3be79dbc220ce5492e565cbaa1c29d252370b06d8a39ae62e3c636c8b81e51d\": container with ID starting with e3be79dbc220ce5492e565cbaa1c29d252370b06d8a39ae62e3c636c8b81e51d not found: ID does not exist" containerID="e3be79dbc220ce5492e565cbaa1c29d252370b06d8a39ae62e3c636c8b81e51d" Jan 22 14:29:44 crc kubenswrapper[4743]: I0122 14:29:44.398899 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3be79dbc220ce5492e565cbaa1c29d252370b06d8a39ae62e3c636c8b81e51d"} err="failed to get container status \"e3be79dbc220ce5492e565cbaa1c29d252370b06d8a39ae62e3c636c8b81e51d\": rpc error: code = NotFound desc = could not find container \"e3be79dbc220ce5492e565cbaa1c29d252370b06d8a39ae62e3c636c8b81e51d\": container with ID starting with e3be79dbc220ce5492e565cbaa1c29d252370b06d8a39ae62e3c636c8b81e51d not found: ID does not exist" Jan 22 14:29:44 crc kubenswrapper[4743]: I0122 14:29:44.398930 4743 scope.go:117] "RemoveContainer" containerID="c849b02997f8aad1cf5b4c113b35b6ecd9e118f25aeef8226a72eef7147e9f62" Jan 22 14:29:44 crc kubenswrapper[4743]: E0122 14:29:44.399272 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c849b02997f8aad1cf5b4c113b35b6ecd9e118f25aeef8226a72eef7147e9f62\": container with ID starting with c849b02997f8aad1cf5b4c113b35b6ecd9e118f25aeef8226a72eef7147e9f62 not found: ID does not exist" containerID="c849b02997f8aad1cf5b4c113b35b6ecd9e118f25aeef8226a72eef7147e9f62" Jan 22 14:29:44 crc kubenswrapper[4743]: I0122 14:29:44.399309 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c849b02997f8aad1cf5b4c113b35b6ecd9e118f25aeef8226a72eef7147e9f62"} err="failed to get container status \"c849b02997f8aad1cf5b4c113b35b6ecd9e118f25aeef8226a72eef7147e9f62\": rpc error: code = NotFound desc = could not find container \"c849b02997f8aad1cf5b4c113b35b6ecd9e118f25aeef8226a72eef7147e9f62\": container with ID starting with c849b02997f8aad1cf5b4c113b35b6ecd9e118f25aeef8226a72eef7147e9f62 not found: ID does not exist" Jan 22 14:29:45 crc kubenswrapper[4743]: I0122 14:29:45.762186 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77440e96-59f2-42e1-a73e-6a94bef2c67a" path="/var/lib/kubelet/pods/77440e96-59f2-42e1-a73e-6a94bef2c67a/volumes" Jan 22 14:30:00 crc kubenswrapper[4743]: I0122 14:30:00.049454 4743 patch_prober.go:28] interesting pod/machine-config-daemon-hqgk7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 14:30:00 crc kubenswrapper[4743]: I0122 14:30:00.049975 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 14:30:00 crc kubenswrapper[4743]: I0122 14:30:00.156639 4743 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29484870-bbxqt"] Jan 22 14:30:00 crc kubenswrapper[4743]: E0122 14:30:00.157114 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77440e96-59f2-42e1-a73e-6a94bef2c67a" containerName="extract-content" Jan 22 14:30:00 crc kubenswrapper[4743]: I0122 14:30:00.157136 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="77440e96-59f2-42e1-a73e-6a94bef2c67a" containerName="extract-content" Jan 22 14:30:00 crc kubenswrapper[4743]: E0122 14:30:00.157158 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77440e96-59f2-42e1-a73e-6a94bef2c67a" containerName="extract-utilities" Jan 22 14:30:00 crc kubenswrapper[4743]: I0122 14:30:00.157166 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="77440e96-59f2-42e1-a73e-6a94bef2c67a" containerName="extract-utilities" Jan 22 14:30:00 crc kubenswrapper[4743]: E0122 14:30:00.157219 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77440e96-59f2-42e1-a73e-6a94bef2c67a" containerName="registry-server" Jan 22 14:30:00 crc kubenswrapper[4743]: I0122 14:30:00.157230 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="77440e96-59f2-42e1-a73e-6a94bef2c67a" containerName="registry-server" Jan 22 14:30:00 crc kubenswrapper[4743]: I0122 14:30:00.157471 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="77440e96-59f2-42e1-a73e-6a94bef2c67a" containerName="registry-server" Jan 22 14:30:00 crc kubenswrapper[4743]: I0122 14:30:00.158293 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29484870-bbxqt" Jan 22 14:30:00 crc kubenswrapper[4743]: I0122 14:30:00.160613 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 22 14:30:00 crc kubenswrapper[4743]: I0122 14:30:00.161074 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 22 14:30:00 crc kubenswrapper[4743]: I0122 14:30:00.166517 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29484870-bbxqt"] Jan 22 14:30:00 crc kubenswrapper[4743]: I0122 14:30:00.205738 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2a07e37d-94bd-4e2d-9fe9-1c35f28a3f73-config-volume\") pod \"collect-profiles-29484870-bbxqt\" (UID: \"2a07e37d-94bd-4e2d-9fe9-1c35f28a3f73\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484870-bbxqt" Jan 22 14:30:00 crc kubenswrapper[4743]: I0122 14:30:00.207926 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpvss\" (UniqueName: \"kubernetes.io/projected/2a07e37d-94bd-4e2d-9fe9-1c35f28a3f73-kube-api-access-gpvss\") pod \"collect-profiles-29484870-bbxqt\" (UID: \"2a07e37d-94bd-4e2d-9fe9-1c35f28a3f73\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484870-bbxqt" Jan 22 14:30:00 crc kubenswrapper[4743]: I0122 14:30:00.208062 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2a07e37d-94bd-4e2d-9fe9-1c35f28a3f73-secret-volume\") pod \"collect-profiles-29484870-bbxqt\" (UID: \"2a07e37d-94bd-4e2d-9fe9-1c35f28a3f73\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29484870-bbxqt" Jan 22 14:30:00 crc kubenswrapper[4743]: I0122 14:30:00.309729 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2a07e37d-94bd-4e2d-9fe9-1c35f28a3f73-secret-volume\") pod \"collect-profiles-29484870-bbxqt\" (UID: \"2a07e37d-94bd-4e2d-9fe9-1c35f28a3f73\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484870-bbxqt" Jan 22 14:30:00 crc kubenswrapper[4743]: I0122 14:30:00.309925 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2a07e37d-94bd-4e2d-9fe9-1c35f28a3f73-config-volume\") pod \"collect-profiles-29484870-bbxqt\" (UID: \"2a07e37d-94bd-4e2d-9fe9-1c35f28a3f73\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484870-bbxqt" Jan 22 14:30:00 crc kubenswrapper[4743]: I0122 14:30:00.310018 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpvss\" (UniqueName: \"kubernetes.io/projected/2a07e37d-94bd-4e2d-9fe9-1c35f28a3f73-kube-api-access-gpvss\") pod \"collect-profiles-29484870-bbxqt\" (UID: \"2a07e37d-94bd-4e2d-9fe9-1c35f28a3f73\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484870-bbxqt" Jan 22 14:30:00 crc kubenswrapper[4743]: I0122 14:30:00.310769 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2a07e37d-94bd-4e2d-9fe9-1c35f28a3f73-config-volume\") pod \"collect-profiles-29484870-bbxqt\" (UID: \"2a07e37d-94bd-4e2d-9fe9-1c35f28a3f73\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484870-bbxqt" Jan 22 14:30:00 crc kubenswrapper[4743]: I0122 14:30:00.319932 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2a07e37d-94bd-4e2d-9fe9-1c35f28a3f73-secret-volume\") pod \"collect-profiles-29484870-bbxqt\" (UID: \"2a07e37d-94bd-4e2d-9fe9-1c35f28a3f73\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484870-bbxqt" Jan 22 14:30:00 crc kubenswrapper[4743]: I0122 14:30:00.326548 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpvss\" (UniqueName: \"kubernetes.io/projected/2a07e37d-94bd-4e2d-9fe9-1c35f28a3f73-kube-api-access-gpvss\") pod \"collect-profiles-29484870-bbxqt\" (UID: \"2a07e37d-94bd-4e2d-9fe9-1c35f28a3f73\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484870-bbxqt" Jan 22 14:30:00 crc kubenswrapper[4743]: I0122 14:30:00.513920 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29484870-bbxqt" Jan 22 14:30:00 crc kubenswrapper[4743]: I0122 14:30:00.956918 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29484870-bbxqt"] Jan 22 14:30:01 crc kubenswrapper[4743]: I0122 14:30:01.511584 4743 generic.go:334] "Generic (PLEG): container finished" podID="2a07e37d-94bd-4e2d-9fe9-1c35f28a3f73" containerID="d339a2a8f516818e1f491bf7334042a589beae67161ffeb6d6ad23c13f6b5e25" exitCode=0 Jan 22 14:30:01 crc kubenswrapper[4743]: I0122 14:30:01.511734 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29484870-bbxqt" event={"ID":"2a07e37d-94bd-4e2d-9fe9-1c35f28a3f73","Type":"ContainerDied","Data":"d339a2a8f516818e1f491bf7334042a589beae67161ffeb6d6ad23c13f6b5e25"} Jan 22 14:30:01 crc kubenswrapper[4743]: I0122 14:30:01.512581 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29484870-bbxqt" event={"ID":"2a07e37d-94bd-4e2d-9fe9-1c35f28a3f73","Type":"ContainerStarted","Data":"46240321c68289043ebad1d7e732edc01555a5d090f4c8fb4c30e95e23688240"} Jan 22 14:30:02 crc kubenswrapper[4743]: I0122 14:30:02.888922 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29484870-bbxqt" Jan 22 14:30:02 crc kubenswrapper[4743]: I0122 14:30:02.968702 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2a07e37d-94bd-4e2d-9fe9-1c35f28a3f73-config-volume\") pod \"2a07e37d-94bd-4e2d-9fe9-1c35f28a3f73\" (UID: \"2a07e37d-94bd-4e2d-9fe9-1c35f28a3f73\") " Jan 22 14:30:02 crc kubenswrapper[4743]: I0122 14:30:02.968830 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2a07e37d-94bd-4e2d-9fe9-1c35f28a3f73-secret-volume\") pod \"2a07e37d-94bd-4e2d-9fe9-1c35f28a3f73\" (UID: \"2a07e37d-94bd-4e2d-9fe9-1c35f28a3f73\") " Jan 22 14:30:02 crc kubenswrapper[4743]: I0122 14:30:02.968911 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gpvss\" (UniqueName: \"kubernetes.io/projected/2a07e37d-94bd-4e2d-9fe9-1c35f28a3f73-kube-api-access-gpvss\") pod \"2a07e37d-94bd-4e2d-9fe9-1c35f28a3f73\" (UID: \"2a07e37d-94bd-4e2d-9fe9-1c35f28a3f73\") " Jan 22 14:30:02 crc kubenswrapper[4743]: I0122 14:30:02.969485 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a07e37d-94bd-4e2d-9fe9-1c35f28a3f73-config-volume" (OuterVolumeSpecName: "config-volume") pod "2a07e37d-94bd-4e2d-9fe9-1c35f28a3f73" (UID: "2a07e37d-94bd-4e2d-9fe9-1c35f28a3f73"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:30:02 crc kubenswrapper[4743]: I0122 14:30:02.974537 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a07e37d-94bd-4e2d-9fe9-1c35f28a3f73-kube-api-access-gpvss" (OuterVolumeSpecName: "kube-api-access-gpvss") pod "2a07e37d-94bd-4e2d-9fe9-1c35f28a3f73" (UID: "2a07e37d-94bd-4e2d-9fe9-1c35f28a3f73"). InnerVolumeSpecName "kube-api-access-gpvss". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:30:02 crc kubenswrapper[4743]: I0122 14:30:02.975355 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a07e37d-94bd-4e2d-9fe9-1c35f28a3f73-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2a07e37d-94bd-4e2d-9fe9-1c35f28a3f73" (UID: "2a07e37d-94bd-4e2d-9fe9-1c35f28a3f73"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:30:03 crc kubenswrapper[4743]: I0122 14:30:03.071089 4743 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2a07e37d-94bd-4e2d-9fe9-1c35f28a3f73-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 22 14:30:03 crc kubenswrapper[4743]: I0122 14:30:03.071125 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gpvss\" (UniqueName: \"kubernetes.io/projected/2a07e37d-94bd-4e2d-9fe9-1c35f28a3f73-kube-api-access-gpvss\") on node \"crc\" DevicePath \"\"" Jan 22 14:30:03 crc kubenswrapper[4743]: I0122 14:30:03.071135 4743 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2a07e37d-94bd-4e2d-9fe9-1c35f28a3f73-config-volume\") on node \"crc\" DevicePath \"\"" Jan 22 14:30:03 crc kubenswrapper[4743]: I0122 14:30:03.543949 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29484870-bbxqt" event={"ID":"2a07e37d-94bd-4e2d-9fe9-1c35f28a3f73","Type":"ContainerDied","Data":"46240321c68289043ebad1d7e732edc01555a5d090f4c8fb4c30e95e23688240"} Jan 22 14:30:03 crc kubenswrapper[4743]: I0122 14:30:03.544281 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46240321c68289043ebad1d7e732edc01555a5d090f4c8fb4c30e95e23688240" Jan 22 14:30:03 crc kubenswrapper[4743]: I0122 14:30:03.544367 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29484870-bbxqt" Jan 22 14:30:03 crc kubenswrapper[4743]: I0122 14:30:03.963666 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29484825-rkvnc"] Jan 22 14:30:03 crc kubenswrapper[4743]: I0122 14:30:03.974222 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29484825-rkvnc"] Jan 22 14:30:05 crc kubenswrapper[4743]: I0122 14:30:05.757464 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d67e22a2-dc2b-4582-bfe0-7afff25995fb" path="/var/lib/kubelet/pods/d67e22a2-dc2b-4582-bfe0-7afff25995fb/volumes" Jan 22 14:30:30 crc kubenswrapper[4743]: I0122 14:30:30.048814 4743 patch_prober.go:28] interesting pod/machine-config-daemon-hqgk7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 14:30:30 crc kubenswrapper[4743]: I0122 14:30:30.049418 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 14:30:30 crc kubenswrapper[4743]: I0122 14:30:30.049474 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" Jan 22 14:30:30 crc kubenswrapper[4743]: I0122 14:30:30.050330 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"61bf6edec840dc0b1f7bf3e3135ac897251d422ba0aaf1a740277a160551f688"} pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 22 14:30:30 crc kubenswrapper[4743]: I0122 14:30:30.050398 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerName="machine-config-daemon" containerID="cri-o://61bf6edec840dc0b1f7bf3e3135ac897251d422ba0aaf1a740277a160551f688" gracePeriod=600 Jan 22 14:30:30 crc kubenswrapper[4743]: I0122 14:30:30.765307 4743 generic.go:334] "Generic (PLEG): container finished" podID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerID="61bf6edec840dc0b1f7bf3e3135ac897251d422ba0aaf1a740277a160551f688" exitCode=0 Jan 22 14:30:30 crc kubenswrapper[4743]: I0122 14:30:30.765389 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" event={"ID":"3aeba9ba-3a5a-4885-8540-d295aadb311b","Type":"ContainerDied","Data":"61bf6edec840dc0b1f7bf3e3135ac897251d422ba0aaf1a740277a160551f688"} Jan 22 14:30:30 crc kubenswrapper[4743]: I0122 14:30:30.766117 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" event={"ID":"3aeba9ba-3a5a-4885-8540-d295aadb311b","Type":"ContainerStarted","Data":"08d50f3e8a1fa4764eac2002ae36a086093cd35040b1458273d108d4bfc0e111"} Jan 22 14:30:30 crc kubenswrapper[4743]: I0122 14:30:30.766140 4743 scope.go:117] "RemoveContainer" 
containerID="4e338cfb7048ed5b16851b239153eb46a9d70fc49b34c9091316d3e3baa03fe0" Jan 22 14:30:35 crc kubenswrapper[4743]: I0122 14:30:35.006092 4743 scope.go:117] "RemoveContainer" containerID="a90b7e20a7ddf78fed89d0684b6fbdb934149fca9e3192dcb7886a107c6eeed1" Jan 22 14:31:11 crc kubenswrapper[4743]: I0122 14:31:11.163776 4743 generic.go:334] "Generic (PLEG): container finished" podID="65113c72-73df-4a17-b923-60f9da824feb" containerID="adb380472614e282521eaee24c9b41f1ee7dee6faefac9e57f67c26efbde9fd2" exitCode=0 Jan 22 14:31:11 crc kubenswrapper[4743]: I0122 14:31:11.163870 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5v5f6" event={"ID":"65113c72-73df-4a17-b923-60f9da824feb","Type":"ContainerDied","Data":"adb380472614e282521eaee24c9b41f1ee7dee6faefac9e57f67c26efbde9fd2"} Jan 22 14:31:12 crc kubenswrapper[4743]: I0122 14:31:12.596866 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5v5f6" Jan 22 14:31:12 crc kubenswrapper[4743]: I0122 14:31:12.715742 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/65113c72-73df-4a17-b923-60f9da824feb-ssh-key-openstack-edpm-ipam\") pod \"65113c72-73df-4a17-b923-60f9da824feb\" (UID: \"65113c72-73df-4a17-b923-60f9da824feb\") " Jan 22 14:31:12 crc kubenswrapper[4743]: I0122 14:31:12.715846 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/65113c72-73df-4a17-b923-60f9da824feb-ceilometer-compute-config-data-1\") pod \"65113c72-73df-4a17-b923-60f9da824feb\" (UID: \"65113c72-73df-4a17-b923-60f9da824feb\") " Jan 22 14:31:12 crc kubenswrapper[4743]: I0122 14:31:12.715925 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65113c72-73df-4a17-b923-60f9da824feb-telemetry-combined-ca-bundle\") pod \"65113c72-73df-4a17-b923-60f9da824feb\" (UID: \"65113c72-73df-4a17-b923-60f9da824feb\") " Jan 22 14:31:12 crc kubenswrapper[4743]: I0122 14:31:12.715980 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/65113c72-73df-4a17-b923-60f9da824feb-ceilometer-compute-config-data-0\") pod \"65113c72-73df-4a17-b923-60f9da824feb\" (UID: \"65113c72-73df-4a17-b923-60f9da824feb\") " Jan 22 14:31:12 crc kubenswrapper[4743]: I0122 14:31:12.716041 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6jn2\" (UniqueName: \"kubernetes.io/projected/65113c72-73df-4a17-b923-60f9da824feb-kube-api-access-s6jn2\") pod \"65113c72-73df-4a17-b923-60f9da824feb\" (UID: \"65113c72-73df-4a17-b923-60f9da824feb\") " Jan 22 14:31:12 crc kubenswrapper[4743]: I0122 14:31:12.716123 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/65113c72-73df-4a17-b923-60f9da824feb-ceilometer-compute-config-data-2\") pod \"65113c72-73df-4a17-b923-60f9da824feb\" (UID: \"65113c72-73df-4a17-b923-60f9da824feb\") " Jan 22 14:31:12 crc kubenswrapper[4743]: I0122 14:31:12.716200 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/65113c72-73df-4a17-b923-60f9da824feb-inventory\") pod \"65113c72-73df-4a17-b923-60f9da824feb\" (UID: \"65113c72-73df-4a17-b923-60f9da824feb\") " Jan 22 14:31:12 crc kubenswrapper[4743]: I0122 14:31:12.723365 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65113c72-73df-4a17-b923-60f9da824feb-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "65113c72-73df-4a17-b923-60f9da824feb" (UID: "65113c72-73df-4a17-b923-60f9da824feb"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:31:12 crc kubenswrapper[4743]: I0122 14:31:12.723979 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65113c72-73df-4a17-b923-60f9da824feb-kube-api-access-s6jn2" (OuterVolumeSpecName: "kube-api-access-s6jn2") pod "65113c72-73df-4a17-b923-60f9da824feb" (UID: "65113c72-73df-4a17-b923-60f9da824feb"). InnerVolumeSpecName "kube-api-access-s6jn2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:31:12 crc kubenswrapper[4743]: I0122 14:31:12.751840 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65113c72-73df-4a17-b923-60f9da824feb-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "65113c72-73df-4a17-b923-60f9da824feb" (UID: "65113c72-73df-4a17-b923-60f9da824feb"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:31:12 crc kubenswrapper[4743]: I0122 14:31:12.752612 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65113c72-73df-4a17-b923-60f9da824feb-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "65113c72-73df-4a17-b923-60f9da824feb" (UID: "65113c72-73df-4a17-b923-60f9da824feb"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:31:12 crc kubenswrapper[4743]: I0122 14:31:12.757616 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65113c72-73df-4a17-b923-60f9da824feb-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "65113c72-73df-4a17-b923-60f9da824feb" (UID: "65113c72-73df-4a17-b923-60f9da824feb"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:31:12 crc kubenswrapper[4743]: I0122 14:31:12.760776 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65113c72-73df-4a17-b923-60f9da824feb-inventory" (OuterVolumeSpecName: "inventory") pod "65113c72-73df-4a17-b923-60f9da824feb" (UID: "65113c72-73df-4a17-b923-60f9da824feb"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:31:12 crc kubenswrapper[4743]: I0122 14:31:12.761429 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65113c72-73df-4a17-b923-60f9da824feb-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "65113c72-73df-4a17-b923-60f9da824feb" (UID: "65113c72-73df-4a17-b923-60f9da824feb"). InnerVolumeSpecName "ceilometer-compute-config-data-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:31:12 crc kubenswrapper[4743]: I0122 14:31:12.818862 4743 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/65113c72-73df-4a17-b923-60f9da824feb-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Jan 22 14:31:12 crc kubenswrapper[4743]: I0122 14:31:12.818901 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/65113c72-73df-4a17-b923-60f9da824feb-inventory\") on node \"crc\" DevicePath \"\"" Jan 22 14:31:12 crc kubenswrapper[4743]: I0122 14:31:12.818912 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/65113c72-73df-4a17-b923-60f9da824feb-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 22 14:31:12 crc kubenswrapper[4743]: I0122 14:31:12.818921 4743 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/65113c72-73df-4a17-b923-60f9da824feb-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Jan 22 14:31:12 crc kubenswrapper[4743]: I0122 14:31:12.818929 4743 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65113c72-73df-4a17-b923-60f9da824feb-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 14:31:12 crc kubenswrapper[4743]: I0122 14:31:12.818940 4743 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/65113c72-73df-4a17-b923-60f9da824feb-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Jan 22 14:31:12 crc kubenswrapper[4743]: I0122 14:31:12.818950 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s6jn2\" (UniqueName: \"kubernetes.io/projected/65113c72-73df-4a17-b923-60f9da824feb-kube-api-access-s6jn2\") on node \"crc\" DevicePath \"\"" Jan 22 14:31:13 crc kubenswrapper[4743]: I0122 14:31:13.184523 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5v5f6" event={"ID":"65113c72-73df-4a17-b923-60f9da824feb","Type":"ContainerDied","Data":"bc2e1fe42add0def2ec7b24caeb12fde41e9c7f3a6856b28f64df6e9bb8de379"} Jan 22 14:31:13 crc kubenswrapper[4743]: I0122 14:31:13.184865 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc2e1fe42add0def2ec7b24caeb12fde41e9c7f3a6856b28f64df6e9bb8de379" Jan 22 14:31:13 crc kubenswrapper[4743]: I0122 14:31:13.184551 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5v5f6" Jan 22 14:32:15 crc kubenswrapper[4743]: I0122 14:32:15.570393 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Jan 22 14:32:15 crc kubenswrapper[4743]: E0122 14:32:15.571548 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a07e37d-94bd-4e2d-9fe9-1c35f28a3f73" containerName="collect-profiles" Jan 22 14:32:15 crc kubenswrapper[4743]: I0122 14:32:15.571563 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a07e37d-94bd-4e2d-9fe9-1c35f28a3f73" containerName="collect-profiles" Jan 22 14:32:15 crc kubenswrapper[4743]: E0122 14:32:15.571608 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65113c72-73df-4a17-b923-60f9da824feb" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Jan 22 14:32:15 crc kubenswrapper[4743]: I0122 14:32:15.571616 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="65113c72-73df-4a17-b923-60f9da824feb" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Jan 22 14:32:15 crc kubenswrapper[4743]: I0122 14:32:15.571803 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="65113c72-73df-4a17-b923-60f9da824feb" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Jan 22 14:32:15 crc kubenswrapper[4743]: I0122 14:32:15.571823 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a07e37d-94bd-4e2d-9fe9-1c35f28a3f73" containerName="collect-profiles" Jan 22 14:32:15 crc kubenswrapper[4743]: I0122 14:32:15.572598 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Jan 22 14:32:15 crc kubenswrapper[4743]: I0122 14:32:15.576191 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Jan 22 14:32:15 crc kubenswrapper[4743]: I0122 14:32:15.576338 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Jan 22 14:32:15 crc kubenswrapper[4743]: I0122 14:32:15.576378 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Jan 22 14:32:15 crc kubenswrapper[4743]: I0122 14:32:15.576346 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-8whlt" Jan 22 14:32:15 crc kubenswrapper[4743]: I0122 14:32:15.581285 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Jan 22 14:32:15 crc kubenswrapper[4743]: I0122 14:32:15.688527 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/dca0d9c1-5628-4b93-9696-f9d455c70f31-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"dca0d9c1-5628-4b93-9696-f9d455c70f31\") " pod="openstack/tempest-tests-tempest" Jan 22 14:32:15 crc kubenswrapper[4743]: I0122 14:32:15.688585 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/dca0d9c1-5628-4b93-9696-f9d455c70f31-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"dca0d9c1-5628-4b93-9696-f9d455c70f31\") " pod="openstack/tempest-tests-tempest" Jan 22 14:32:15 crc kubenswrapper[4743]: I0122 14:32:15.688614 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/dca0d9c1-5628-4b93-9696-f9d455c70f31-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"dca0d9c1-5628-4b93-9696-f9d455c70f31\") " pod="openstack/tempest-tests-tempest" Jan 22 14:32:15 crc kubenswrapper[4743]: I0122 14:32:15.688732 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lb4df\" (UniqueName: \"kubernetes.io/projected/dca0d9c1-5628-4b93-9696-f9d455c70f31-kube-api-access-lb4df\") pod \"tempest-tests-tempest\" (UID: \"dca0d9c1-5628-4b93-9696-f9d455c70f31\") " pod="openstack/tempest-tests-tempest" Jan 22 14:32:15 crc kubenswrapper[4743]: I0122 14:32:15.688859 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/dca0d9c1-5628-4b93-9696-f9d455c70f31-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"dca0d9c1-5628-4b93-9696-f9d455c70f31\") " pod="openstack/tempest-tests-tempest" Jan 22 14:32:15 crc kubenswrapper[4743]: I0122 14:32:15.688911 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"dca0d9c1-5628-4b93-9696-f9d455c70f31\") " pod="openstack/tempest-tests-tempest" Jan 22 14:32:15 crc kubenswrapper[4743]: I0122 14:32:15.688941 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/dca0d9c1-5628-4b93-9696-f9d455c70f31-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"dca0d9c1-5628-4b93-9696-f9d455c70f31\") " pod="openstack/tempest-tests-tempest" Jan 22 14:32:15 crc kubenswrapper[4743]: I0122 14:32:15.688985 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dca0d9c1-5628-4b93-9696-f9d455c70f31-config-data\") pod \"tempest-tests-tempest\" (UID: \"dca0d9c1-5628-4b93-9696-f9d455c70f31\") " pod="openstack/tempest-tests-tempest" Jan 22 14:32:15 crc kubenswrapper[4743]: I0122 14:32:15.689037 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dca0d9c1-5628-4b93-9696-f9d455c70f31-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"dca0d9c1-5628-4b93-9696-f9d455c70f31\") " pod="openstack/tempest-tests-tempest" Jan 22 14:32:15 crc kubenswrapper[4743]: I0122 14:32:15.792474 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/dca0d9c1-5628-4b93-9696-f9d455c70f31-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"dca0d9c1-5628-4b93-9696-f9d455c70f31\") " pod="openstack/tempest-tests-tempest" Jan 22 14:32:15 crc kubenswrapper[4743]: I0122 14:32:15.792576 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/dca0d9c1-5628-4b93-9696-f9d455c70f31-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"dca0d9c1-5628-4b93-9696-f9d455c70f31\") " pod="openstack/tempest-tests-tempest" Jan 22 14:32:15 crc kubenswrapper[4743]: I0122 14:32:15.792622 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ca-certs\" (UniqueName: \"kubernetes.io/secret/dca0d9c1-5628-4b93-9696-f9d455c70f31-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"dca0d9c1-5628-4b93-9696-f9d455c70f31\") " pod="openstack/tempest-tests-tempest" Jan 22 14:32:15 crc kubenswrapper[4743]: I0122 14:32:15.792680 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lb4df\" (UniqueName: \"kubernetes.io/projected/dca0d9c1-5628-4b93-9696-f9d455c70f31-kube-api-access-lb4df\") pod \"tempest-tests-tempest\" (UID: \"dca0d9c1-5628-4b93-9696-f9d455c70f31\") " pod="openstack/tempest-tests-tempest" Jan 22 14:32:15 crc kubenswrapper[4743]: I0122 14:32:15.792743 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/dca0d9c1-5628-4b93-9696-f9d455c70f31-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"dca0d9c1-5628-4b93-9696-f9d455c70f31\") " pod="openstack/tempest-tests-tempest" Jan 22 14:32:15 crc kubenswrapper[4743]: I0122 14:32:15.792827 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"dca0d9c1-5628-4b93-9696-f9d455c70f31\") " pod="openstack/tempest-tests-tempest" Jan 22 14:32:15 crc kubenswrapper[4743]: I0122 14:32:15.792854 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/dca0d9c1-5628-4b93-9696-f9d455c70f31-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"dca0d9c1-5628-4b93-9696-f9d455c70f31\") " pod="openstack/tempest-tests-tempest" Jan 22 14:32:15 crc kubenswrapper[4743]: I0122 14:32:15.792977 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dca0d9c1-5628-4b93-9696-f9d455c70f31-config-data\") pod \"tempest-tests-tempest\" (UID: \"dca0d9c1-5628-4b93-9696-f9d455c70f31\") " pod="openstack/tempest-tests-tempest" Jan 22 14:32:15 crc kubenswrapper[4743]: I0122 14:32:15.793047 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dca0d9c1-5628-4b93-9696-f9d455c70f31-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"dca0d9c1-5628-4b93-9696-f9d455c70f31\") " pod="openstack/tempest-tests-tempest" Jan 22 14:32:15 crc kubenswrapper[4743]: I0122 14:32:15.793620 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"dca0d9c1-5628-4b93-9696-f9d455c70f31\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/tempest-tests-tempest" Jan 22 14:32:15 crc kubenswrapper[4743]: I0122 14:32:15.793673 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/dca0d9c1-5628-4b93-9696-f9d455c70f31-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"dca0d9c1-5628-4b93-9696-f9d455c70f31\") " pod="openstack/tempest-tests-tempest" Jan 22 14:32:15 crc kubenswrapper[4743]: I0122 14:32:15.794008 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: 
\"kubernetes.io/empty-dir/dca0d9c1-5628-4b93-9696-f9d455c70f31-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"dca0d9c1-5628-4b93-9696-f9d455c70f31\") " pod="openstack/tempest-tests-tempest" Jan 22 14:32:15 crc kubenswrapper[4743]: I0122 14:32:15.794412 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/dca0d9c1-5628-4b93-9696-f9d455c70f31-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"dca0d9c1-5628-4b93-9696-f9d455c70f31\") " pod="openstack/tempest-tests-tempest" Jan 22 14:32:15 crc kubenswrapper[4743]: I0122 14:32:15.794635 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dca0d9c1-5628-4b93-9696-f9d455c70f31-config-data\") pod \"tempest-tests-tempest\" (UID: \"dca0d9c1-5628-4b93-9696-f9d455c70f31\") " pod="openstack/tempest-tests-tempest" Jan 22 14:32:15 crc kubenswrapper[4743]: I0122 14:32:15.801685 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/dca0d9c1-5628-4b93-9696-f9d455c70f31-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"dca0d9c1-5628-4b93-9696-f9d455c70f31\") " pod="openstack/tempest-tests-tempest" Jan 22 14:32:15 crc kubenswrapper[4743]: I0122 14:32:15.802047 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/dca0d9c1-5628-4b93-9696-f9d455c70f31-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"dca0d9c1-5628-4b93-9696-f9d455c70f31\") " pod="openstack/tempest-tests-tempest" Jan 22 14:32:15 crc kubenswrapper[4743]: I0122 14:32:15.802685 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dca0d9c1-5628-4b93-9696-f9d455c70f31-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"dca0d9c1-5628-4b93-9696-f9d455c70f31\") " pod="openstack/tempest-tests-tempest" Jan 22 14:32:15 crc kubenswrapper[4743]: I0122 14:32:15.819657 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lb4df\" (UniqueName: \"kubernetes.io/projected/dca0d9c1-5628-4b93-9696-f9d455c70f31-kube-api-access-lb4df\") pod \"tempest-tests-tempest\" (UID: \"dca0d9c1-5628-4b93-9696-f9d455c70f31\") " pod="openstack/tempest-tests-tempest" Jan 22 14:32:15 crc kubenswrapper[4743]: I0122 14:32:15.836986 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"dca0d9c1-5628-4b93-9696-f9d455c70f31\") " pod="openstack/tempest-tests-tempest" Jan 22 14:32:15 crc kubenswrapper[4743]: I0122 14:32:15.905699 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Jan 22 14:32:16 crc kubenswrapper[4743]: I0122 14:32:16.370979 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Jan 22 14:32:16 crc kubenswrapper[4743]: W0122 14:32:16.375527 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddca0d9c1_5628_4b93_9696_f9d455c70f31.slice/crio-5e13e72ced949c2b8ac232dd4fa71a73aa63dba1ceb268e890a7eea4c9e7b254 WatchSource:0}: Error finding container 5e13e72ced949c2b8ac232dd4fa71a73aa63dba1ceb268e890a7eea4c9e7b254: Status 404 returned error can't find the container with id 5e13e72ced949c2b8ac232dd4fa71a73aa63dba1ceb268e890a7eea4c9e7b254 Jan 22 14:32:16 crc kubenswrapper[4743]: I0122 14:32:16.378984 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 22 14:32:16 crc kubenswrapper[4743]: I0122 14:32:16.836080 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"dca0d9c1-5628-4b93-9696-f9d455c70f31","Type":"ContainerStarted","Data":"5e13e72ced949c2b8ac232dd4fa71a73aa63dba1ceb268e890a7eea4c9e7b254"} Jan 22 14:32:30 crc kubenswrapper[4743]: I0122 14:32:30.049327 4743 patch_prober.go:28] interesting pod/machine-config-daemon-hqgk7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 14:32:30 crc kubenswrapper[4743]: I0122 14:32:30.049905 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 14:32:39 crc kubenswrapper[4743]: I0122 14:32:39.109975 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xp7gh"] Jan 22 14:32:39 crc kubenswrapper[4743]: I0122 14:32:39.113696 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xp7gh" Jan 22 14:32:39 crc kubenswrapper[4743]: I0122 14:32:39.126835 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xp7gh"] Jan 22 14:32:39 crc kubenswrapper[4743]: I0122 14:32:39.171201 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fb03fb7-cddd-456c-a51d-08c709fa9307-catalog-content\") pod \"community-operators-xp7gh\" (UID: \"0fb03fb7-cddd-456c-a51d-08c709fa9307\") " pod="openshift-marketplace/community-operators-xp7gh" Jan 22 14:32:39 crc kubenswrapper[4743]: I0122 14:32:39.171242 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fb03fb7-cddd-456c-a51d-08c709fa9307-utilities\") pod \"community-operators-xp7gh\" (UID: \"0fb03fb7-cddd-456c-a51d-08c709fa9307\") " pod="openshift-marketplace/community-operators-xp7gh" Jan 22 14:32:39 crc kubenswrapper[4743]: I0122 14:32:39.171890 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgz2t\" (UniqueName: \"kubernetes.io/projected/0fb03fb7-cddd-456c-a51d-08c709fa9307-kube-api-access-hgz2t\") pod \"community-operators-xp7gh\" (UID: \"0fb03fb7-cddd-456c-a51d-08c709fa9307\") " pod="openshift-marketplace/community-operators-xp7gh" Jan 22 14:32:39 crc kubenswrapper[4743]: I0122 14:32:39.274187 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fb03fb7-cddd-456c-a51d-08c709fa9307-catalog-content\") pod \"community-operators-xp7gh\" (UID: \"0fb03fb7-cddd-456c-a51d-08c709fa9307\") " pod="openshift-marketplace/community-operators-xp7gh" Jan 22 14:32:39 crc kubenswrapper[4743]: I0122 14:32:39.274229 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fb03fb7-cddd-456c-a51d-08c709fa9307-utilities\") pod \"community-operators-xp7gh\" (UID: \"0fb03fb7-cddd-456c-a51d-08c709fa9307\") " pod="openshift-marketplace/community-operators-xp7gh" Jan 22 14:32:39 crc kubenswrapper[4743]: I0122 14:32:39.274338 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgz2t\" (UniqueName: \"kubernetes.io/projected/0fb03fb7-cddd-456c-a51d-08c709fa9307-kube-api-access-hgz2t\") pod \"community-operators-xp7gh\" (UID: \"0fb03fb7-cddd-456c-a51d-08c709fa9307\") " pod="openshift-marketplace/community-operators-xp7gh" Jan 22 14:32:39 crc kubenswrapper[4743]: I0122 14:32:39.274727 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fb03fb7-cddd-456c-a51d-08c709fa9307-utilities\") pod \"community-operators-xp7gh\" (UID: \"0fb03fb7-cddd-456c-a51d-08c709fa9307\") " pod="openshift-marketplace/community-operators-xp7gh" Jan 22 14:32:39 crc kubenswrapper[4743]: I0122 14:32:39.274852 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fb03fb7-cddd-456c-a51d-08c709fa9307-catalog-content\") pod \"community-operators-xp7gh\" (UID: \"0fb03fb7-cddd-456c-a51d-08c709fa9307\") " pod="openshift-marketplace/community-operators-xp7gh" Jan 22 14:32:39 crc kubenswrapper[4743]: I0122 14:32:39.310830 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-hgz2t\" (UniqueName: \"kubernetes.io/projected/0fb03fb7-cddd-456c-a51d-08c709fa9307-kube-api-access-hgz2t\") pod \"community-operators-xp7gh\" (UID: \"0fb03fb7-cddd-456c-a51d-08c709fa9307\") " pod="openshift-marketplace/community-operators-xp7gh" Jan 22 14:32:39 crc kubenswrapper[4743]: I0122 14:32:39.450808 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xp7gh" Jan 22 14:32:40 crc kubenswrapper[4743]: I0122 14:32:40.500270 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dsbvq"] Jan 22 14:32:40 crc kubenswrapper[4743]: I0122 14:32:40.502272 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dsbvq" Jan 22 14:32:40 crc kubenswrapper[4743]: I0122 14:32:40.511863 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dsbvq"] Jan 22 14:32:40 crc kubenswrapper[4743]: I0122 14:32:40.608870 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9dd977bd-9df7-42e9-9329-6e8d2336f22a-catalog-content\") pod \"redhat-marketplace-dsbvq\" (UID: \"9dd977bd-9df7-42e9-9329-6e8d2336f22a\") " pod="openshift-marketplace/redhat-marketplace-dsbvq" Jan 22 14:32:40 crc kubenswrapper[4743]: I0122 14:32:40.608935 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9dd977bd-9df7-42e9-9329-6e8d2336f22a-utilities\") pod \"redhat-marketplace-dsbvq\" (UID: \"9dd977bd-9df7-42e9-9329-6e8d2336f22a\") " pod="openshift-marketplace/redhat-marketplace-dsbvq" Jan 22 14:32:40 crc kubenswrapper[4743]: I0122 14:32:40.608999 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2b656\" (UniqueName: \"kubernetes.io/projected/9dd977bd-9df7-42e9-9329-6e8d2336f22a-kube-api-access-2b656\") pod \"redhat-marketplace-dsbvq\" (UID: \"9dd977bd-9df7-42e9-9329-6e8d2336f22a\") " pod="openshift-marketplace/redhat-marketplace-dsbvq" Jan 22 14:32:40 crc kubenswrapper[4743]: I0122 14:32:40.713062 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9dd977bd-9df7-42e9-9329-6e8d2336f22a-catalog-content\") pod \"redhat-marketplace-dsbvq\" (UID: \"9dd977bd-9df7-42e9-9329-6e8d2336f22a\") " pod="openshift-marketplace/redhat-marketplace-dsbvq" Jan 22 14:32:40 crc kubenswrapper[4743]: I0122 14:32:40.713147 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9dd977bd-9df7-42e9-9329-6e8d2336f22a-utilities\") pod \"redhat-marketplace-dsbvq\" (UID: \"9dd977bd-9df7-42e9-9329-6e8d2336f22a\") " pod="openshift-marketplace/redhat-marketplace-dsbvq" Jan 22 14:32:40 crc kubenswrapper[4743]: I0122 14:32:40.713259 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2b656\" (UniqueName: \"kubernetes.io/projected/9dd977bd-9df7-42e9-9329-6e8d2336f22a-kube-api-access-2b656\") pod \"redhat-marketplace-dsbvq\" (UID: \"9dd977bd-9df7-42e9-9329-6e8d2336f22a\") " pod="openshift-marketplace/redhat-marketplace-dsbvq" Jan 22 14:32:40 crc kubenswrapper[4743]: I0122 14:32:40.713816 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9dd977bd-9df7-42e9-9329-6e8d2336f22a-utilities\") pod \"redhat-marketplace-dsbvq\" (UID: \"9dd977bd-9df7-42e9-9329-6e8d2336f22a\") " pod="openshift-marketplace/redhat-marketplace-dsbvq" Jan 22 14:32:40 crc kubenswrapper[4743]: I0122 14:32:40.713815 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9dd977bd-9df7-42e9-9329-6e8d2336f22a-catalog-content\") pod \"redhat-marketplace-dsbvq\" (UID: \"9dd977bd-9df7-42e9-9329-6e8d2336f22a\") " pod="openshift-marketplace/redhat-marketplace-dsbvq" Jan 22 14:32:40 crc kubenswrapper[4743]: I0122 14:32:40.733593 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2b656\" (UniqueName: \"kubernetes.io/projected/9dd977bd-9df7-42e9-9329-6e8d2336f22a-kube-api-access-2b656\") pod \"redhat-marketplace-dsbvq\" (UID: \"9dd977bd-9df7-42e9-9329-6e8d2336f22a\") " pod="openshift-marketplace/redhat-marketplace-dsbvq" Jan 22 14:32:40 crc kubenswrapper[4743]: I0122 14:32:40.842724 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dsbvq" Jan 22 14:32:43 crc kubenswrapper[4743]: E0122 14:32:43.193749 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Jan 22 14:32:43 crc kubenswrapper[4743]: E0122 14:32:43.194540 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lb4df,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,
SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(dca0d9c1-5628-4b93-9696-f9d455c70f31): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 22 14:32:43 crc kubenswrapper[4743]: E0122 14:32:43.195733 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="dca0d9c1-5628-4b93-9696-f9d455c70f31" Jan 22 14:32:43 crc kubenswrapper[4743]: I0122 14:32:43.587817 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xp7gh"] Jan 22 14:32:43 crc kubenswrapper[4743]: I0122 14:32:43.659635 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dsbvq"] Jan 22 14:32:43 crc kubenswrapper[4743]: W0122 14:32:43.680051 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9dd977bd_9df7_42e9_9329_6e8d2336f22a.slice/crio-2f11f020da50dd0bfb3a71e8c7b5ce9524208817bef76d3097216e8e286fafe2 WatchSource:0}: Error finding container 2f11f020da50dd0bfb3a71e8c7b5ce9524208817bef76d3097216e8e286fafe2: Status 404 returned error can't find the container with id 2f11f020da50dd0bfb3a71e8c7b5ce9524208817bef76d3097216e8e286fafe2 Jan 22 14:32:44 crc kubenswrapper[4743]: I0122 14:32:44.085090 4743 generic.go:334] "Generic (PLEG): container finished" podID="0fb03fb7-cddd-456c-a51d-08c709fa9307" containerID="850d22ef7bba6b262e505a96d47350b57ba004bbac5eaa94996c2727ede10d4b" exitCode=0 Jan 22 14:32:44 crc kubenswrapper[4743]: I0122 14:32:44.085175 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xp7gh" event={"ID":"0fb03fb7-cddd-456c-a51d-08c709fa9307","Type":"ContainerDied","Data":"850d22ef7bba6b262e505a96d47350b57ba004bbac5eaa94996c2727ede10d4b"} Jan 22 14:32:44 crc kubenswrapper[4743]: I0122 14:32:44.085207 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xp7gh" event={"ID":"0fb03fb7-cddd-456c-a51d-08c709fa9307","Type":"ContainerStarted","Data":"17b85ee9ad75fc9398346307cc24418f2d5d9ae09113148f78247eb0714ece72"} Jan 22 14:32:44 crc kubenswrapper[4743]: I0122 14:32:44.093953 4743 generic.go:334] "Generic (PLEG): container 
finished" podID="9dd977bd-9df7-42e9-9329-6e8d2336f22a" containerID="ca5c1fc23c418bf7338e6785a3315c30c1c084e2124dfe658bad7e429513955e" exitCode=0 Jan 22 14:32:44 crc kubenswrapper[4743]: I0122 14:32:44.094574 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dsbvq" event={"ID":"9dd977bd-9df7-42e9-9329-6e8d2336f22a","Type":"ContainerDied","Data":"ca5c1fc23c418bf7338e6785a3315c30c1c084e2124dfe658bad7e429513955e"} Jan 22 14:32:44 crc kubenswrapper[4743]: I0122 14:32:44.094608 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dsbvq" event={"ID":"9dd977bd-9df7-42e9-9329-6e8d2336f22a","Type":"ContainerStarted","Data":"2f11f020da50dd0bfb3a71e8c7b5ce9524208817bef76d3097216e8e286fafe2"} Jan 22 14:32:44 crc kubenswrapper[4743]: E0122 14:32:44.096762 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="dca0d9c1-5628-4b93-9696-f9d455c70f31" Jan 22 14:32:45 crc kubenswrapper[4743]: I0122 14:32:45.115760 4743 generic.go:334] "Generic (PLEG): container finished" podID="9dd977bd-9df7-42e9-9329-6e8d2336f22a" containerID="810957081c7c129836bb2741b5c9eae7061bd1714896c57efbf657db34c6bc5a" exitCode=0 Jan 22 14:32:45 crc kubenswrapper[4743]: I0122 14:32:45.115829 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dsbvq" event={"ID":"9dd977bd-9df7-42e9-9329-6e8d2336f22a","Type":"ContainerDied","Data":"810957081c7c129836bb2741b5c9eae7061bd1714896c57efbf657db34c6bc5a"} Jan 22 14:32:46 crc kubenswrapper[4743]: I0122 14:32:46.127061 4743 generic.go:334] "Generic (PLEG): container finished" podID="0fb03fb7-cddd-456c-a51d-08c709fa9307" containerID="01018610be53c50d106866e039f109d244aff30e545d161d75d02a6eee56d4eb" exitCode=0 Jan 22 14:32:46 crc kubenswrapper[4743]: I0122 14:32:46.127645 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xp7gh" event={"ID":"0fb03fb7-cddd-456c-a51d-08c709fa9307","Type":"ContainerDied","Data":"01018610be53c50d106866e039f109d244aff30e545d161d75d02a6eee56d4eb"} Jan 22 14:32:46 crc kubenswrapper[4743]: I0122 14:32:46.130884 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dsbvq" event={"ID":"9dd977bd-9df7-42e9-9329-6e8d2336f22a","Type":"ContainerStarted","Data":"693d0b98a30b886de99227e66bb663a07abed94c0b57fd90830fe3a86b5352e1"} Jan 22 14:32:46 crc kubenswrapper[4743]: I0122 14:32:46.176071 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dsbvq" podStartSLOduration=4.765141092 podStartE2EDuration="6.176046948s" podCreationTimestamp="2026-01-22 14:32:40 +0000 UTC" firstStartedPulling="2026-01-22 14:32:44.097615634 +0000 UTC m=+2800.652658807" lastFinishedPulling="2026-01-22 14:32:45.5085215 +0000 UTC m=+2802.063564663" observedRunningTime="2026-01-22 14:32:46.165255708 +0000 UTC m=+2802.720298871" watchObservedRunningTime="2026-01-22 14:32:46.176046948 +0000 UTC m=+2802.731090121" Jan 22 14:32:47 crc kubenswrapper[4743]: I0122 14:32:47.147198 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xp7gh" 
event={"ID":"0fb03fb7-cddd-456c-a51d-08c709fa9307","Type":"ContainerStarted","Data":"511e064e3eb37233c98a4b4f7b5fd7242454a8ab6d1edd888b62f71eb1b90ee5"} Jan 22 14:32:47 crc kubenswrapper[4743]: I0122 14:32:47.186492 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xp7gh" podStartSLOduration=5.73903404 podStartE2EDuration="8.186474431s" podCreationTimestamp="2026-01-22 14:32:39 +0000 UTC" firstStartedPulling="2026-01-22 14:32:44.087173733 +0000 UTC m=+2800.642216916" lastFinishedPulling="2026-01-22 14:32:46.534614144 +0000 UTC m=+2803.089657307" observedRunningTime="2026-01-22 14:32:47.179421341 +0000 UTC m=+2803.734464504" watchObservedRunningTime="2026-01-22 14:32:47.186474431 +0000 UTC m=+2803.741517594" Jan 22 14:32:49 crc kubenswrapper[4743]: I0122 14:32:49.451628 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xp7gh" Jan 22 14:32:49 crc kubenswrapper[4743]: I0122 14:32:49.452938 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xp7gh" Jan 22 14:32:49 crc kubenswrapper[4743]: I0122 14:32:49.494361 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xp7gh" Jan 22 14:32:50 crc kubenswrapper[4743]: I0122 14:32:50.848368 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dsbvq" Jan 22 14:32:50 crc kubenswrapper[4743]: I0122 14:32:50.848732 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dsbvq" Jan 22 14:32:50 crc kubenswrapper[4743]: I0122 14:32:50.959255 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dsbvq" Jan 22 14:32:51 crc kubenswrapper[4743]: I0122 14:32:51.221140 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dsbvq" Jan 22 14:32:52 crc kubenswrapper[4743]: I0122 14:32:52.105353 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dsbvq"] Jan 22 14:32:53 crc kubenswrapper[4743]: I0122 14:32:53.201895 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dsbvq" podUID="9dd977bd-9df7-42e9-9329-6e8d2336f22a" containerName="registry-server" containerID="cri-o://693d0b98a30b886de99227e66bb663a07abed94c0b57fd90830fe3a86b5352e1" gracePeriod=2 Jan 22 14:32:54 crc kubenswrapper[4743]: I0122 14:32:54.158401 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dsbvq" Jan 22 14:32:54 crc kubenswrapper[4743]: I0122 14:32:54.213202 4743 generic.go:334] "Generic (PLEG): container finished" podID="9dd977bd-9df7-42e9-9329-6e8d2336f22a" containerID="693d0b98a30b886de99227e66bb663a07abed94c0b57fd90830fe3a86b5352e1" exitCode=0 Jan 22 14:32:54 crc kubenswrapper[4743]: I0122 14:32:54.213269 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dsbvq" event={"ID":"9dd977bd-9df7-42e9-9329-6e8d2336f22a","Type":"ContainerDied","Data":"693d0b98a30b886de99227e66bb663a07abed94c0b57fd90830fe3a86b5352e1"} Jan 22 14:32:54 crc kubenswrapper[4743]: I0122 14:32:54.213314 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dsbvq" event={"ID":"9dd977bd-9df7-42e9-9329-6e8d2336f22a","Type":"ContainerDied","Data":"2f11f020da50dd0bfb3a71e8c7b5ce9524208817bef76d3097216e8e286fafe2"} Jan 22 14:32:54 crc kubenswrapper[4743]: I0122 14:32:54.213333 4743 scope.go:117] "RemoveContainer" containerID="693d0b98a30b886de99227e66bb663a07abed94c0b57fd90830fe3a86b5352e1" Jan 22 14:32:54 crc kubenswrapper[4743]: I0122 14:32:54.213504 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dsbvq" Jan 22 14:32:54 crc kubenswrapper[4743]: I0122 14:32:54.233259 4743 scope.go:117] "RemoveContainer" containerID="810957081c7c129836bb2741b5c9eae7061bd1714896c57efbf657db34c6bc5a" Jan 22 14:32:54 crc kubenswrapper[4743]: I0122 14:32:54.253846 4743 scope.go:117] "RemoveContainer" containerID="ca5c1fc23c418bf7338e6785a3315c30c1c084e2124dfe658bad7e429513955e" Jan 22 14:32:54 crc kubenswrapper[4743]: I0122 14:32:54.273397 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2b656\" (UniqueName: \"kubernetes.io/projected/9dd977bd-9df7-42e9-9329-6e8d2336f22a-kube-api-access-2b656\") pod \"9dd977bd-9df7-42e9-9329-6e8d2336f22a\" (UID: \"9dd977bd-9df7-42e9-9329-6e8d2336f22a\") " Jan 22 14:32:54 crc kubenswrapper[4743]: I0122 14:32:54.273773 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9dd977bd-9df7-42e9-9329-6e8d2336f22a-utilities\") pod \"9dd977bd-9df7-42e9-9329-6e8d2336f22a\" (UID: \"9dd977bd-9df7-42e9-9329-6e8d2336f22a\") " Jan 22 14:32:54 crc kubenswrapper[4743]: I0122 14:32:54.273989 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9dd977bd-9df7-42e9-9329-6e8d2336f22a-catalog-content\") pod \"9dd977bd-9df7-42e9-9329-6e8d2336f22a\" (UID: \"9dd977bd-9df7-42e9-9329-6e8d2336f22a\") " Jan 22 14:32:54 crc kubenswrapper[4743]: I0122 14:32:54.275221 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9dd977bd-9df7-42e9-9329-6e8d2336f22a-utilities" (OuterVolumeSpecName: "utilities") pod "9dd977bd-9df7-42e9-9329-6e8d2336f22a" (UID: "9dd977bd-9df7-42e9-9329-6e8d2336f22a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:32:54 crc kubenswrapper[4743]: I0122 14:32:54.280650 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9dd977bd-9df7-42e9-9329-6e8d2336f22a-kube-api-access-2b656" (OuterVolumeSpecName: "kube-api-access-2b656") pod "9dd977bd-9df7-42e9-9329-6e8d2336f22a" (UID: "9dd977bd-9df7-42e9-9329-6e8d2336f22a"). InnerVolumeSpecName "kube-api-access-2b656". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:32:54 crc kubenswrapper[4743]: I0122 14:32:54.298408 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9dd977bd-9df7-42e9-9329-6e8d2336f22a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9dd977bd-9df7-42e9-9329-6e8d2336f22a" (UID: "9dd977bd-9df7-42e9-9329-6e8d2336f22a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:32:54 crc kubenswrapper[4743]: I0122 14:32:54.300299 4743 scope.go:117] "RemoveContainer" containerID="693d0b98a30b886de99227e66bb663a07abed94c0b57fd90830fe3a86b5352e1" Jan 22 14:32:54 crc kubenswrapper[4743]: E0122 14:32:54.301205 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"693d0b98a30b886de99227e66bb663a07abed94c0b57fd90830fe3a86b5352e1\": container with ID starting with 693d0b98a30b886de99227e66bb663a07abed94c0b57fd90830fe3a86b5352e1 not found: ID does not exist" containerID="693d0b98a30b886de99227e66bb663a07abed94c0b57fd90830fe3a86b5352e1" Jan 22 14:32:54 crc kubenswrapper[4743]: I0122 14:32:54.301241 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"693d0b98a30b886de99227e66bb663a07abed94c0b57fd90830fe3a86b5352e1"} err="failed to get container status \"693d0b98a30b886de99227e66bb663a07abed94c0b57fd90830fe3a86b5352e1\": rpc error: code = NotFound desc = could not find container \"693d0b98a30b886de99227e66bb663a07abed94c0b57fd90830fe3a86b5352e1\": container with ID starting with 693d0b98a30b886de99227e66bb663a07abed94c0b57fd90830fe3a86b5352e1 not found: ID does not exist" Jan 22 14:32:54 crc kubenswrapper[4743]: I0122 14:32:54.301262 4743 scope.go:117] "RemoveContainer" containerID="810957081c7c129836bb2741b5c9eae7061bd1714896c57efbf657db34c6bc5a" Jan 22 14:32:54 crc kubenswrapper[4743]: E0122 14:32:54.301625 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"810957081c7c129836bb2741b5c9eae7061bd1714896c57efbf657db34c6bc5a\": container with ID starting with 810957081c7c129836bb2741b5c9eae7061bd1714896c57efbf657db34c6bc5a not found: ID does not exist" containerID="810957081c7c129836bb2741b5c9eae7061bd1714896c57efbf657db34c6bc5a" Jan 22 14:32:54 crc kubenswrapper[4743]: I0122 14:32:54.301663 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"810957081c7c129836bb2741b5c9eae7061bd1714896c57efbf657db34c6bc5a"} err="failed to get container status \"810957081c7c129836bb2741b5c9eae7061bd1714896c57efbf657db34c6bc5a\": rpc error: code = NotFound desc = could not find container \"810957081c7c129836bb2741b5c9eae7061bd1714896c57efbf657db34c6bc5a\": container with ID starting with 810957081c7c129836bb2741b5c9eae7061bd1714896c57efbf657db34c6bc5a not found: ID does not exist" Jan 22 14:32:54 crc kubenswrapper[4743]: I0122 14:32:54.301691 4743 scope.go:117] "RemoveContainer" 
containerID="ca5c1fc23c418bf7338e6785a3315c30c1c084e2124dfe658bad7e429513955e" Jan 22 14:32:54 crc kubenswrapper[4743]: E0122 14:32:54.302030 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca5c1fc23c418bf7338e6785a3315c30c1c084e2124dfe658bad7e429513955e\": container with ID starting with ca5c1fc23c418bf7338e6785a3315c30c1c084e2124dfe658bad7e429513955e not found: ID does not exist" containerID="ca5c1fc23c418bf7338e6785a3315c30c1c084e2124dfe658bad7e429513955e" Jan 22 14:32:54 crc kubenswrapper[4743]: I0122 14:32:54.302060 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca5c1fc23c418bf7338e6785a3315c30c1c084e2124dfe658bad7e429513955e"} err="failed to get container status \"ca5c1fc23c418bf7338e6785a3315c30c1c084e2124dfe658bad7e429513955e\": rpc error: code = NotFound desc = could not find container \"ca5c1fc23c418bf7338e6785a3315c30c1c084e2124dfe658bad7e429513955e\": container with ID starting with ca5c1fc23c418bf7338e6785a3315c30c1c084e2124dfe658bad7e429513955e not found: ID does not exist" Jan 22 14:32:54 crc kubenswrapper[4743]: I0122 14:32:54.376332 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9dd977bd-9df7-42e9-9329-6e8d2336f22a-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 14:32:54 crc kubenswrapper[4743]: I0122 14:32:54.376380 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2b656\" (UniqueName: \"kubernetes.io/projected/9dd977bd-9df7-42e9-9329-6e8d2336f22a-kube-api-access-2b656\") on node \"crc\" DevicePath \"\"" Jan 22 14:32:54 crc kubenswrapper[4743]: I0122 14:32:54.376396 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9dd977bd-9df7-42e9-9329-6e8d2336f22a-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 14:32:54 crc kubenswrapper[4743]: I0122 14:32:54.555423 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dsbvq"] Jan 22 14:32:54 crc kubenswrapper[4743]: I0122 14:32:54.568712 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dsbvq"] Jan 22 14:32:55 crc kubenswrapper[4743]: I0122 14:32:55.764687 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9dd977bd-9df7-42e9-9329-6e8d2336f22a" path="/var/lib/kubelet/pods/9dd977bd-9df7-42e9-9329-6e8d2336f22a/volumes" Jan 22 14:32:59 crc kubenswrapper[4743]: I0122 14:32:59.256947 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"dca0d9c1-5628-4b93-9696-f9d455c70f31","Type":"ContainerStarted","Data":"1568d211e2187ed9598521851ed4623ff5911d54748b07051f13edcc795b98a6"} Jan 22 14:32:59 crc kubenswrapper[4743]: I0122 14:32:59.280452 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.447607671 podStartE2EDuration="45.280428693s" podCreationTimestamp="2026-01-22 14:32:14 +0000 UTC" firstStartedPulling="2026-01-22 14:32:16.378670655 +0000 UTC m=+2772.933713808" lastFinishedPulling="2026-01-22 14:32:58.211491667 +0000 UTC m=+2814.766534830" observedRunningTime="2026-01-22 14:32:59.275000057 +0000 UTC m=+2815.830043220" watchObservedRunningTime="2026-01-22 14:32:59.280428693 +0000 UTC m=+2815.835471856" Jan 22 14:32:59 crc kubenswrapper[4743]: I0122 14:32:59.518734 4743 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xp7gh" Jan 22 14:32:59 crc kubenswrapper[4743]: I0122 14:32:59.583646 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xp7gh"] Jan 22 14:33:00 crc kubenswrapper[4743]: I0122 14:33:00.049592 4743 patch_prober.go:28] interesting pod/machine-config-daemon-hqgk7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 14:33:00 crc kubenswrapper[4743]: I0122 14:33:00.049970 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 14:33:00 crc kubenswrapper[4743]: I0122 14:33:00.266140 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xp7gh" podUID="0fb03fb7-cddd-456c-a51d-08c709fa9307" containerName="registry-server" containerID="cri-o://511e064e3eb37233c98a4b4f7b5fd7242454a8ab6d1edd888b62f71eb1b90ee5" gracePeriod=2 Jan 22 14:33:00 crc kubenswrapper[4743]: I0122 14:33:00.757664 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xp7gh" Jan 22 14:33:00 crc kubenswrapper[4743]: I0122 14:33:00.809132 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fb03fb7-cddd-456c-a51d-08c709fa9307-catalog-content\") pod \"0fb03fb7-cddd-456c-a51d-08c709fa9307\" (UID: \"0fb03fb7-cddd-456c-a51d-08c709fa9307\") " Jan 22 14:33:00 crc kubenswrapper[4743]: I0122 14:33:00.865688 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fb03fb7-cddd-456c-a51d-08c709fa9307-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0fb03fb7-cddd-456c-a51d-08c709fa9307" (UID: "0fb03fb7-cddd-456c-a51d-08c709fa9307"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:33:00 crc kubenswrapper[4743]: I0122 14:33:00.911056 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hgz2t\" (UniqueName: \"kubernetes.io/projected/0fb03fb7-cddd-456c-a51d-08c709fa9307-kube-api-access-hgz2t\") pod \"0fb03fb7-cddd-456c-a51d-08c709fa9307\" (UID: \"0fb03fb7-cddd-456c-a51d-08c709fa9307\") " Jan 22 14:33:00 crc kubenswrapper[4743]: I0122 14:33:00.911219 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fb03fb7-cddd-456c-a51d-08c709fa9307-utilities\") pod \"0fb03fb7-cddd-456c-a51d-08c709fa9307\" (UID: \"0fb03fb7-cddd-456c-a51d-08c709fa9307\") " Jan 22 14:33:00 crc kubenswrapper[4743]: I0122 14:33:00.911759 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fb03fb7-cddd-456c-a51d-08c709fa9307-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 14:33:00 crc kubenswrapper[4743]: I0122 14:33:00.912590 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fb03fb7-cddd-456c-a51d-08c709fa9307-utilities" (OuterVolumeSpecName: "utilities") pod "0fb03fb7-cddd-456c-a51d-08c709fa9307" (UID: "0fb03fb7-cddd-456c-a51d-08c709fa9307"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:33:00 crc kubenswrapper[4743]: I0122 14:33:00.923296 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fb03fb7-cddd-456c-a51d-08c709fa9307-kube-api-access-hgz2t" (OuterVolumeSpecName: "kube-api-access-hgz2t") pod "0fb03fb7-cddd-456c-a51d-08c709fa9307" (UID: "0fb03fb7-cddd-456c-a51d-08c709fa9307"). InnerVolumeSpecName "kube-api-access-hgz2t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:33:01 crc kubenswrapper[4743]: I0122 14:33:01.014198 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hgz2t\" (UniqueName: \"kubernetes.io/projected/0fb03fb7-cddd-456c-a51d-08c709fa9307-kube-api-access-hgz2t\") on node \"crc\" DevicePath \"\"" Jan 22 14:33:01 crc kubenswrapper[4743]: I0122 14:33:01.014230 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fb03fb7-cddd-456c-a51d-08c709fa9307-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 14:33:01 crc kubenswrapper[4743]: I0122 14:33:01.281045 4743 generic.go:334] "Generic (PLEG): container finished" podID="0fb03fb7-cddd-456c-a51d-08c709fa9307" containerID="511e064e3eb37233c98a4b4f7b5fd7242454a8ab6d1edd888b62f71eb1b90ee5" exitCode=0 Jan 22 14:33:01 crc kubenswrapper[4743]: I0122 14:33:01.281473 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xp7gh" event={"ID":"0fb03fb7-cddd-456c-a51d-08c709fa9307","Type":"ContainerDied","Data":"511e064e3eb37233c98a4b4f7b5fd7242454a8ab6d1edd888b62f71eb1b90ee5"} Jan 22 14:33:01 crc kubenswrapper[4743]: I0122 14:33:01.281514 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xp7gh" event={"ID":"0fb03fb7-cddd-456c-a51d-08c709fa9307","Type":"ContainerDied","Data":"17b85ee9ad75fc9398346307cc24418f2d5d9ae09113148f78247eb0714ece72"} Jan 22 14:33:01 crc kubenswrapper[4743]: I0122 14:33:01.281543 4743 scope.go:117] "RemoveContainer" containerID="511e064e3eb37233c98a4b4f7b5fd7242454a8ab6d1edd888b62f71eb1b90ee5" Jan 22 14:33:01 crc kubenswrapper[4743]: I0122 14:33:01.281766 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xp7gh" Jan 22 14:33:01 crc kubenswrapper[4743]: I0122 14:33:01.338343 4743 scope.go:117] "RemoveContainer" containerID="01018610be53c50d106866e039f109d244aff30e545d161d75d02a6eee56d4eb" Jan 22 14:33:01 crc kubenswrapper[4743]: I0122 14:33:01.353306 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xp7gh"] Jan 22 14:33:01 crc kubenswrapper[4743]: I0122 14:33:01.361469 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xp7gh"] Jan 22 14:33:01 crc kubenswrapper[4743]: I0122 14:33:01.399298 4743 scope.go:117] "RemoveContainer" containerID="850d22ef7bba6b262e505a96d47350b57ba004bbac5eaa94996c2727ede10d4b" Jan 22 14:33:01 crc kubenswrapper[4743]: I0122 14:33:01.496731 4743 scope.go:117] "RemoveContainer" containerID="511e064e3eb37233c98a4b4f7b5fd7242454a8ab6d1edd888b62f71eb1b90ee5" Jan 22 14:33:01 crc kubenswrapper[4743]: E0122 14:33:01.497118 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"511e064e3eb37233c98a4b4f7b5fd7242454a8ab6d1edd888b62f71eb1b90ee5\": container with ID starting with 511e064e3eb37233c98a4b4f7b5fd7242454a8ab6d1edd888b62f71eb1b90ee5 not found: ID does not exist" containerID="511e064e3eb37233c98a4b4f7b5fd7242454a8ab6d1edd888b62f71eb1b90ee5" Jan 22 14:33:01 crc kubenswrapper[4743]: I0122 14:33:01.497164 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"511e064e3eb37233c98a4b4f7b5fd7242454a8ab6d1edd888b62f71eb1b90ee5"} err="failed to get container status \"511e064e3eb37233c98a4b4f7b5fd7242454a8ab6d1edd888b62f71eb1b90ee5\": rpc error: code = NotFound desc = could not find container \"511e064e3eb37233c98a4b4f7b5fd7242454a8ab6d1edd888b62f71eb1b90ee5\": container with ID starting with 511e064e3eb37233c98a4b4f7b5fd7242454a8ab6d1edd888b62f71eb1b90ee5 not found: ID does not exist" Jan 22 14:33:01 crc kubenswrapper[4743]: I0122 14:33:01.497190 4743 scope.go:117] "RemoveContainer" containerID="01018610be53c50d106866e039f109d244aff30e545d161d75d02a6eee56d4eb" Jan 22 14:33:01 crc kubenswrapper[4743]: E0122 14:33:01.497518 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01018610be53c50d106866e039f109d244aff30e545d161d75d02a6eee56d4eb\": container with ID starting with 01018610be53c50d106866e039f109d244aff30e545d161d75d02a6eee56d4eb not found: ID does not exist" containerID="01018610be53c50d106866e039f109d244aff30e545d161d75d02a6eee56d4eb" Jan 22 14:33:01 crc kubenswrapper[4743]: I0122 14:33:01.497538 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01018610be53c50d106866e039f109d244aff30e545d161d75d02a6eee56d4eb"} err="failed to get container status \"01018610be53c50d106866e039f109d244aff30e545d161d75d02a6eee56d4eb\": rpc error: code = NotFound desc = could not find container \"01018610be53c50d106866e039f109d244aff30e545d161d75d02a6eee56d4eb\": container with ID starting with 01018610be53c50d106866e039f109d244aff30e545d161d75d02a6eee56d4eb not found: ID does not exist" Jan 22 14:33:01 crc kubenswrapper[4743]: I0122 14:33:01.497550 4743 scope.go:117] "RemoveContainer" containerID="850d22ef7bba6b262e505a96d47350b57ba004bbac5eaa94996c2727ede10d4b" Jan 22 14:33:01 crc kubenswrapper[4743]: E0122 14:33:01.497764 4743 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"850d22ef7bba6b262e505a96d47350b57ba004bbac5eaa94996c2727ede10d4b\": container with ID starting with 850d22ef7bba6b262e505a96d47350b57ba004bbac5eaa94996c2727ede10d4b not found: ID does not exist" containerID="850d22ef7bba6b262e505a96d47350b57ba004bbac5eaa94996c2727ede10d4b" Jan 22 14:33:01 crc kubenswrapper[4743]: I0122 14:33:01.497781 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"850d22ef7bba6b262e505a96d47350b57ba004bbac5eaa94996c2727ede10d4b"} err="failed to get container status \"850d22ef7bba6b262e505a96d47350b57ba004bbac5eaa94996c2727ede10d4b\": rpc error: code = NotFound desc = could not find container \"850d22ef7bba6b262e505a96d47350b57ba004bbac5eaa94996c2727ede10d4b\": container with ID starting with 850d22ef7bba6b262e505a96d47350b57ba004bbac5eaa94996c2727ede10d4b not found: ID does not exist" Jan 22 14:33:01 crc kubenswrapper[4743]: I0122 14:33:01.756130 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fb03fb7-cddd-456c-a51d-08c709fa9307" path="/var/lib/kubelet/pods/0fb03fb7-cddd-456c-a51d-08c709fa9307/volumes" Jan 22 14:33:30 crc kubenswrapper[4743]: I0122 14:33:30.049929 4743 patch_prober.go:28] interesting pod/machine-config-daemon-hqgk7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 14:33:30 crc kubenswrapper[4743]: I0122 14:33:30.050633 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 14:33:30 crc kubenswrapper[4743]: I0122 14:33:30.050711 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" Jan 22 14:33:30 crc kubenswrapper[4743]: I0122 14:33:30.052094 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"08d50f3e8a1fa4764eac2002ae36a086093cd35040b1458273d108d4bfc0e111"} pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 22 14:33:30 crc kubenswrapper[4743]: I0122 14:33:30.052219 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerName="machine-config-daemon" containerID="cri-o://08d50f3e8a1fa4764eac2002ae36a086093cd35040b1458273d108d4bfc0e111" gracePeriod=600 Jan 22 14:33:30 crc kubenswrapper[4743]: E0122 14:33:30.187682 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:33:30 crc kubenswrapper[4743]: I0122 14:33:30.549642 4743 generic.go:334] 
"Generic (PLEG): container finished" podID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerID="08d50f3e8a1fa4764eac2002ae36a086093cd35040b1458273d108d4bfc0e111" exitCode=0 Jan 22 14:33:30 crc kubenswrapper[4743]: I0122 14:33:30.549706 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" event={"ID":"3aeba9ba-3a5a-4885-8540-d295aadb311b","Type":"ContainerDied","Data":"08d50f3e8a1fa4764eac2002ae36a086093cd35040b1458273d108d4bfc0e111"} Jan 22 14:33:30 crc kubenswrapper[4743]: I0122 14:33:30.549809 4743 scope.go:117] "RemoveContainer" containerID="61bf6edec840dc0b1f7bf3e3135ac897251d422ba0aaf1a740277a160551f688" Jan 22 14:33:30 crc kubenswrapper[4743]: I0122 14:33:30.550650 4743 scope.go:117] "RemoveContainer" containerID="08d50f3e8a1fa4764eac2002ae36a086093cd35040b1458273d108d4bfc0e111" Jan 22 14:33:30 crc kubenswrapper[4743]: E0122 14:33:30.551157 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:33:44 crc kubenswrapper[4743]: I0122 14:33:44.747980 4743 scope.go:117] "RemoveContainer" containerID="08d50f3e8a1fa4764eac2002ae36a086093cd35040b1458273d108d4bfc0e111" Jan 22 14:33:44 crc kubenswrapper[4743]: E0122 14:33:44.748813 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:33:57 crc kubenswrapper[4743]: I0122 14:33:57.747506 4743 scope.go:117] "RemoveContainer" containerID="08d50f3e8a1fa4764eac2002ae36a086093cd35040b1458273d108d4bfc0e111" Jan 22 14:33:57 crc kubenswrapper[4743]: E0122 14:33:57.748228 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:34:11 crc kubenswrapper[4743]: I0122 14:34:11.748109 4743 scope.go:117] "RemoveContainer" containerID="08d50f3e8a1fa4764eac2002ae36a086093cd35040b1458273d108d4bfc0e111" Jan 22 14:34:11 crc kubenswrapper[4743]: E0122 14:34:11.749357 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:34:25 crc kubenswrapper[4743]: I0122 14:34:25.746762 4743 scope.go:117] "RemoveContainer" containerID="08d50f3e8a1fa4764eac2002ae36a086093cd35040b1458273d108d4bfc0e111" 
Jan 22 14:34:25 crc kubenswrapper[4743]: E0122 14:34:25.747615 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:34:40 crc kubenswrapper[4743]: I0122 14:34:40.747122 4743 scope.go:117] "RemoveContainer" containerID="08d50f3e8a1fa4764eac2002ae36a086093cd35040b1458273d108d4bfc0e111" Jan 22 14:34:40 crc kubenswrapper[4743]: E0122 14:34:40.747875 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:34:55 crc kubenswrapper[4743]: I0122 14:34:55.747590 4743 scope.go:117] "RemoveContainer" containerID="08d50f3e8a1fa4764eac2002ae36a086093cd35040b1458273d108d4bfc0e111" Jan 22 14:34:55 crc kubenswrapper[4743]: E0122 14:34:55.748319 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:35:09 crc kubenswrapper[4743]: I0122 14:35:09.748054 4743 scope.go:117] "RemoveContainer" containerID="08d50f3e8a1fa4764eac2002ae36a086093cd35040b1458273d108d4bfc0e111" Jan 22 14:35:09 crc kubenswrapper[4743]: E0122 14:35:09.749091 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:35:23 crc kubenswrapper[4743]: I0122 14:35:23.769180 4743 scope.go:117] "RemoveContainer" containerID="08d50f3e8a1fa4764eac2002ae36a086093cd35040b1458273d108d4bfc0e111" Jan 22 14:35:23 crc kubenswrapper[4743]: E0122 14:35:23.779714 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:35:34 crc kubenswrapper[4743]: I0122 14:35:34.748188 4743 scope.go:117] "RemoveContainer" containerID="08d50f3e8a1fa4764eac2002ae36a086093cd35040b1458273d108d4bfc0e111" Jan 22 14:35:34 crc kubenswrapper[4743]: E0122 14:35:34.749067 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:35:35 crc kubenswrapper[4743]: I0122 14:35:35.997501 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vlbks"] Jan 22 14:35:35 crc kubenswrapper[4743]: E0122 14:35:35.998193 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fb03fb7-cddd-456c-a51d-08c709fa9307" containerName="extract-utilities" Jan 22 14:35:35 crc kubenswrapper[4743]: I0122 14:35:35.998206 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fb03fb7-cddd-456c-a51d-08c709fa9307" containerName="extract-utilities" Jan 22 14:35:35 crc kubenswrapper[4743]: E0122 14:35:35.998235 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dd977bd-9df7-42e9-9329-6e8d2336f22a" containerName="extract-content" Jan 22 14:35:35 crc kubenswrapper[4743]: I0122 14:35:35.998241 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dd977bd-9df7-42e9-9329-6e8d2336f22a" containerName="extract-content" Jan 22 14:35:35 crc kubenswrapper[4743]: E0122 14:35:35.998257 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dd977bd-9df7-42e9-9329-6e8d2336f22a" containerName="extract-utilities" Jan 22 14:35:35 crc kubenswrapper[4743]: I0122 14:35:35.998263 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dd977bd-9df7-42e9-9329-6e8d2336f22a" containerName="extract-utilities" Jan 22 14:35:35 crc kubenswrapper[4743]: E0122 14:35:35.998273 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fb03fb7-cddd-456c-a51d-08c709fa9307" containerName="extract-content" Jan 22 14:35:35 crc kubenswrapper[4743]: I0122 14:35:35.998279 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fb03fb7-cddd-456c-a51d-08c709fa9307" containerName="extract-content" Jan 22 14:35:35 crc kubenswrapper[4743]: E0122 14:35:35.998291 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dd977bd-9df7-42e9-9329-6e8d2336f22a" containerName="registry-server" Jan 22 14:35:35 crc kubenswrapper[4743]: I0122 14:35:35.998301 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dd977bd-9df7-42e9-9329-6e8d2336f22a" containerName="registry-server" Jan 22 14:35:35 crc kubenswrapper[4743]: E0122 14:35:35.998312 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fb03fb7-cddd-456c-a51d-08c709fa9307" containerName="registry-server" Jan 22 14:35:35 crc kubenswrapper[4743]: I0122 14:35:35.998321 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fb03fb7-cddd-456c-a51d-08c709fa9307" containerName="registry-server" Jan 22 14:35:35 crc kubenswrapper[4743]: I0122 14:35:35.998508 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="9dd977bd-9df7-42e9-9329-6e8d2336f22a" containerName="registry-server" Jan 22 14:35:35 crc kubenswrapper[4743]: I0122 14:35:35.998535 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fb03fb7-cddd-456c-a51d-08c709fa9307" containerName="registry-server" Jan 22 14:35:35 crc kubenswrapper[4743]: I0122 14:35:35.999896 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vlbks" Jan 22 14:35:36 crc kubenswrapper[4743]: I0122 14:35:36.007889 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vlbks"] Jan 22 14:35:36 crc kubenswrapper[4743]: I0122 14:35:36.091180 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjd7t\" (UniqueName: \"kubernetes.io/projected/84ea4890-26cb-4257-acad-40f72d6dc19e-kube-api-access-hjd7t\") pod \"certified-operators-vlbks\" (UID: \"84ea4890-26cb-4257-acad-40f72d6dc19e\") " pod="openshift-marketplace/certified-operators-vlbks" Jan 22 14:35:36 crc kubenswrapper[4743]: I0122 14:35:36.091225 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84ea4890-26cb-4257-acad-40f72d6dc19e-utilities\") pod \"certified-operators-vlbks\" (UID: \"84ea4890-26cb-4257-acad-40f72d6dc19e\") " pod="openshift-marketplace/certified-operators-vlbks" Jan 22 14:35:36 crc kubenswrapper[4743]: I0122 14:35:36.091317 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84ea4890-26cb-4257-acad-40f72d6dc19e-catalog-content\") pod \"certified-operators-vlbks\" (UID: \"84ea4890-26cb-4257-acad-40f72d6dc19e\") " pod="openshift-marketplace/certified-operators-vlbks" Jan 22 14:35:36 crc kubenswrapper[4743]: I0122 14:35:36.193351 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84ea4890-26cb-4257-acad-40f72d6dc19e-catalog-content\") pod \"certified-operators-vlbks\" (UID: \"84ea4890-26cb-4257-acad-40f72d6dc19e\") " pod="openshift-marketplace/certified-operators-vlbks" Jan 22 14:35:36 crc kubenswrapper[4743]: I0122 14:35:36.193571 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjd7t\" (UniqueName: \"kubernetes.io/projected/84ea4890-26cb-4257-acad-40f72d6dc19e-kube-api-access-hjd7t\") pod \"certified-operators-vlbks\" (UID: \"84ea4890-26cb-4257-acad-40f72d6dc19e\") " pod="openshift-marketplace/certified-operators-vlbks" Jan 22 14:35:36 crc kubenswrapper[4743]: I0122 14:35:36.193617 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84ea4890-26cb-4257-acad-40f72d6dc19e-utilities\") pod \"certified-operators-vlbks\" (UID: \"84ea4890-26cb-4257-acad-40f72d6dc19e\") " pod="openshift-marketplace/certified-operators-vlbks" Jan 22 14:35:36 crc kubenswrapper[4743]: I0122 14:35:36.194067 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84ea4890-26cb-4257-acad-40f72d6dc19e-catalog-content\") pod \"certified-operators-vlbks\" (UID: \"84ea4890-26cb-4257-acad-40f72d6dc19e\") " pod="openshift-marketplace/certified-operators-vlbks" Jan 22 14:35:36 crc kubenswrapper[4743]: I0122 14:35:36.194146 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84ea4890-26cb-4257-acad-40f72d6dc19e-utilities\") pod \"certified-operators-vlbks\" (UID: \"84ea4890-26cb-4257-acad-40f72d6dc19e\") " pod="openshift-marketplace/certified-operators-vlbks" Jan 22 14:35:36 crc kubenswrapper[4743]: I0122 14:35:36.216947 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-hjd7t\" (UniqueName: \"kubernetes.io/projected/84ea4890-26cb-4257-acad-40f72d6dc19e-kube-api-access-hjd7t\") pod \"certified-operators-vlbks\" (UID: \"84ea4890-26cb-4257-acad-40f72d6dc19e\") " pod="openshift-marketplace/certified-operators-vlbks" Jan 22 14:35:36 crc kubenswrapper[4743]: I0122 14:35:36.326549 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vlbks" Jan 22 14:35:36 crc kubenswrapper[4743]: I0122 14:35:36.877604 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vlbks"] Jan 22 14:35:37 crc kubenswrapper[4743]: I0122 14:35:37.760536 4743 generic.go:334] "Generic (PLEG): container finished" podID="84ea4890-26cb-4257-acad-40f72d6dc19e" containerID="7ee45dd9ec01dc1ff8d555b554472056e5ec09cf19849629dff0ae94def60030" exitCode=0 Jan 22 14:35:37 crc kubenswrapper[4743]: I0122 14:35:37.761008 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vlbks" event={"ID":"84ea4890-26cb-4257-acad-40f72d6dc19e","Type":"ContainerDied","Data":"7ee45dd9ec01dc1ff8d555b554472056e5ec09cf19849629dff0ae94def60030"} Jan 22 14:35:37 crc kubenswrapper[4743]: I0122 14:35:37.761097 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vlbks" event={"ID":"84ea4890-26cb-4257-acad-40f72d6dc19e","Type":"ContainerStarted","Data":"ad30cef09ec827c1125bcce749228a49117a33eb57b0013b3d238bc1d416f97f"} Jan 22 14:35:38 crc kubenswrapper[4743]: I0122 14:35:38.770582 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vlbks" event={"ID":"84ea4890-26cb-4257-acad-40f72d6dc19e","Type":"ContainerStarted","Data":"80d40b0adcc5d43162af39f4dba1b502c910eba88f55f5e4f66922b948674f23"} Jan 22 14:35:39 crc kubenswrapper[4743]: I0122 14:35:39.789956 4743 generic.go:334] "Generic (PLEG): container finished" podID="84ea4890-26cb-4257-acad-40f72d6dc19e" containerID="80d40b0adcc5d43162af39f4dba1b502c910eba88f55f5e4f66922b948674f23" exitCode=0 Jan 22 14:35:39 crc kubenswrapper[4743]: I0122 14:35:39.790008 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vlbks" event={"ID":"84ea4890-26cb-4257-acad-40f72d6dc19e","Type":"ContainerDied","Data":"80d40b0adcc5d43162af39f4dba1b502c910eba88f55f5e4f66922b948674f23"} Jan 22 14:35:40 crc kubenswrapper[4743]: I0122 14:35:40.802215 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vlbks" event={"ID":"84ea4890-26cb-4257-acad-40f72d6dc19e","Type":"ContainerStarted","Data":"f71d4846583408ce90b48d98d464ce8ab47b344e568055f4e07f22e9e98834d1"} Jan 22 14:35:45 crc kubenswrapper[4743]: I0122 14:35:45.748394 4743 scope.go:117] "RemoveContainer" containerID="08d50f3e8a1fa4764eac2002ae36a086093cd35040b1458273d108d4bfc0e111" Jan 22 14:35:45 crc kubenswrapper[4743]: E0122 14:35:45.749437 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:35:46 crc kubenswrapper[4743]: I0122 14:35:46.327475 4743 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vlbks" Jan 22 14:35:46 crc kubenswrapper[4743]: I0122 14:35:46.327871 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vlbks" Jan 22 14:35:46 crc kubenswrapper[4743]: I0122 14:35:46.377001 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vlbks" Jan 22 14:35:46 crc kubenswrapper[4743]: I0122 14:35:46.398346 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vlbks" podStartSLOduration=8.989549448 podStartE2EDuration="11.398324549s" podCreationTimestamp="2026-01-22 14:35:35 +0000 UTC" firstStartedPulling="2026-01-22 14:35:37.764615702 +0000 UTC m=+2974.319658865" lastFinishedPulling="2026-01-22 14:35:40.173390803 +0000 UTC m=+2976.728433966" observedRunningTime="2026-01-22 14:35:40.823475157 +0000 UTC m=+2977.378518330" watchObservedRunningTime="2026-01-22 14:35:46.398324549 +0000 UTC m=+2982.953367712" Jan 22 14:35:46 crc kubenswrapper[4743]: I0122 14:35:46.898498 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vlbks" Jan 22 14:35:46 crc kubenswrapper[4743]: I0122 14:35:46.954847 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vlbks"] Jan 22 14:35:48 crc kubenswrapper[4743]: I0122 14:35:48.871201 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vlbks" podUID="84ea4890-26cb-4257-acad-40f72d6dc19e" containerName="registry-server" containerID="cri-o://f71d4846583408ce90b48d98d464ce8ab47b344e568055f4e07f22e9e98834d1" gracePeriod=2 Jan 22 14:35:49 crc kubenswrapper[4743]: I0122 14:35:49.416472 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vlbks" Jan 22 14:35:49 crc kubenswrapper[4743]: I0122 14:35:49.493007 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84ea4890-26cb-4257-acad-40f72d6dc19e-utilities\") pod \"84ea4890-26cb-4257-acad-40f72d6dc19e\" (UID: \"84ea4890-26cb-4257-acad-40f72d6dc19e\") " Jan 22 14:35:49 crc kubenswrapper[4743]: I0122 14:35:49.493162 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjd7t\" (UniqueName: \"kubernetes.io/projected/84ea4890-26cb-4257-acad-40f72d6dc19e-kube-api-access-hjd7t\") pod \"84ea4890-26cb-4257-acad-40f72d6dc19e\" (UID: \"84ea4890-26cb-4257-acad-40f72d6dc19e\") " Jan 22 14:35:49 crc kubenswrapper[4743]: I0122 14:35:49.493276 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84ea4890-26cb-4257-acad-40f72d6dc19e-catalog-content\") pod \"84ea4890-26cb-4257-acad-40f72d6dc19e\" (UID: \"84ea4890-26cb-4257-acad-40f72d6dc19e\") " Jan 22 14:35:49 crc kubenswrapper[4743]: I0122 14:35:49.494277 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84ea4890-26cb-4257-acad-40f72d6dc19e-utilities" (OuterVolumeSpecName: "utilities") pod "84ea4890-26cb-4257-acad-40f72d6dc19e" (UID: "84ea4890-26cb-4257-acad-40f72d6dc19e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:35:49 crc kubenswrapper[4743]: I0122 14:35:49.500452 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84ea4890-26cb-4257-acad-40f72d6dc19e-kube-api-access-hjd7t" (OuterVolumeSpecName: "kube-api-access-hjd7t") pod "84ea4890-26cb-4257-acad-40f72d6dc19e" (UID: "84ea4890-26cb-4257-acad-40f72d6dc19e"). InnerVolumeSpecName "kube-api-access-hjd7t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:35:49 crc kubenswrapper[4743]: I0122 14:35:49.543552 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84ea4890-26cb-4257-acad-40f72d6dc19e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "84ea4890-26cb-4257-acad-40f72d6dc19e" (UID: "84ea4890-26cb-4257-acad-40f72d6dc19e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:35:49 crc kubenswrapper[4743]: I0122 14:35:49.595311 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84ea4890-26cb-4257-acad-40f72d6dc19e-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 14:35:49 crc kubenswrapper[4743]: I0122 14:35:49.595354 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84ea4890-26cb-4257-acad-40f72d6dc19e-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 14:35:49 crc kubenswrapper[4743]: I0122 14:35:49.595364 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjd7t\" (UniqueName: \"kubernetes.io/projected/84ea4890-26cb-4257-acad-40f72d6dc19e-kube-api-access-hjd7t\") on node \"crc\" DevicePath \"\"" Jan 22 14:35:49 crc kubenswrapper[4743]: I0122 14:35:49.880813 4743 generic.go:334] "Generic (PLEG): container finished" podID="84ea4890-26cb-4257-acad-40f72d6dc19e" containerID="f71d4846583408ce90b48d98d464ce8ab47b344e568055f4e07f22e9e98834d1" exitCode=0 Jan 22 14:35:49 crc kubenswrapper[4743]: I0122 14:35:49.880860 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vlbks" Jan 22 14:35:49 crc kubenswrapper[4743]: I0122 14:35:49.880865 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vlbks" event={"ID":"84ea4890-26cb-4257-acad-40f72d6dc19e","Type":"ContainerDied","Data":"f71d4846583408ce90b48d98d464ce8ab47b344e568055f4e07f22e9e98834d1"} Jan 22 14:35:49 crc kubenswrapper[4743]: I0122 14:35:49.880979 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vlbks" event={"ID":"84ea4890-26cb-4257-acad-40f72d6dc19e","Type":"ContainerDied","Data":"ad30cef09ec827c1125bcce749228a49117a33eb57b0013b3d238bc1d416f97f"} Jan 22 14:35:49 crc kubenswrapper[4743]: I0122 14:35:49.881004 4743 scope.go:117] "RemoveContainer" containerID="f71d4846583408ce90b48d98d464ce8ab47b344e568055f4e07f22e9e98834d1" Jan 22 14:35:49 crc kubenswrapper[4743]: I0122 14:35:49.906994 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vlbks"] Jan 22 14:35:49 crc kubenswrapper[4743]: I0122 14:35:49.914222 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vlbks"] Jan 22 14:35:49 crc kubenswrapper[4743]: I0122 14:35:49.915669 4743 scope.go:117] "RemoveContainer" containerID="80d40b0adcc5d43162af39f4dba1b502c910eba88f55f5e4f66922b948674f23" Jan 22 14:35:49 crc kubenswrapper[4743]: I0122 14:35:49.945854 4743 scope.go:117] "RemoveContainer" containerID="7ee45dd9ec01dc1ff8d555b554472056e5ec09cf19849629dff0ae94def60030" Jan 22 14:35:50 crc kubenswrapper[4743]: I0122 14:35:50.005191 4743 scope.go:117] "RemoveContainer" containerID="f71d4846583408ce90b48d98d464ce8ab47b344e568055f4e07f22e9e98834d1" Jan 22 14:35:50 crc kubenswrapper[4743]: E0122 14:35:50.005595 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f71d4846583408ce90b48d98d464ce8ab47b344e568055f4e07f22e9e98834d1\": container with ID starting with f71d4846583408ce90b48d98d464ce8ab47b344e568055f4e07f22e9e98834d1 not found: ID does not exist" containerID="f71d4846583408ce90b48d98d464ce8ab47b344e568055f4e07f22e9e98834d1" Jan 22 14:35:50 crc kubenswrapper[4743]: I0122 14:35:50.005626 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f71d4846583408ce90b48d98d464ce8ab47b344e568055f4e07f22e9e98834d1"} err="failed to get container status \"f71d4846583408ce90b48d98d464ce8ab47b344e568055f4e07f22e9e98834d1\": rpc error: code = NotFound desc = could not find container \"f71d4846583408ce90b48d98d464ce8ab47b344e568055f4e07f22e9e98834d1\": container with ID starting with f71d4846583408ce90b48d98d464ce8ab47b344e568055f4e07f22e9e98834d1 not found: ID does not exist" Jan 22 14:35:50 crc kubenswrapper[4743]: I0122 14:35:50.005647 4743 scope.go:117] "RemoveContainer" containerID="80d40b0adcc5d43162af39f4dba1b502c910eba88f55f5e4f66922b948674f23" Jan 22 14:35:50 crc kubenswrapper[4743]: E0122 14:35:50.006114 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80d40b0adcc5d43162af39f4dba1b502c910eba88f55f5e4f66922b948674f23\": container with ID starting with 80d40b0adcc5d43162af39f4dba1b502c910eba88f55f5e4f66922b948674f23 not found: ID does not exist" containerID="80d40b0adcc5d43162af39f4dba1b502c910eba88f55f5e4f66922b948674f23" Jan 22 14:35:50 crc kubenswrapper[4743]: I0122 14:35:50.006140 4743 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80d40b0adcc5d43162af39f4dba1b502c910eba88f55f5e4f66922b948674f23"} err="failed to get container status \"80d40b0adcc5d43162af39f4dba1b502c910eba88f55f5e4f66922b948674f23\": rpc error: code = NotFound desc = could not find container \"80d40b0adcc5d43162af39f4dba1b502c910eba88f55f5e4f66922b948674f23\": container with ID starting with 80d40b0adcc5d43162af39f4dba1b502c910eba88f55f5e4f66922b948674f23 not found: ID does not exist" Jan 22 14:35:50 crc kubenswrapper[4743]: I0122 14:35:50.006153 4743 scope.go:117] "RemoveContainer" containerID="7ee45dd9ec01dc1ff8d555b554472056e5ec09cf19849629dff0ae94def60030" Jan 22 14:35:50 crc kubenswrapper[4743]: E0122 14:35:50.006384 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ee45dd9ec01dc1ff8d555b554472056e5ec09cf19849629dff0ae94def60030\": container with ID starting with 7ee45dd9ec01dc1ff8d555b554472056e5ec09cf19849629dff0ae94def60030 not found: ID does not exist" containerID="7ee45dd9ec01dc1ff8d555b554472056e5ec09cf19849629dff0ae94def60030" Jan 22 14:35:50 crc kubenswrapper[4743]: I0122 14:35:50.006407 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ee45dd9ec01dc1ff8d555b554472056e5ec09cf19849629dff0ae94def60030"} err="failed to get container status \"7ee45dd9ec01dc1ff8d555b554472056e5ec09cf19849629dff0ae94def60030\": rpc error: code = NotFound desc = could not find container \"7ee45dd9ec01dc1ff8d555b554472056e5ec09cf19849629dff0ae94def60030\": container with ID starting with 7ee45dd9ec01dc1ff8d555b554472056e5ec09cf19849629dff0ae94def60030 not found: ID does not exist" Jan 22 14:35:51 crc kubenswrapper[4743]: I0122 14:35:51.759893 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84ea4890-26cb-4257-acad-40f72d6dc19e" path="/var/lib/kubelet/pods/84ea4890-26cb-4257-acad-40f72d6dc19e/volumes" Jan 22 14:35:58 crc kubenswrapper[4743]: I0122 14:35:58.747707 4743 scope.go:117] "RemoveContainer" containerID="08d50f3e8a1fa4764eac2002ae36a086093cd35040b1458273d108d4bfc0e111" Jan 22 14:35:58 crc kubenswrapper[4743]: E0122 14:35:58.748610 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:36:09 crc kubenswrapper[4743]: I0122 14:36:09.747433 4743 scope.go:117] "RemoveContainer" containerID="08d50f3e8a1fa4764eac2002ae36a086093cd35040b1458273d108d4bfc0e111" Jan 22 14:36:09 crc kubenswrapper[4743]: E0122 14:36:09.748363 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:36:22 crc kubenswrapper[4743]: I0122 14:36:22.747465 4743 scope.go:117] "RemoveContainer" containerID="08d50f3e8a1fa4764eac2002ae36a086093cd35040b1458273d108d4bfc0e111" Jan 22 14:36:22 crc 
kubenswrapper[4743]: E0122 14:36:22.748239 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:36:36 crc kubenswrapper[4743]: I0122 14:36:36.748877 4743 scope.go:117] "RemoveContainer" containerID="08d50f3e8a1fa4764eac2002ae36a086093cd35040b1458273d108d4bfc0e111" Jan 22 14:36:36 crc kubenswrapper[4743]: E0122 14:36:36.750188 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:36:48 crc kubenswrapper[4743]: I0122 14:36:48.748061 4743 scope.go:117] "RemoveContainer" containerID="08d50f3e8a1fa4764eac2002ae36a086093cd35040b1458273d108d4bfc0e111" Jan 22 14:36:48 crc kubenswrapper[4743]: E0122 14:36:48.748864 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:37:02 crc kubenswrapper[4743]: I0122 14:37:02.747667 4743 scope.go:117] "RemoveContainer" containerID="08d50f3e8a1fa4764eac2002ae36a086093cd35040b1458273d108d4bfc0e111" Jan 22 14:37:02 crc kubenswrapper[4743]: E0122 14:37:02.748582 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:37:15 crc kubenswrapper[4743]: I0122 14:37:15.747179 4743 scope.go:117] "RemoveContainer" containerID="08d50f3e8a1fa4764eac2002ae36a086093cd35040b1458273d108d4bfc0e111" Jan 22 14:37:15 crc kubenswrapper[4743]: E0122 14:37:15.748020 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:37:26 crc kubenswrapper[4743]: I0122 14:37:26.747647 4743 scope.go:117] "RemoveContainer" containerID="08d50f3e8a1fa4764eac2002ae36a086093cd35040b1458273d108d4bfc0e111" Jan 22 14:37:26 crc kubenswrapper[4743]: E0122 14:37:26.748699 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:37:37 crc kubenswrapper[4743]: I0122 14:37:37.748494 4743 scope.go:117] "RemoveContainer" containerID="08d50f3e8a1fa4764eac2002ae36a086093cd35040b1458273d108d4bfc0e111" Jan 22 14:37:37 crc kubenswrapper[4743]: E0122 14:37:37.749837 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:37:52 crc kubenswrapper[4743]: I0122 14:37:52.748261 4743 scope.go:117] "RemoveContainer" containerID="08d50f3e8a1fa4764eac2002ae36a086093cd35040b1458273d108d4bfc0e111" Jan 22 14:37:52 crc kubenswrapper[4743]: E0122 14:37:52.749743 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:38:04 crc kubenswrapper[4743]: I0122 14:38:04.747918 4743 scope.go:117] "RemoveContainer" containerID="08d50f3e8a1fa4764eac2002ae36a086093cd35040b1458273d108d4bfc0e111" Jan 22 14:38:04 crc kubenswrapper[4743]: E0122 14:38:04.750007 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:38:15 crc kubenswrapper[4743]: I0122 14:38:15.748388 4743 scope.go:117] "RemoveContainer" containerID="08d50f3e8a1fa4764eac2002ae36a086093cd35040b1458273d108d4bfc0e111" Jan 22 14:38:15 crc kubenswrapper[4743]: E0122 14:38:15.749163 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:38:28 crc kubenswrapper[4743]: I0122 14:38:28.747730 4743 scope.go:117] "RemoveContainer" containerID="08d50f3e8a1fa4764eac2002ae36a086093cd35040b1458273d108d4bfc0e111" Jan 22 14:38:28 crc kubenswrapper[4743]: E0122 14:38:28.748536 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:38:40 crc kubenswrapper[4743]: I0122 14:38:40.747396 4743 scope.go:117] "RemoveContainer" containerID="08d50f3e8a1fa4764eac2002ae36a086093cd35040b1458273d108d4bfc0e111" Jan 22 14:38:41 crc kubenswrapper[4743]: I0122 14:38:41.491857 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" event={"ID":"3aeba9ba-3a5a-4885-8540-d295aadb311b","Type":"ContainerStarted","Data":"265e0dfd980847ed3c8f21827c507228156c0b4a6c42a6244b5fa0f74403c2fe"} Jan 22 14:39:31 crc kubenswrapper[4743]: I0122 14:39:31.017951 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ptbb9"] Jan 22 14:39:31 crc kubenswrapper[4743]: E0122 14:39:31.018833 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84ea4890-26cb-4257-acad-40f72d6dc19e" containerName="registry-server" Jan 22 14:39:31 crc kubenswrapper[4743]: I0122 14:39:31.018845 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="84ea4890-26cb-4257-acad-40f72d6dc19e" containerName="registry-server" Jan 22 14:39:31 crc kubenswrapper[4743]: E0122 14:39:31.018858 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84ea4890-26cb-4257-acad-40f72d6dc19e" containerName="extract-content" Jan 22 14:39:31 crc kubenswrapper[4743]: I0122 14:39:31.018865 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="84ea4890-26cb-4257-acad-40f72d6dc19e" containerName="extract-content" Jan 22 14:39:31 crc kubenswrapper[4743]: E0122 14:39:31.018884 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84ea4890-26cb-4257-acad-40f72d6dc19e" containerName="extract-utilities" Jan 22 14:39:31 crc kubenswrapper[4743]: I0122 14:39:31.018892 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="84ea4890-26cb-4257-acad-40f72d6dc19e" containerName="extract-utilities" Jan 22 14:39:31 crc kubenswrapper[4743]: I0122 14:39:31.019074 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="84ea4890-26cb-4257-acad-40f72d6dc19e" containerName="registry-server" Jan 22 14:39:31 crc kubenswrapper[4743]: I0122 14:39:31.021180 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ptbb9" Jan 22 14:39:31 crc kubenswrapper[4743]: I0122 14:39:31.040780 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ptbb9"] Jan 22 14:39:31 crc kubenswrapper[4743]: I0122 14:39:31.060207 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4433d68e-d681-4d2d-9539-4a4a5f1e515f-catalog-content\") pod \"redhat-operators-ptbb9\" (UID: \"4433d68e-d681-4d2d-9539-4a4a5f1e515f\") " pod="openshift-marketplace/redhat-operators-ptbb9" Jan 22 14:39:31 crc kubenswrapper[4743]: I0122 14:39:31.060321 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4433d68e-d681-4d2d-9539-4a4a5f1e515f-utilities\") pod \"redhat-operators-ptbb9\" (UID: \"4433d68e-d681-4d2d-9539-4a4a5f1e515f\") " pod="openshift-marketplace/redhat-operators-ptbb9" Jan 22 14:39:31 crc kubenswrapper[4743]: I0122 14:39:31.060380 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hw5v\" (UniqueName: \"kubernetes.io/projected/4433d68e-d681-4d2d-9539-4a4a5f1e515f-kube-api-access-4hw5v\") pod \"redhat-operators-ptbb9\" (UID: \"4433d68e-d681-4d2d-9539-4a4a5f1e515f\") " pod="openshift-marketplace/redhat-operators-ptbb9" Jan 22 14:39:31 crc kubenswrapper[4743]: I0122 14:39:31.161970 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4433d68e-d681-4d2d-9539-4a4a5f1e515f-utilities\") pod \"redhat-operators-ptbb9\" (UID: \"4433d68e-d681-4d2d-9539-4a4a5f1e515f\") " pod="openshift-marketplace/redhat-operators-ptbb9" Jan 22 14:39:31 crc kubenswrapper[4743]: I0122 14:39:31.162050 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hw5v\" (UniqueName: \"kubernetes.io/projected/4433d68e-d681-4d2d-9539-4a4a5f1e515f-kube-api-access-4hw5v\") pod \"redhat-operators-ptbb9\" (UID: \"4433d68e-d681-4d2d-9539-4a4a5f1e515f\") " pod="openshift-marketplace/redhat-operators-ptbb9" Jan 22 14:39:31 crc kubenswrapper[4743]: I0122 14:39:31.162139 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4433d68e-d681-4d2d-9539-4a4a5f1e515f-catalog-content\") pod \"redhat-operators-ptbb9\" (UID: \"4433d68e-d681-4d2d-9539-4a4a5f1e515f\") " pod="openshift-marketplace/redhat-operators-ptbb9" Jan 22 14:39:31 crc kubenswrapper[4743]: I0122 14:39:31.162497 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4433d68e-d681-4d2d-9539-4a4a5f1e515f-utilities\") pod \"redhat-operators-ptbb9\" (UID: \"4433d68e-d681-4d2d-9539-4a4a5f1e515f\") " pod="openshift-marketplace/redhat-operators-ptbb9" Jan 22 14:39:31 crc kubenswrapper[4743]: I0122 14:39:31.162546 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4433d68e-d681-4d2d-9539-4a4a5f1e515f-catalog-content\") pod \"redhat-operators-ptbb9\" (UID: \"4433d68e-d681-4d2d-9539-4a4a5f1e515f\") " pod="openshift-marketplace/redhat-operators-ptbb9" Jan 22 14:39:31 crc kubenswrapper[4743]: I0122 14:39:31.196975 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-4hw5v\" (UniqueName: \"kubernetes.io/projected/4433d68e-d681-4d2d-9539-4a4a5f1e515f-kube-api-access-4hw5v\") pod \"redhat-operators-ptbb9\" (UID: \"4433d68e-d681-4d2d-9539-4a4a5f1e515f\") " pod="openshift-marketplace/redhat-operators-ptbb9" Jan 22 14:39:31 crc kubenswrapper[4743]: I0122 14:39:31.347745 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ptbb9" Jan 22 14:39:31 crc kubenswrapper[4743]: I0122 14:39:31.832306 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ptbb9"] Jan 22 14:39:32 crc kubenswrapper[4743]: I0122 14:39:32.048124 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ptbb9" event={"ID":"4433d68e-d681-4d2d-9539-4a4a5f1e515f","Type":"ContainerStarted","Data":"3def07263c054ffec2b12dc9d47e58625ae9b17c65b8efc49664ad019ce53375"} Jan 22 14:39:32 crc kubenswrapper[4743]: I0122 14:39:32.048589 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ptbb9" event={"ID":"4433d68e-d681-4d2d-9539-4a4a5f1e515f","Type":"ContainerStarted","Data":"2aadf876d698d3ebfc221975d716ea1294ee2464909df6794176a340ff6f3ce6"} Jan 22 14:39:33 crc kubenswrapper[4743]: I0122 14:39:33.058881 4743 generic.go:334] "Generic (PLEG): container finished" podID="4433d68e-d681-4d2d-9539-4a4a5f1e515f" containerID="3def07263c054ffec2b12dc9d47e58625ae9b17c65b8efc49664ad019ce53375" exitCode=0 Jan 22 14:39:33 crc kubenswrapper[4743]: I0122 14:39:33.058996 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ptbb9" event={"ID":"4433d68e-d681-4d2d-9539-4a4a5f1e515f","Type":"ContainerDied","Data":"3def07263c054ffec2b12dc9d47e58625ae9b17c65b8efc49664ad019ce53375"} Jan 22 14:39:33 crc kubenswrapper[4743]: I0122 14:39:33.062266 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 22 14:39:34 crc kubenswrapper[4743]: I0122 14:39:34.072703 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ptbb9" event={"ID":"4433d68e-d681-4d2d-9539-4a4a5f1e515f","Type":"ContainerStarted","Data":"a640e615f48085181765d38b32f4fe6b88e20400e70d8a429f25c1df89466ff1"} Jan 22 14:39:37 crc kubenswrapper[4743]: I0122 14:39:37.114448 4743 generic.go:334] "Generic (PLEG): container finished" podID="4433d68e-d681-4d2d-9539-4a4a5f1e515f" containerID="a640e615f48085181765d38b32f4fe6b88e20400e70d8a429f25c1df89466ff1" exitCode=0 Jan 22 14:39:37 crc kubenswrapper[4743]: I0122 14:39:37.114523 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ptbb9" event={"ID":"4433d68e-d681-4d2d-9539-4a4a5f1e515f","Type":"ContainerDied","Data":"a640e615f48085181765d38b32f4fe6b88e20400e70d8a429f25c1df89466ff1"} Jan 22 14:39:39 crc kubenswrapper[4743]: I0122 14:39:39.142041 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ptbb9" event={"ID":"4433d68e-d681-4d2d-9539-4a4a5f1e515f","Type":"ContainerStarted","Data":"108bc65faff25725c008837a1b2dc25e9a6a36e33573babf342ea954d280293a"} Jan 22 14:39:39 crc kubenswrapper[4743]: I0122 14:39:39.163502 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ptbb9" podStartSLOduration=4.289952916 podStartE2EDuration="9.16347979s" podCreationTimestamp="2026-01-22 14:39:30 +0000 UTC" firstStartedPulling="2026-01-22 
14:39:33.06188239 +0000 UTC m=+3209.616925543" lastFinishedPulling="2026-01-22 14:39:37.935409254 +0000 UTC m=+3214.490452417" observedRunningTime="2026-01-22 14:39:39.16271262 +0000 UTC m=+3215.717755783" watchObservedRunningTime="2026-01-22 14:39:39.16347979 +0000 UTC m=+3215.718522963" Jan 22 14:39:41 crc kubenswrapper[4743]: I0122 14:39:41.349064 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ptbb9" Jan 22 14:39:41 crc kubenswrapper[4743]: I0122 14:39:41.351567 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ptbb9" Jan 22 14:39:42 crc kubenswrapper[4743]: I0122 14:39:42.403941 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ptbb9" podUID="4433d68e-d681-4d2d-9539-4a4a5f1e515f" containerName="registry-server" probeResult="failure" output=< Jan 22 14:39:42 crc kubenswrapper[4743]: timeout: failed to connect service ":50051" within 1s Jan 22 14:39:42 crc kubenswrapper[4743]: > Jan 22 14:39:51 crc kubenswrapper[4743]: I0122 14:39:51.415601 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ptbb9" Jan 22 14:39:51 crc kubenswrapper[4743]: I0122 14:39:51.477960 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ptbb9" Jan 22 14:39:52 crc kubenswrapper[4743]: I0122 14:39:52.677233 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ptbb9"] Jan 22 14:39:53 crc kubenswrapper[4743]: I0122 14:39:53.280430 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ptbb9" podUID="4433d68e-d681-4d2d-9539-4a4a5f1e515f" containerName="registry-server" containerID="cri-o://108bc65faff25725c008837a1b2dc25e9a6a36e33573babf342ea954d280293a" gracePeriod=2 Jan 22 14:39:53 crc kubenswrapper[4743]: I0122 14:39:53.895354 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ptbb9" Jan 22 14:39:53 crc kubenswrapper[4743]: I0122 14:39:53.955193 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4433d68e-d681-4d2d-9539-4a4a5f1e515f-catalog-content\") pod \"4433d68e-d681-4d2d-9539-4a4a5f1e515f\" (UID: \"4433d68e-d681-4d2d-9539-4a4a5f1e515f\") " Jan 22 14:39:53 crc kubenswrapper[4743]: I0122 14:39:53.955238 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hw5v\" (UniqueName: \"kubernetes.io/projected/4433d68e-d681-4d2d-9539-4a4a5f1e515f-kube-api-access-4hw5v\") pod \"4433d68e-d681-4d2d-9539-4a4a5f1e515f\" (UID: \"4433d68e-d681-4d2d-9539-4a4a5f1e515f\") " Jan 22 14:39:53 crc kubenswrapper[4743]: I0122 14:39:53.955264 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4433d68e-d681-4d2d-9539-4a4a5f1e515f-utilities\") pod \"4433d68e-d681-4d2d-9539-4a4a5f1e515f\" (UID: \"4433d68e-d681-4d2d-9539-4a4a5f1e515f\") " Jan 22 14:39:53 crc kubenswrapper[4743]: I0122 14:39:53.956390 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4433d68e-d681-4d2d-9539-4a4a5f1e515f-utilities" (OuterVolumeSpecName: "utilities") pod "4433d68e-d681-4d2d-9539-4a4a5f1e515f" (UID: "4433d68e-d681-4d2d-9539-4a4a5f1e515f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:39:53 crc kubenswrapper[4743]: I0122 14:39:53.964022 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4433d68e-d681-4d2d-9539-4a4a5f1e515f-kube-api-access-4hw5v" (OuterVolumeSpecName: "kube-api-access-4hw5v") pod "4433d68e-d681-4d2d-9539-4a4a5f1e515f" (UID: "4433d68e-d681-4d2d-9539-4a4a5f1e515f"). InnerVolumeSpecName "kube-api-access-4hw5v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:39:54 crc kubenswrapper[4743]: I0122 14:39:54.057541 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hw5v\" (UniqueName: \"kubernetes.io/projected/4433d68e-d681-4d2d-9539-4a4a5f1e515f-kube-api-access-4hw5v\") on node \"crc\" DevicePath \"\"" Jan 22 14:39:54 crc kubenswrapper[4743]: I0122 14:39:54.057948 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4433d68e-d681-4d2d-9539-4a4a5f1e515f-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 14:39:54 crc kubenswrapper[4743]: I0122 14:39:54.080968 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4433d68e-d681-4d2d-9539-4a4a5f1e515f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4433d68e-d681-4d2d-9539-4a4a5f1e515f" (UID: "4433d68e-d681-4d2d-9539-4a4a5f1e515f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:39:54 crc kubenswrapper[4743]: I0122 14:39:54.159533 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4433d68e-d681-4d2d-9539-4a4a5f1e515f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 14:39:54 crc kubenswrapper[4743]: I0122 14:39:54.292025 4743 generic.go:334] "Generic (PLEG): container finished" podID="4433d68e-d681-4d2d-9539-4a4a5f1e515f" containerID="108bc65faff25725c008837a1b2dc25e9a6a36e33573babf342ea954d280293a" exitCode=0 Jan 22 14:39:54 crc kubenswrapper[4743]: I0122 14:39:54.292082 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ptbb9" event={"ID":"4433d68e-d681-4d2d-9539-4a4a5f1e515f","Type":"ContainerDied","Data":"108bc65faff25725c008837a1b2dc25e9a6a36e33573babf342ea954d280293a"} Jan 22 14:39:54 crc kubenswrapper[4743]: I0122 14:39:54.292153 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ptbb9" event={"ID":"4433d68e-d681-4d2d-9539-4a4a5f1e515f","Type":"ContainerDied","Data":"2aadf876d698d3ebfc221975d716ea1294ee2464909df6794176a340ff6f3ce6"} Jan 22 14:39:54 crc kubenswrapper[4743]: I0122 14:39:54.292184 4743 scope.go:117] "RemoveContainer" containerID="108bc65faff25725c008837a1b2dc25e9a6a36e33573babf342ea954d280293a" Jan 22 14:39:54 crc kubenswrapper[4743]: I0122 14:39:54.292176 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ptbb9" Jan 22 14:39:54 crc kubenswrapper[4743]: I0122 14:39:54.328556 4743 scope.go:117] "RemoveContainer" containerID="a640e615f48085181765d38b32f4fe6b88e20400e70d8a429f25c1df89466ff1" Jan 22 14:39:54 crc kubenswrapper[4743]: I0122 14:39:54.339090 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ptbb9"] Jan 22 14:39:54 crc kubenswrapper[4743]: I0122 14:39:54.357876 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ptbb9"] Jan 22 14:39:54 crc kubenswrapper[4743]: I0122 14:39:54.364230 4743 scope.go:117] "RemoveContainer" containerID="3def07263c054ffec2b12dc9d47e58625ae9b17c65b8efc49664ad019ce53375" Jan 22 14:39:54 crc kubenswrapper[4743]: I0122 14:39:54.411597 4743 scope.go:117] "RemoveContainer" containerID="108bc65faff25725c008837a1b2dc25e9a6a36e33573babf342ea954d280293a" Jan 22 14:39:54 crc kubenswrapper[4743]: E0122 14:39:54.413586 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"108bc65faff25725c008837a1b2dc25e9a6a36e33573babf342ea954d280293a\": container with ID starting with 108bc65faff25725c008837a1b2dc25e9a6a36e33573babf342ea954d280293a not found: ID does not exist" containerID="108bc65faff25725c008837a1b2dc25e9a6a36e33573babf342ea954d280293a" Jan 22 14:39:54 crc kubenswrapper[4743]: I0122 14:39:54.413620 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"108bc65faff25725c008837a1b2dc25e9a6a36e33573babf342ea954d280293a"} err="failed to get container status \"108bc65faff25725c008837a1b2dc25e9a6a36e33573babf342ea954d280293a\": rpc error: code = NotFound desc = could not find container \"108bc65faff25725c008837a1b2dc25e9a6a36e33573babf342ea954d280293a\": container with ID starting with 108bc65faff25725c008837a1b2dc25e9a6a36e33573babf342ea954d280293a not found: ID does not exist" Jan 22 14:39:54 crc 
kubenswrapper[4743]: I0122 14:39:54.413644 4743 scope.go:117] "RemoveContainer" containerID="a640e615f48085181765d38b32f4fe6b88e20400e70d8a429f25c1df89466ff1" Jan 22 14:39:54 crc kubenswrapper[4743]: E0122 14:39:54.414309 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a640e615f48085181765d38b32f4fe6b88e20400e70d8a429f25c1df89466ff1\": container with ID starting with a640e615f48085181765d38b32f4fe6b88e20400e70d8a429f25c1df89466ff1 not found: ID does not exist" containerID="a640e615f48085181765d38b32f4fe6b88e20400e70d8a429f25c1df89466ff1" Jan 22 14:39:54 crc kubenswrapper[4743]: I0122 14:39:54.414344 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a640e615f48085181765d38b32f4fe6b88e20400e70d8a429f25c1df89466ff1"} err="failed to get container status \"a640e615f48085181765d38b32f4fe6b88e20400e70d8a429f25c1df89466ff1\": rpc error: code = NotFound desc = could not find container \"a640e615f48085181765d38b32f4fe6b88e20400e70d8a429f25c1df89466ff1\": container with ID starting with a640e615f48085181765d38b32f4fe6b88e20400e70d8a429f25c1df89466ff1 not found: ID does not exist" Jan 22 14:39:54 crc kubenswrapper[4743]: I0122 14:39:54.414364 4743 scope.go:117] "RemoveContainer" containerID="3def07263c054ffec2b12dc9d47e58625ae9b17c65b8efc49664ad019ce53375" Jan 22 14:39:54 crc kubenswrapper[4743]: E0122 14:39:54.414715 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3def07263c054ffec2b12dc9d47e58625ae9b17c65b8efc49664ad019ce53375\": container with ID starting with 3def07263c054ffec2b12dc9d47e58625ae9b17c65b8efc49664ad019ce53375 not found: ID does not exist" containerID="3def07263c054ffec2b12dc9d47e58625ae9b17c65b8efc49664ad019ce53375" Jan 22 14:39:54 crc kubenswrapper[4743]: I0122 14:39:54.414757 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3def07263c054ffec2b12dc9d47e58625ae9b17c65b8efc49664ad019ce53375"} err="failed to get container status \"3def07263c054ffec2b12dc9d47e58625ae9b17c65b8efc49664ad019ce53375\": rpc error: code = NotFound desc = could not find container \"3def07263c054ffec2b12dc9d47e58625ae9b17c65b8efc49664ad019ce53375\": container with ID starting with 3def07263c054ffec2b12dc9d47e58625ae9b17c65b8efc49664ad019ce53375 not found: ID does not exist" Jan 22 14:39:55 crc kubenswrapper[4743]: I0122 14:39:55.757118 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4433d68e-d681-4d2d-9539-4a4a5f1e515f" path="/var/lib/kubelet/pods/4433d68e-d681-4d2d-9539-4a4a5f1e515f/volumes" Jan 22 14:41:00 crc kubenswrapper[4743]: I0122 14:41:00.049379 4743 patch_prober.go:28] interesting pod/machine-config-daemon-hqgk7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 14:41:00 crc kubenswrapper[4743]: I0122 14:41:00.049985 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 14:41:30 crc kubenswrapper[4743]: I0122 14:41:30.049139 4743 patch_prober.go:28] interesting 
pod/machine-config-daemon-hqgk7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 14:41:30 crc kubenswrapper[4743]: I0122 14:41:30.049608 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 14:42:00 crc kubenswrapper[4743]: I0122 14:42:00.050072 4743 patch_prober.go:28] interesting pod/machine-config-daemon-hqgk7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 14:42:00 crc kubenswrapper[4743]: I0122 14:42:00.050821 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 14:42:00 crc kubenswrapper[4743]: I0122 14:42:00.050892 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" Jan 22 14:42:00 crc kubenswrapper[4743]: I0122 14:42:00.051990 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"265e0dfd980847ed3c8f21827c507228156c0b4a6c42a6244b5fa0f74403c2fe"} pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 22 14:42:00 crc kubenswrapper[4743]: I0122 14:42:00.052099 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerName="machine-config-daemon" containerID="cri-o://265e0dfd980847ed3c8f21827c507228156c0b4a6c42a6244b5fa0f74403c2fe" gracePeriod=600 Jan 22 14:42:00 crc kubenswrapper[4743]: I0122 14:42:00.600236 4743 generic.go:334] "Generic (PLEG): container finished" podID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerID="265e0dfd980847ed3c8f21827c507228156c0b4a6c42a6244b5fa0f74403c2fe" exitCode=0 Jan 22 14:42:00 crc kubenswrapper[4743]: I0122 14:42:00.600553 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" event={"ID":"3aeba9ba-3a5a-4885-8540-d295aadb311b","Type":"ContainerDied","Data":"265e0dfd980847ed3c8f21827c507228156c0b4a6c42a6244b5fa0f74403c2fe"} Jan 22 14:42:00 crc kubenswrapper[4743]: I0122 14:42:00.600581 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" event={"ID":"3aeba9ba-3a5a-4885-8540-d295aadb311b","Type":"ContainerStarted","Data":"25b51fa71f51dca1867d5b32f16ec36e148821f095337178c440ebff508e0294"} Jan 22 14:42:00 crc kubenswrapper[4743]: I0122 14:42:00.600608 4743 scope.go:117] "RemoveContainer" containerID="08d50f3e8a1fa4764eac2002ae36a086093cd35040b1458273d108d4bfc0e111" Jan 22 14:43:03 
crc kubenswrapper[4743]: I0122 14:43:03.449100 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tf6q2"] Jan 22 14:43:03 crc kubenswrapper[4743]: E0122 14:43:03.453555 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4433d68e-d681-4d2d-9539-4a4a5f1e515f" containerName="extract-content" Jan 22 14:43:03 crc kubenswrapper[4743]: I0122 14:43:03.453607 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="4433d68e-d681-4d2d-9539-4a4a5f1e515f" containerName="extract-content" Jan 22 14:43:03 crc kubenswrapper[4743]: E0122 14:43:03.453679 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4433d68e-d681-4d2d-9539-4a4a5f1e515f" containerName="extract-utilities" Jan 22 14:43:03 crc kubenswrapper[4743]: I0122 14:43:03.453693 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="4433d68e-d681-4d2d-9539-4a4a5f1e515f" containerName="extract-utilities" Jan 22 14:43:03 crc kubenswrapper[4743]: E0122 14:43:03.453757 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4433d68e-d681-4d2d-9539-4a4a5f1e515f" containerName="registry-server" Jan 22 14:43:03 crc kubenswrapper[4743]: I0122 14:43:03.453769 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="4433d68e-d681-4d2d-9539-4a4a5f1e515f" containerName="registry-server" Jan 22 14:43:03 crc kubenswrapper[4743]: I0122 14:43:03.455627 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="4433d68e-d681-4d2d-9539-4a4a5f1e515f" containerName="registry-server" Jan 22 14:43:03 crc kubenswrapper[4743]: I0122 14:43:03.463167 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tf6q2" Jan 22 14:43:03 crc kubenswrapper[4743]: I0122 14:43:03.507214 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tf6q2"] Jan 22 14:43:03 crc kubenswrapper[4743]: I0122 14:43:03.562628 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a18d2bd-7a42-4f79-b83f-a2d3a19bebe8-catalog-content\") pod \"community-operators-tf6q2\" (UID: \"5a18d2bd-7a42-4f79-b83f-a2d3a19bebe8\") " pod="openshift-marketplace/community-operators-tf6q2" Jan 22 14:43:03 crc kubenswrapper[4743]: I0122 14:43:03.563275 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a18d2bd-7a42-4f79-b83f-a2d3a19bebe8-utilities\") pod \"community-operators-tf6q2\" (UID: \"5a18d2bd-7a42-4f79-b83f-a2d3a19bebe8\") " pod="openshift-marketplace/community-operators-tf6q2" Jan 22 14:43:03 crc kubenswrapper[4743]: I0122 14:43:03.563370 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9l6tg\" (UniqueName: \"kubernetes.io/projected/5a18d2bd-7a42-4f79-b83f-a2d3a19bebe8-kube-api-access-9l6tg\") pod \"community-operators-tf6q2\" (UID: \"5a18d2bd-7a42-4f79-b83f-a2d3a19bebe8\") " pod="openshift-marketplace/community-operators-tf6q2" Jan 22 14:43:03 crc kubenswrapper[4743]: I0122 14:43:03.665341 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a18d2bd-7a42-4f79-b83f-a2d3a19bebe8-catalog-content\") pod \"community-operators-tf6q2\" (UID: \"5a18d2bd-7a42-4f79-b83f-a2d3a19bebe8\") " 
pod="openshift-marketplace/community-operators-tf6q2" Jan 22 14:43:03 crc kubenswrapper[4743]: I0122 14:43:03.665522 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a18d2bd-7a42-4f79-b83f-a2d3a19bebe8-utilities\") pod \"community-operators-tf6q2\" (UID: \"5a18d2bd-7a42-4f79-b83f-a2d3a19bebe8\") " pod="openshift-marketplace/community-operators-tf6q2" Jan 22 14:43:03 crc kubenswrapper[4743]: I0122 14:43:03.665547 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9l6tg\" (UniqueName: \"kubernetes.io/projected/5a18d2bd-7a42-4f79-b83f-a2d3a19bebe8-kube-api-access-9l6tg\") pod \"community-operators-tf6q2\" (UID: \"5a18d2bd-7a42-4f79-b83f-a2d3a19bebe8\") " pod="openshift-marketplace/community-operators-tf6q2" Jan 22 14:43:03 crc kubenswrapper[4743]: I0122 14:43:03.666218 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a18d2bd-7a42-4f79-b83f-a2d3a19bebe8-catalog-content\") pod \"community-operators-tf6q2\" (UID: \"5a18d2bd-7a42-4f79-b83f-a2d3a19bebe8\") " pod="openshift-marketplace/community-operators-tf6q2" Jan 22 14:43:03 crc kubenswrapper[4743]: I0122 14:43:03.666220 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a18d2bd-7a42-4f79-b83f-a2d3a19bebe8-utilities\") pod \"community-operators-tf6q2\" (UID: \"5a18d2bd-7a42-4f79-b83f-a2d3a19bebe8\") " pod="openshift-marketplace/community-operators-tf6q2" Jan 22 14:43:03 crc kubenswrapper[4743]: I0122 14:43:03.688456 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9l6tg\" (UniqueName: \"kubernetes.io/projected/5a18d2bd-7a42-4f79-b83f-a2d3a19bebe8-kube-api-access-9l6tg\") pod \"community-operators-tf6q2\" (UID: \"5a18d2bd-7a42-4f79-b83f-a2d3a19bebe8\") " pod="openshift-marketplace/community-operators-tf6q2" Jan 22 14:43:03 crc kubenswrapper[4743]: I0122 14:43:03.799746 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tf6q2" Jan 22 14:43:04 crc kubenswrapper[4743]: I0122 14:43:04.413033 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tf6q2"] Jan 22 14:43:05 crc kubenswrapper[4743]: I0122 14:43:05.329846 4743 generic.go:334] "Generic (PLEG): container finished" podID="5a18d2bd-7a42-4f79-b83f-a2d3a19bebe8" containerID="4627042ed2a3c86a757a8cf94cb92b826bec8cdc8cee3ebf8309695c45e6ae39" exitCode=0 Jan 22 14:43:05 crc kubenswrapper[4743]: I0122 14:43:05.330037 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tf6q2" event={"ID":"5a18d2bd-7a42-4f79-b83f-a2d3a19bebe8","Type":"ContainerDied","Data":"4627042ed2a3c86a757a8cf94cb92b826bec8cdc8cee3ebf8309695c45e6ae39"} Jan 22 14:43:05 crc kubenswrapper[4743]: I0122 14:43:05.330382 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tf6q2" event={"ID":"5a18d2bd-7a42-4f79-b83f-a2d3a19bebe8","Type":"ContainerStarted","Data":"8716d873ce016104ee2c81851d99ba0ec669e8b265b81842746bb9e5168bb494"} Jan 22 14:43:06 crc kubenswrapper[4743]: I0122 14:43:06.342537 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tf6q2" event={"ID":"5a18d2bd-7a42-4f79-b83f-a2d3a19bebe8","Type":"ContainerStarted","Data":"c45fb5341f9479d2d235e133c204d1d6d25ce83dae36fdf9316a99a6fb5a774d"} Jan 22 14:43:07 crc kubenswrapper[4743]: I0122 14:43:07.361611 4743 generic.go:334] "Generic (PLEG): container finished" podID="5a18d2bd-7a42-4f79-b83f-a2d3a19bebe8" containerID="c45fb5341f9479d2d235e133c204d1d6d25ce83dae36fdf9316a99a6fb5a774d" exitCode=0 Jan 22 14:43:07 crc kubenswrapper[4743]: I0122 14:43:07.361730 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tf6q2" event={"ID":"5a18d2bd-7a42-4f79-b83f-a2d3a19bebe8","Type":"ContainerDied","Data":"c45fb5341f9479d2d235e133c204d1d6d25ce83dae36fdf9316a99a6fb5a774d"} Jan 22 14:43:08 crc kubenswrapper[4743]: I0122 14:43:08.378323 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tf6q2" event={"ID":"5a18d2bd-7a42-4f79-b83f-a2d3a19bebe8","Type":"ContainerStarted","Data":"a111765305cb49843cabc0420a43fc5c9d710c4c5568edc0c6e610947c04855d"} Jan 22 14:43:08 crc kubenswrapper[4743]: I0122 14:43:08.409371 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tf6q2" podStartSLOduration=2.74513756 podStartE2EDuration="5.409345011s" podCreationTimestamp="2026-01-22 14:43:03 +0000 UTC" firstStartedPulling="2026-01-22 14:43:05.336895436 +0000 UTC m=+3421.891938599" lastFinishedPulling="2026-01-22 14:43:08.001102887 +0000 UTC m=+3424.556146050" observedRunningTime="2026-01-22 14:43:08.403915298 +0000 UTC m=+3424.958958541" watchObservedRunningTime="2026-01-22 14:43:08.409345011 +0000 UTC m=+3424.964388174" Jan 22 14:43:13 crc kubenswrapper[4743]: I0122 14:43:13.800579 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tf6q2" Jan 22 14:43:13 crc kubenswrapper[4743]: I0122 14:43:13.801596 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tf6q2" Jan 22 14:43:13 crc kubenswrapper[4743]: I0122 14:43:13.871297 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-tf6q2" Jan 22 14:43:14 crc kubenswrapper[4743]: I0122 14:43:14.520390 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tf6q2" Jan 22 14:43:14 crc kubenswrapper[4743]: I0122 14:43:14.598699 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tf6q2"] Jan 22 14:43:16 crc kubenswrapper[4743]: I0122 14:43:16.465669 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-tf6q2" podUID="5a18d2bd-7a42-4f79-b83f-a2d3a19bebe8" containerName="registry-server" containerID="cri-o://a111765305cb49843cabc0420a43fc5c9d710c4c5568edc0c6e610947c04855d" gracePeriod=2 Jan 22 14:43:17 crc kubenswrapper[4743]: I0122 14:43:17.488603 4743 generic.go:334] "Generic (PLEG): container finished" podID="5a18d2bd-7a42-4f79-b83f-a2d3a19bebe8" containerID="a111765305cb49843cabc0420a43fc5c9d710c4c5568edc0c6e610947c04855d" exitCode=0 Jan 22 14:43:17 crc kubenswrapper[4743]: I0122 14:43:17.488914 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tf6q2" event={"ID":"5a18d2bd-7a42-4f79-b83f-a2d3a19bebe8","Type":"ContainerDied","Data":"a111765305cb49843cabc0420a43fc5c9d710c4c5568edc0c6e610947c04855d"} Jan 22 14:43:17 crc kubenswrapper[4743]: I0122 14:43:17.773894 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tf6q2" Jan 22 14:43:17 crc kubenswrapper[4743]: I0122 14:43:17.794348 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a18d2bd-7a42-4f79-b83f-a2d3a19bebe8-utilities\") pod \"5a18d2bd-7a42-4f79-b83f-a2d3a19bebe8\" (UID: \"5a18d2bd-7a42-4f79-b83f-a2d3a19bebe8\") " Jan 22 14:43:17 crc kubenswrapper[4743]: I0122 14:43:17.794431 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9l6tg\" (UniqueName: \"kubernetes.io/projected/5a18d2bd-7a42-4f79-b83f-a2d3a19bebe8-kube-api-access-9l6tg\") pod \"5a18d2bd-7a42-4f79-b83f-a2d3a19bebe8\" (UID: \"5a18d2bd-7a42-4f79-b83f-a2d3a19bebe8\") " Jan 22 14:43:17 crc kubenswrapper[4743]: I0122 14:43:17.794498 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a18d2bd-7a42-4f79-b83f-a2d3a19bebe8-catalog-content\") pod \"5a18d2bd-7a42-4f79-b83f-a2d3a19bebe8\" (UID: \"5a18d2bd-7a42-4f79-b83f-a2d3a19bebe8\") " Jan 22 14:43:17 crc kubenswrapper[4743]: I0122 14:43:17.796157 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a18d2bd-7a42-4f79-b83f-a2d3a19bebe8-utilities" (OuterVolumeSpecName: "utilities") pod "5a18d2bd-7a42-4f79-b83f-a2d3a19bebe8" (UID: "5a18d2bd-7a42-4f79-b83f-a2d3a19bebe8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:43:17 crc kubenswrapper[4743]: I0122 14:43:17.836484 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a18d2bd-7a42-4f79-b83f-a2d3a19bebe8-kube-api-access-9l6tg" (OuterVolumeSpecName: "kube-api-access-9l6tg") pod "5a18d2bd-7a42-4f79-b83f-a2d3a19bebe8" (UID: "5a18d2bd-7a42-4f79-b83f-a2d3a19bebe8"). InnerVolumeSpecName "kube-api-access-9l6tg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:43:17 crc kubenswrapper[4743]: I0122 14:43:17.888660 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a18d2bd-7a42-4f79-b83f-a2d3a19bebe8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5a18d2bd-7a42-4f79-b83f-a2d3a19bebe8" (UID: "5a18d2bd-7a42-4f79-b83f-a2d3a19bebe8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:43:17 crc kubenswrapper[4743]: I0122 14:43:17.897336 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a18d2bd-7a42-4f79-b83f-a2d3a19bebe8-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 14:43:17 crc kubenswrapper[4743]: I0122 14:43:17.897389 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9l6tg\" (UniqueName: \"kubernetes.io/projected/5a18d2bd-7a42-4f79-b83f-a2d3a19bebe8-kube-api-access-9l6tg\") on node \"crc\" DevicePath \"\"" Jan 22 14:43:17 crc kubenswrapper[4743]: I0122 14:43:17.897406 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a18d2bd-7a42-4f79-b83f-a2d3a19bebe8-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 14:43:18 crc kubenswrapper[4743]: I0122 14:43:18.504204 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tf6q2" event={"ID":"5a18d2bd-7a42-4f79-b83f-a2d3a19bebe8","Type":"ContainerDied","Data":"8716d873ce016104ee2c81851d99ba0ec669e8b265b81842746bb9e5168bb494"} Jan 22 14:43:18 crc kubenswrapper[4743]: I0122 14:43:18.504302 4743 scope.go:117] "RemoveContainer" containerID="a111765305cb49843cabc0420a43fc5c9d710c4c5568edc0c6e610947c04855d" Jan 22 14:43:18 crc kubenswrapper[4743]: I0122 14:43:18.504313 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tf6q2" Jan 22 14:43:18 crc kubenswrapper[4743]: I0122 14:43:18.549592 4743 scope.go:117] "RemoveContainer" containerID="c45fb5341f9479d2d235e133c204d1d6d25ce83dae36fdf9316a99a6fb5a774d" Jan 22 14:43:18 crc kubenswrapper[4743]: I0122 14:43:18.570241 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tf6q2"] Jan 22 14:43:18 crc kubenswrapper[4743]: I0122 14:43:18.579997 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tf6q2"] Jan 22 14:43:18 crc kubenswrapper[4743]: I0122 14:43:18.596423 4743 scope.go:117] "RemoveContainer" containerID="4627042ed2a3c86a757a8cf94cb92b826bec8cdc8cee3ebf8309695c45e6ae39" Jan 22 14:43:19 crc kubenswrapper[4743]: I0122 14:43:19.765749 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a18d2bd-7a42-4f79-b83f-a2d3a19bebe8" path="/var/lib/kubelet/pods/5a18d2bd-7a42-4f79-b83f-a2d3a19bebe8/volumes" Jan 22 14:43:40 crc kubenswrapper[4743]: I0122 14:43:40.215044 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-thbl9"] Jan 22 14:43:40 crc kubenswrapper[4743]: E0122 14:43:40.215947 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a18d2bd-7a42-4f79-b83f-a2d3a19bebe8" containerName="extract-utilities" Jan 22 14:43:40 crc kubenswrapper[4743]: I0122 14:43:40.215959 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a18d2bd-7a42-4f79-b83f-a2d3a19bebe8" containerName="extract-utilities" Jan 22 14:43:40 crc kubenswrapper[4743]: E0122 14:43:40.215974 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a18d2bd-7a42-4f79-b83f-a2d3a19bebe8" containerName="registry-server" Jan 22 14:43:40 crc kubenswrapper[4743]: I0122 14:43:40.215980 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a18d2bd-7a42-4f79-b83f-a2d3a19bebe8" containerName="registry-server" Jan 22 14:43:40 crc kubenswrapper[4743]: E0122 14:43:40.215996 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a18d2bd-7a42-4f79-b83f-a2d3a19bebe8" containerName="extract-content" Jan 22 14:43:40 crc kubenswrapper[4743]: I0122 14:43:40.216002 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a18d2bd-7a42-4f79-b83f-a2d3a19bebe8" containerName="extract-content" Jan 22 14:43:40 crc kubenswrapper[4743]: I0122 14:43:40.216173 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a18d2bd-7a42-4f79-b83f-a2d3a19bebe8" containerName="registry-server" Jan 22 14:43:40 crc kubenswrapper[4743]: I0122 14:43:40.217360 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-thbl9" Jan 22 14:43:40 crc kubenswrapper[4743]: I0122 14:43:40.241161 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-thbl9"] Jan 22 14:43:40 crc kubenswrapper[4743]: I0122 14:43:40.331254 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrfql\" (UniqueName: \"kubernetes.io/projected/d405f3dc-ef78-4709-bb20-c74955562daa-kube-api-access-mrfql\") pod \"redhat-marketplace-thbl9\" (UID: \"d405f3dc-ef78-4709-bb20-c74955562daa\") " pod="openshift-marketplace/redhat-marketplace-thbl9" Jan 22 14:43:40 crc kubenswrapper[4743]: I0122 14:43:40.331405 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d405f3dc-ef78-4709-bb20-c74955562daa-catalog-content\") pod \"redhat-marketplace-thbl9\" (UID: \"d405f3dc-ef78-4709-bb20-c74955562daa\") " pod="openshift-marketplace/redhat-marketplace-thbl9" Jan 22 14:43:40 crc kubenswrapper[4743]: I0122 14:43:40.331644 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d405f3dc-ef78-4709-bb20-c74955562daa-utilities\") pod \"redhat-marketplace-thbl9\" (UID: \"d405f3dc-ef78-4709-bb20-c74955562daa\") " pod="openshift-marketplace/redhat-marketplace-thbl9" Jan 22 14:43:40 crc kubenswrapper[4743]: I0122 14:43:40.433841 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrfql\" (UniqueName: \"kubernetes.io/projected/d405f3dc-ef78-4709-bb20-c74955562daa-kube-api-access-mrfql\") pod \"redhat-marketplace-thbl9\" (UID: \"d405f3dc-ef78-4709-bb20-c74955562daa\") " pod="openshift-marketplace/redhat-marketplace-thbl9" Jan 22 14:43:40 crc kubenswrapper[4743]: I0122 14:43:40.434034 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d405f3dc-ef78-4709-bb20-c74955562daa-catalog-content\") pod \"redhat-marketplace-thbl9\" (UID: \"d405f3dc-ef78-4709-bb20-c74955562daa\") " pod="openshift-marketplace/redhat-marketplace-thbl9" Jan 22 14:43:40 crc kubenswrapper[4743]: I0122 14:43:40.434156 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d405f3dc-ef78-4709-bb20-c74955562daa-utilities\") pod \"redhat-marketplace-thbl9\" (UID: \"d405f3dc-ef78-4709-bb20-c74955562daa\") " pod="openshift-marketplace/redhat-marketplace-thbl9" Jan 22 14:43:40 crc kubenswrapper[4743]: I0122 14:43:40.434930 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d405f3dc-ef78-4709-bb20-c74955562daa-utilities\") pod \"redhat-marketplace-thbl9\" (UID: \"d405f3dc-ef78-4709-bb20-c74955562daa\") " pod="openshift-marketplace/redhat-marketplace-thbl9" Jan 22 14:43:40 crc kubenswrapper[4743]: I0122 14:43:40.435144 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d405f3dc-ef78-4709-bb20-c74955562daa-catalog-content\") pod \"redhat-marketplace-thbl9\" (UID: \"d405f3dc-ef78-4709-bb20-c74955562daa\") " pod="openshift-marketplace/redhat-marketplace-thbl9" Jan 22 14:43:40 crc kubenswrapper[4743]: I0122 14:43:40.477697 4743 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-mrfql\" (UniqueName: \"kubernetes.io/projected/d405f3dc-ef78-4709-bb20-c74955562daa-kube-api-access-mrfql\") pod \"redhat-marketplace-thbl9\" (UID: \"d405f3dc-ef78-4709-bb20-c74955562daa\") " pod="openshift-marketplace/redhat-marketplace-thbl9" Jan 22 14:43:40 crc kubenswrapper[4743]: I0122 14:43:40.557475 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-thbl9" Jan 22 14:43:41 crc kubenswrapper[4743]: I0122 14:43:41.112998 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-thbl9"] Jan 22 14:43:41 crc kubenswrapper[4743]: I0122 14:43:41.786634 4743 generic.go:334] "Generic (PLEG): container finished" podID="d405f3dc-ef78-4709-bb20-c74955562daa" containerID="a7abfa64b04988bee7ade2744b7734d98eb11fb357459c0e9ddca8ca6619b6ac" exitCode=0 Jan 22 14:43:41 crc kubenswrapper[4743]: I0122 14:43:41.786730 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-thbl9" event={"ID":"d405f3dc-ef78-4709-bb20-c74955562daa","Type":"ContainerDied","Data":"a7abfa64b04988bee7ade2744b7734d98eb11fb357459c0e9ddca8ca6619b6ac"} Jan 22 14:43:41 crc kubenswrapper[4743]: I0122 14:43:41.787100 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-thbl9" event={"ID":"d405f3dc-ef78-4709-bb20-c74955562daa","Type":"ContainerStarted","Data":"3db5d23769900d00edc0900fbeea37333b8358ffc6c55be165ac52430b22f351"} Jan 22 14:43:42 crc kubenswrapper[4743]: I0122 14:43:42.797873 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-thbl9" event={"ID":"d405f3dc-ef78-4709-bb20-c74955562daa","Type":"ContainerStarted","Data":"acaaee71d774d0915f58294d2af5df18387d8556cd83579a8fa7ebfb65cd41f5"} Jan 22 14:43:43 crc kubenswrapper[4743]: I0122 14:43:43.813693 4743 generic.go:334] "Generic (PLEG): container finished" podID="d405f3dc-ef78-4709-bb20-c74955562daa" containerID="acaaee71d774d0915f58294d2af5df18387d8556cd83579a8fa7ebfb65cd41f5" exitCode=0 Jan 22 14:43:43 crc kubenswrapper[4743]: I0122 14:43:43.813755 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-thbl9" event={"ID":"d405f3dc-ef78-4709-bb20-c74955562daa","Type":"ContainerDied","Data":"acaaee71d774d0915f58294d2af5df18387d8556cd83579a8fa7ebfb65cd41f5"} Jan 22 14:43:44 crc kubenswrapper[4743]: I0122 14:43:44.825436 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-thbl9" event={"ID":"d405f3dc-ef78-4709-bb20-c74955562daa","Type":"ContainerStarted","Data":"f02375a535beffce7bd1ceb377a4d61ed347dd57af81bb9de8237e9cc8695b9a"} Jan 22 14:43:44 crc kubenswrapper[4743]: I0122 14:43:44.859684 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-thbl9" podStartSLOduration=2.426647029 podStartE2EDuration="4.859658349s" podCreationTimestamp="2026-01-22 14:43:40 +0000 UTC" firstStartedPulling="2026-01-22 14:43:41.789036401 +0000 UTC m=+3458.344079594" lastFinishedPulling="2026-01-22 14:43:44.222047761 +0000 UTC m=+3460.777090914" observedRunningTime="2026-01-22 14:43:44.845459044 +0000 UTC m=+3461.400502217" watchObservedRunningTime="2026-01-22 14:43:44.859658349 +0000 UTC m=+3461.414701522" Jan 22 14:43:46 crc kubenswrapper[4743]: I0122 14:43:46.844334 4743 generic.go:334] "Generic (PLEG): container finished" 
podID="dca0d9c1-5628-4b93-9696-f9d455c70f31" containerID="1568d211e2187ed9598521851ed4623ff5911d54748b07051f13edcc795b98a6" exitCode=0 Jan 22 14:43:46 crc kubenswrapper[4743]: I0122 14:43:46.844442 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"dca0d9c1-5628-4b93-9696-f9d455c70f31","Type":"ContainerDied","Data":"1568d211e2187ed9598521851ed4623ff5911d54748b07051f13edcc795b98a6"} Jan 22 14:43:48 crc kubenswrapper[4743]: I0122 14:43:48.360046 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Jan 22 14:43:48 crc kubenswrapper[4743]: I0122 14:43:48.513480 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lb4df\" (UniqueName: \"kubernetes.io/projected/dca0d9c1-5628-4b93-9696-f9d455c70f31-kube-api-access-lb4df\") pod \"dca0d9c1-5628-4b93-9696-f9d455c70f31\" (UID: \"dca0d9c1-5628-4b93-9696-f9d455c70f31\") " Jan 22 14:43:48 crc kubenswrapper[4743]: I0122 14:43:48.514946 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"dca0d9c1-5628-4b93-9696-f9d455c70f31\" (UID: \"dca0d9c1-5628-4b93-9696-f9d455c70f31\") " Jan 22 14:43:48 crc kubenswrapper[4743]: I0122 14:43:48.515043 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/dca0d9c1-5628-4b93-9696-f9d455c70f31-ca-certs\") pod \"dca0d9c1-5628-4b93-9696-f9d455c70f31\" (UID: \"dca0d9c1-5628-4b93-9696-f9d455c70f31\") " Jan 22 14:43:48 crc kubenswrapper[4743]: I0122 14:43:48.515082 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dca0d9c1-5628-4b93-9696-f9d455c70f31-config-data\") pod \"dca0d9c1-5628-4b93-9696-f9d455c70f31\" (UID: \"dca0d9c1-5628-4b93-9696-f9d455c70f31\") " Jan 22 14:43:48 crc kubenswrapper[4743]: I0122 14:43:48.515120 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/dca0d9c1-5628-4b93-9696-f9d455c70f31-openstack-config\") pod \"dca0d9c1-5628-4b93-9696-f9d455c70f31\" (UID: \"dca0d9c1-5628-4b93-9696-f9d455c70f31\") " Jan 22 14:43:48 crc kubenswrapper[4743]: I0122 14:43:48.515175 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dca0d9c1-5628-4b93-9696-f9d455c70f31-ssh-key\") pod \"dca0d9c1-5628-4b93-9696-f9d455c70f31\" (UID: \"dca0d9c1-5628-4b93-9696-f9d455c70f31\") " Jan 22 14:43:48 crc kubenswrapper[4743]: I0122 14:43:48.515230 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/dca0d9c1-5628-4b93-9696-f9d455c70f31-test-operator-ephemeral-workdir\") pod \"dca0d9c1-5628-4b93-9696-f9d455c70f31\" (UID: \"dca0d9c1-5628-4b93-9696-f9d455c70f31\") " Jan 22 14:43:48 crc kubenswrapper[4743]: I0122 14:43:48.515383 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/dca0d9c1-5628-4b93-9696-f9d455c70f31-test-operator-ephemeral-temporary\") pod \"dca0d9c1-5628-4b93-9696-f9d455c70f31\" (UID: \"dca0d9c1-5628-4b93-9696-f9d455c70f31\") " Jan 22 14:43:48 crc kubenswrapper[4743]: I0122 14:43:48.515428 4743 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/dca0d9c1-5628-4b93-9696-f9d455c70f31-openstack-config-secret\") pod \"dca0d9c1-5628-4b93-9696-f9d455c70f31\" (UID: \"dca0d9c1-5628-4b93-9696-f9d455c70f31\") " Jan 22 14:43:48 crc kubenswrapper[4743]: I0122 14:43:48.516137 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dca0d9c1-5628-4b93-9696-f9d455c70f31-config-data" (OuterVolumeSpecName: "config-data") pod "dca0d9c1-5628-4b93-9696-f9d455c70f31" (UID: "dca0d9c1-5628-4b93-9696-f9d455c70f31"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:43:48 crc kubenswrapper[4743]: I0122 14:43:48.516715 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dca0d9c1-5628-4b93-9696-f9d455c70f31-config-data\") on node \"crc\" DevicePath \"\"" Jan 22 14:43:48 crc kubenswrapper[4743]: I0122 14:43:48.519177 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dca0d9c1-5628-4b93-9696-f9d455c70f31-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "dca0d9c1-5628-4b93-9696-f9d455c70f31" (UID: "dca0d9c1-5628-4b93-9696-f9d455c70f31"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:43:48 crc kubenswrapper[4743]: I0122 14:43:48.522125 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dca0d9c1-5628-4b93-9696-f9d455c70f31-kube-api-access-lb4df" (OuterVolumeSpecName: "kube-api-access-lb4df") pod "dca0d9c1-5628-4b93-9696-f9d455c70f31" (UID: "dca0d9c1-5628-4b93-9696-f9d455c70f31"). InnerVolumeSpecName "kube-api-access-lb4df". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:43:48 crc kubenswrapper[4743]: I0122 14:43:48.522970 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "test-operator-logs") pod "dca0d9c1-5628-4b93-9696-f9d455c70f31" (UID: "dca0d9c1-5628-4b93-9696-f9d455c70f31"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 22 14:43:48 crc kubenswrapper[4743]: I0122 14:43:48.536014 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dca0d9c1-5628-4b93-9696-f9d455c70f31-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "dca0d9c1-5628-4b93-9696-f9d455c70f31" (UID: "dca0d9c1-5628-4b93-9696-f9d455c70f31"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:43:48 crc kubenswrapper[4743]: I0122 14:43:48.550957 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dca0d9c1-5628-4b93-9696-f9d455c70f31-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "dca0d9c1-5628-4b93-9696-f9d455c70f31" (UID: "dca0d9c1-5628-4b93-9696-f9d455c70f31"). InnerVolumeSpecName "ca-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:43:48 crc kubenswrapper[4743]: I0122 14:43:48.563143 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dca0d9c1-5628-4b93-9696-f9d455c70f31-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "dca0d9c1-5628-4b93-9696-f9d455c70f31" (UID: "dca0d9c1-5628-4b93-9696-f9d455c70f31"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:43:48 crc kubenswrapper[4743]: I0122 14:43:48.574092 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dca0d9c1-5628-4b93-9696-f9d455c70f31-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "dca0d9c1-5628-4b93-9696-f9d455c70f31" (UID: "dca0d9c1-5628-4b93-9696-f9d455c70f31"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:43:48 crc kubenswrapper[4743]: I0122 14:43:48.575930 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dca0d9c1-5628-4b93-9696-f9d455c70f31-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "dca0d9c1-5628-4b93-9696-f9d455c70f31" (UID: "dca0d9c1-5628-4b93-9696-f9d455c70f31"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:43:48 crc kubenswrapper[4743]: I0122 14:43:48.619905 4743 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/dca0d9c1-5628-4b93-9696-f9d455c70f31-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Jan 22 14:43:48 crc kubenswrapper[4743]: I0122 14:43:48.619956 4743 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/dca0d9c1-5628-4b93-9696-f9d455c70f31-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jan 22 14:43:48 crc kubenswrapper[4743]: I0122 14:43:48.619971 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lb4df\" (UniqueName: \"kubernetes.io/projected/dca0d9c1-5628-4b93-9696-f9d455c70f31-kube-api-access-lb4df\") on node \"crc\" DevicePath \"\"" Jan 22 14:43:48 crc kubenswrapper[4743]: I0122 14:43:48.620047 4743 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Jan 22 14:43:48 crc kubenswrapper[4743]: I0122 14:43:48.620061 4743 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/dca0d9c1-5628-4b93-9696-f9d455c70f31-ca-certs\") on node \"crc\" DevicePath \"\"" Jan 22 14:43:48 crc kubenswrapper[4743]: I0122 14:43:48.620071 4743 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/dca0d9c1-5628-4b93-9696-f9d455c70f31-openstack-config\") on node \"crc\" DevicePath \"\"" Jan 22 14:43:48 crc kubenswrapper[4743]: I0122 14:43:48.620081 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dca0d9c1-5628-4b93-9696-f9d455c70f31-ssh-key\") on node \"crc\" DevicePath \"\"" Jan 22 14:43:48 crc kubenswrapper[4743]: I0122 14:43:48.625973 4743 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/dca0d9c1-5628-4b93-9696-f9d455c70f31-test-operator-ephemeral-workdir\") on 
node \"crc\" DevicePath \"\"" Jan 22 14:43:48 crc kubenswrapper[4743]: I0122 14:43:48.651238 4743 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Jan 22 14:43:48 crc kubenswrapper[4743]: I0122 14:43:48.728434 4743 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Jan 22 14:43:48 crc kubenswrapper[4743]: I0122 14:43:48.871076 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"dca0d9c1-5628-4b93-9696-f9d455c70f31","Type":"ContainerDied","Data":"5e13e72ced949c2b8ac232dd4fa71a73aa63dba1ceb268e890a7eea4c9e7b254"} Jan 22 14:43:48 crc kubenswrapper[4743]: I0122 14:43:48.871118 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e13e72ced949c2b8ac232dd4fa71a73aa63dba1ceb268e890a7eea4c9e7b254" Jan 22 14:43:48 crc kubenswrapper[4743]: I0122 14:43:48.871246 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Jan 22 14:43:50 crc kubenswrapper[4743]: I0122 14:43:50.557920 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-thbl9" Jan 22 14:43:50 crc kubenswrapper[4743]: I0122 14:43:50.558291 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-thbl9" Jan 22 14:43:50 crc kubenswrapper[4743]: I0122 14:43:50.607838 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-thbl9" Jan 22 14:43:50 crc kubenswrapper[4743]: I0122 14:43:50.960857 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-thbl9" Jan 22 14:43:52 crc kubenswrapper[4743]: I0122 14:43:52.400151 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-thbl9"] Jan 22 14:43:52 crc kubenswrapper[4743]: I0122 14:43:52.922065 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-thbl9" podUID="d405f3dc-ef78-4709-bb20-c74955562daa" containerName="registry-server" containerID="cri-o://f02375a535beffce7bd1ceb377a4d61ed347dd57af81bb9de8237e9cc8695b9a" gracePeriod=2 Jan 22 14:43:53 crc kubenswrapper[4743]: I0122 14:43:53.392646 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-thbl9" Jan 22 14:43:53 crc kubenswrapper[4743]: I0122 14:43:53.540352 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d405f3dc-ef78-4709-bb20-c74955562daa-utilities\") pod \"d405f3dc-ef78-4709-bb20-c74955562daa\" (UID: \"d405f3dc-ef78-4709-bb20-c74955562daa\") " Jan 22 14:43:53 crc kubenswrapper[4743]: I0122 14:43:53.540489 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrfql\" (UniqueName: \"kubernetes.io/projected/d405f3dc-ef78-4709-bb20-c74955562daa-kube-api-access-mrfql\") pod \"d405f3dc-ef78-4709-bb20-c74955562daa\" (UID: \"d405f3dc-ef78-4709-bb20-c74955562daa\") " Jan 22 14:43:53 crc kubenswrapper[4743]: I0122 14:43:53.540551 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d405f3dc-ef78-4709-bb20-c74955562daa-catalog-content\") pod \"d405f3dc-ef78-4709-bb20-c74955562daa\" (UID: \"d405f3dc-ef78-4709-bb20-c74955562daa\") " Jan 22 14:43:53 crc kubenswrapper[4743]: I0122 14:43:53.542461 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d405f3dc-ef78-4709-bb20-c74955562daa-utilities" (OuterVolumeSpecName: "utilities") pod "d405f3dc-ef78-4709-bb20-c74955562daa" (UID: "d405f3dc-ef78-4709-bb20-c74955562daa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:43:53 crc kubenswrapper[4743]: I0122 14:43:53.551007 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d405f3dc-ef78-4709-bb20-c74955562daa-kube-api-access-mrfql" (OuterVolumeSpecName: "kube-api-access-mrfql") pod "d405f3dc-ef78-4709-bb20-c74955562daa" (UID: "d405f3dc-ef78-4709-bb20-c74955562daa"). InnerVolumeSpecName "kube-api-access-mrfql". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:43:53 crc kubenswrapper[4743]: I0122 14:43:53.567775 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d405f3dc-ef78-4709-bb20-c74955562daa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d405f3dc-ef78-4709-bb20-c74955562daa" (UID: "d405f3dc-ef78-4709-bb20-c74955562daa"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:43:53 crc kubenswrapper[4743]: I0122 14:43:53.644212 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrfql\" (UniqueName: \"kubernetes.io/projected/d405f3dc-ef78-4709-bb20-c74955562daa-kube-api-access-mrfql\") on node \"crc\" DevicePath \"\"" Jan 22 14:43:53 crc kubenswrapper[4743]: I0122 14:43:53.644258 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d405f3dc-ef78-4709-bb20-c74955562daa-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 14:43:53 crc kubenswrapper[4743]: I0122 14:43:53.644271 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d405f3dc-ef78-4709-bb20-c74955562daa-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 14:43:53 crc kubenswrapper[4743]: I0122 14:43:53.932842 4743 generic.go:334] "Generic (PLEG): container finished" podID="d405f3dc-ef78-4709-bb20-c74955562daa" containerID="f02375a535beffce7bd1ceb377a4d61ed347dd57af81bb9de8237e9cc8695b9a" exitCode=0 Jan 22 14:43:53 crc kubenswrapper[4743]: I0122 14:43:53.932906 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-thbl9" event={"ID":"d405f3dc-ef78-4709-bb20-c74955562daa","Type":"ContainerDied","Data":"f02375a535beffce7bd1ceb377a4d61ed347dd57af81bb9de8237e9cc8695b9a"} Jan 22 14:43:53 crc kubenswrapper[4743]: I0122 14:43:53.932948 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-thbl9" event={"ID":"d405f3dc-ef78-4709-bb20-c74955562daa","Type":"ContainerDied","Data":"3db5d23769900d00edc0900fbeea37333b8358ffc6c55be165ac52430b22f351"} Jan 22 14:43:53 crc kubenswrapper[4743]: I0122 14:43:53.932956 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-thbl9" Jan 22 14:43:53 crc kubenswrapper[4743]: I0122 14:43:53.932989 4743 scope.go:117] "RemoveContainer" containerID="f02375a535beffce7bd1ceb377a4d61ed347dd57af81bb9de8237e9cc8695b9a" Jan 22 14:43:53 crc kubenswrapper[4743]: I0122 14:43:53.968383 4743 scope.go:117] "RemoveContainer" containerID="acaaee71d774d0915f58294d2af5df18387d8556cd83579a8fa7ebfb65cd41f5" Jan 22 14:43:53 crc kubenswrapper[4743]: I0122 14:43:53.987420 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-thbl9"] Jan 22 14:43:53 crc kubenswrapper[4743]: I0122 14:43:53.996306 4743 scope.go:117] "RemoveContainer" containerID="a7abfa64b04988bee7ade2744b7734d98eb11fb357459c0e9ddca8ca6619b6ac" Jan 22 14:43:53 crc kubenswrapper[4743]: I0122 14:43:53.998971 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-thbl9"] Jan 22 14:43:54 crc kubenswrapper[4743]: I0122 14:43:54.040981 4743 scope.go:117] "RemoveContainer" containerID="f02375a535beffce7bd1ceb377a4d61ed347dd57af81bb9de8237e9cc8695b9a" Jan 22 14:43:54 crc kubenswrapper[4743]: E0122 14:43:54.041429 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f02375a535beffce7bd1ceb377a4d61ed347dd57af81bb9de8237e9cc8695b9a\": container with ID starting with f02375a535beffce7bd1ceb377a4d61ed347dd57af81bb9de8237e9cc8695b9a not found: ID does not exist" containerID="f02375a535beffce7bd1ceb377a4d61ed347dd57af81bb9de8237e9cc8695b9a" Jan 22 14:43:54 crc kubenswrapper[4743]: I0122 14:43:54.041469 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f02375a535beffce7bd1ceb377a4d61ed347dd57af81bb9de8237e9cc8695b9a"} err="failed to get container status \"f02375a535beffce7bd1ceb377a4d61ed347dd57af81bb9de8237e9cc8695b9a\": rpc error: code = NotFound desc = could not find container \"f02375a535beffce7bd1ceb377a4d61ed347dd57af81bb9de8237e9cc8695b9a\": container with ID starting with f02375a535beffce7bd1ceb377a4d61ed347dd57af81bb9de8237e9cc8695b9a not found: ID does not exist" Jan 22 14:43:54 crc kubenswrapper[4743]: I0122 14:43:54.041495 4743 scope.go:117] "RemoveContainer" containerID="acaaee71d774d0915f58294d2af5df18387d8556cd83579a8fa7ebfb65cd41f5" Jan 22 14:43:54 crc kubenswrapper[4743]: E0122 14:43:54.041985 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acaaee71d774d0915f58294d2af5df18387d8556cd83579a8fa7ebfb65cd41f5\": container with ID starting with acaaee71d774d0915f58294d2af5df18387d8556cd83579a8fa7ebfb65cd41f5 not found: ID does not exist" containerID="acaaee71d774d0915f58294d2af5df18387d8556cd83579a8fa7ebfb65cd41f5" Jan 22 14:43:54 crc kubenswrapper[4743]: I0122 14:43:54.042132 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acaaee71d774d0915f58294d2af5df18387d8556cd83579a8fa7ebfb65cd41f5"} err="failed to get container status \"acaaee71d774d0915f58294d2af5df18387d8556cd83579a8fa7ebfb65cd41f5\": rpc error: code = NotFound desc = could not find container \"acaaee71d774d0915f58294d2af5df18387d8556cd83579a8fa7ebfb65cd41f5\": container with ID starting with acaaee71d774d0915f58294d2af5df18387d8556cd83579a8fa7ebfb65cd41f5 not found: ID does not exist" Jan 22 14:43:54 crc kubenswrapper[4743]: I0122 14:43:54.042263 4743 scope.go:117] "RemoveContainer" 
containerID="a7abfa64b04988bee7ade2744b7734d98eb11fb357459c0e9ddca8ca6619b6ac" Jan 22 14:43:54 crc kubenswrapper[4743]: E0122 14:43:54.042754 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7abfa64b04988bee7ade2744b7734d98eb11fb357459c0e9ddca8ca6619b6ac\": container with ID starting with a7abfa64b04988bee7ade2744b7734d98eb11fb357459c0e9ddca8ca6619b6ac not found: ID does not exist" containerID="a7abfa64b04988bee7ade2744b7734d98eb11fb357459c0e9ddca8ca6619b6ac" Jan 22 14:43:54 crc kubenswrapper[4743]: I0122 14:43:54.042864 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7abfa64b04988bee7ade2744b7734d98eb11fb357459c0e9ddca8ca6619b6ac"} err="failed to get container status \"a7abfa64b04988bee7ade2744b7734d98eb11fb357459c0e9ddca8ca6619b6ac\": rpc error: code = NotFound desc = could not find container \"a7abfa64b04988bee7ade2744b7734d98eb11fb357459c0e9ddca8ca6619b6ac\": container with ID starting with a7abfa64b04988bee7ade2744b7734d98eb11fb357459c0e9ddca8ca6619b6ac not found: ID does not exist" Jan 22 14:43:55 crc kubenswrapper[4743]: I0122 14:43:55.760349 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d405f3dc-ef78-4709-bb20-c74955562daa" path="/var/lib/kubelet/pods/d405f3dc-ef78-4709-bb20-c74955562daa/volumes" Jan 22 14:44:00 crc kubenswrapper[4743]: I0122 14:44:00.049075 4743 patch_prober.go:28] interesting pod/machine-config-daemon-hqgk7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 14:44:00 crc kubenswrapper[4743]: I0122 14:44:00.050161 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 14:44:00 crc kubenswrapper[4743]: I0122 14:44:00.625230 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Jan 22 14:44:00 crc kubenswrapper[4743]: E0122 14:44:00.625756 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d405f3dc-ef78-4709-bb20-c74955562daa" containerName="registry-server" Jan 22 14:44:00 crc kubenswrapper[4743]: I0122 14:44:00.625776 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d405f3dc-ef78-4709-bb20-c74955562daa" containerName="registry-server" Jan 22 14:44:00 crc kubenswrapper[4743]: E0122 14:44:00.625817 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d405f3dc-ef78-4709-bb20-c74955562daa" containerName="extract-content" Jan 22 14:44:00 crc kubenswrapper[4743]: I0122 14:44:00.625824 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d405f3dc-ef78-4709-bb20-c74955562daa" containerName="extract-content" Jan 22 14:44:00 crc kubenswrapper[4743]: E0122 14:44:00.625838 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dca0d9c1-5628-4b93-9696-f9d455c70f31" containerName="tempest-tests-tempest-tests-runner" Jan 22 14:44:00 crc kubenswrapper[4743]: I0122 14:44:00.625846 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="dca0d9c1-5628-4b93-9696-f9d455c70f31" containerName="tempest-tests-tempest-tests-runner" Jan 22 
14:44:00 crc kubenswrapper[4743]: E0122 14:44:00.625858 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d405f3dc-ef78-4709-bb20-c74955562daa" containerName="extract-utilities" Jan 22 14:44:00 crc kubenswrapper[4743]: I0122 14:44:00.625864 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d405f3dc-ef78-4709-bb20-c74955562daa" containerName="extract-utilities" Jan 22 14:44:00 crc kubenswrapper[4743]: I0122 14:44:00.626070 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d405f3dc-ef78-4709-bb20-c74955562daa" containerName="registry-server" Jan 22 14:44:00 crc kubenswrapper[4743]: I0122 14:44:00.626091 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="dca0d9c1-5628-4b93-9696-f9d455c70f31" containerName="tempest-tests-tempest-tests-runner" Jan 22 14:44:00 crc kubenswrapper[4743]: I0122 14:44:00.626725 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 22 14:44:00 crc kubenswrapper[4743]: I0122 14:44:00.630078 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-8whlt" Jan 22 14:44:00 crc kubenswrapper[4743]: I0122 14:44:00.649528 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Jan 22 14:44:00 crc kubenswrapper[4743]: I0122 14:44:00.822387 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7kcg\" (UniqueName: \"kubernetes.io/projected/058ed9a9-b1d8-4e1f-b6b7-ca4bd89b79d7-kube-api-access-j7kcg\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"058ed9a9-b1d8-4e1f-b6b7-ca4bd89b79d7\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 22 14:44:00 crc kubenswrapper[4743]: I0122 14:44:00.823450 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"058ed9a9-b1d8-4e1f-b6b7-ca4bd89b79d7\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 22 14:44:00 crc kubenswrapper[4743]: I0122 14:44:00.925826 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"058ed9a9-b1d8-4e1f-b6b7-ca4bd89b79d7\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 22 14:44:00 crc kubenswrapper[4743]: I0122 14:44:00.926014 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7kcg\" (UniqueName: \"kubernetes.io/projected/058ed9a9-b1d8-4e1f-b6b7-ca4bd89b79d7-kube-api-access-j7kcg\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"058ed9a9-b1d8-4e1f-b6b7-ca4bd89b79d7\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 22 14:44:00 crc kubenswrapper[4743]: I0122 14:44:00.927005 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"058ed9a9-b1d8-4e1f-b6b7-ca4bd89b79d7\") device mount path \"/mnt/openstack/pv06\"" 
pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 22 14:44:00 crc kubenswrapper[4743]: I0122 14:44:00.954434 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7kcg\" (UniqueName: \"kubernetes.io/projected/058ed9a9-b1d8-4e1f-b6b7-ca4bd89b79d7-kube-api-access-j7kcg\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"058ed9a9-b1d8-4e1f-b6b7-ca4bd89b79d7\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 22 14:44:00 crc kubenswrapper[4743]: I0122 14:44:00.957370 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"058ed9a9-b1d8-4e1f-b6b7-ca4bd89b79d7\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 22 14:44:01 crc kubenswrapper[4743]: I0122 14:44:01.251744 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 22 14:44:01 crc kubenswrapper[4743]: I0122 14:44:01.791724 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Jan 22 14:44:02 crc kubenswrapper[4743]: I0122 14:44:02.046934 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"058ed9a9-b1d8-4e1f-b6b7-ca4bd89b79d7","Type":"ContainerStarted","Data":"f8a4f2c76c4cef6e6002621ba820630d7f81fe07723b784b27dac3c2cf4afba6"} Jan 22 14:44:03 crc kubenswrapper[4743]: I0122 14:44:03.066255 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"058ed9a9-b1d8-4e1f-b6b7-ca4bd89b79d7","Type":"ContainerStarted","Data":"175ce154c7ac6d18b174e05977e02f4218356107f1deb74cbf7a3dda63ab75a9"} Jan 22 14:44:03 crc kubenswrapper[4743]: I0122 14:44:03.099316 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.159499952 podStartE2EDuration="3.099279335s" podCreationTimestamp="2026-01-22 14:44:00 +0000 UTC" firstStartedPulling="2026-01-22 14:44:01.799930743 +0000 UTC m=+3478.354973906" lastFinishedPulling="2026-01-22 14:44:02.739710126 +0000 UTC m=+3479.294753289" observedRunningTime="2026-01-22 14:44:03.089350563 +0000 UTC m=+3479.644393766" watchObservedRunningTime="2026-01-22 14:44:03.099279335 +0000 UTC m=+3479.654322538" Jan 22 14:44:26 crc kubenswrapper[4743]: I0122 14:44:26.222358 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7gtw4/must-gather-lvsbh"] Jan 22 14:44:26 crc kubenswrapper[4743]: I0122 14:44:26.225398 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7gtw4/must-gather-lvsbh" Jan 22 14:44:26 crc kubenswrapper[4743]: I0122 14:44:26.228190 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-7gtw4"/"default-dockercfg-szr5m" Jan 22 14:44:26 crc kubenswrapper[4743]: I0122 14:44:26.228819 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-7gtw4"/"kube-root-ca.crt" Jan 22 14:44:26 crc kubenswrapper[4743]: I0122 14:44:26.228935 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-7gtw4"/"openshift-service-ca.crt" Jan 22 14:44:26 crc kubenswrapper[4743]: I0122 14:44:26.242285 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7gtw4/must-gather-lvsbh"] Jan 22 14:44:26 crc kubenswrapper[4743]: I0122 14:44:26.261023 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krv4c\" (UniqueName: \"kubernetes.io/projected/a9762560-529f-41e5-8a82-5840434a3d10-kube-api-access-krv4c\") pod \"must-gather-lvsbh\" (UID: \"a9762560-529f-41e5-8a82-5840434a3d10\") " pod="openshift-must-gather-7gtw4/must-gather-lvsbh" Jan 22 14:44:26 crc kubenswrapper[4743]: I0122 14:44:26.261094 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a9762560-529f-41e5-8a82-5840434a3d10-must-gather-output\") pod \"must-gather-lvsbh\" (UID: \"a9762560-529f-41e5-8a82-5840434a3d10\") " pod="openshift-must-gather-7gtw4/must-gather-lvsbh" Jan 22 14:44:26 crc kubenswrapper[4743]: I0122 14:44:26.362594 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krv4c\" (UniqueName: \"kubernetes.io/projected/a9762560-529f-41e5-8a82-5840434a3d10-kube-api-access-krv4c\") pod \"must-gather-lvsbh\" (UID: \"a9762560-529f-41e5-8a82-5840434a3d10\") " pod="openshift-must-gather-7gtw4/must-gather-lvsbh" Jan 22 14:44:26 crc kubenswrapper[4743]: I0122 14:44:26.362674 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a9762560-529f-41e5-8a82-5840434a3d10-must-gather-output\") pod \"must-gather-lvsbh\" (UID: \"a9762560-529f-41e5-8a82-5840434a3d10\") " pod="openshift-must-gather-7gtw4/must-gather-lvsbh" Jan 22 14:44:26 crc kubenswrapper[4743]: I0122 14:44:26.363150 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a9762560-529f-41e5-8a82-5840434a3d10-must-gather-output\") pod \"must-gather-lvsbh\" (UID: \"a9762560-529f-41e5-8a82-5840434a3d10\") " pod="openshift-must-gather-7gtw4/must-gather-lvsbh" Jan 22 14:44:26 crc kubenswrapper[4743]: I0122 14:44:26.384155 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krv4c\" (UniqueName: \"kubernetes.io/projected/a9762560-529f-41e5-8a82-5840434a3d10-kube-api-access-krv4c\") pod \"must-gather-lvsbh\" (UID: \"a9762560-529f-41e5-8a82-5840434a3d10\") " pod="openshift-must-gather-7gtw4/must-gather-lvsbh" Jan 22 14:44:26 crc kubenswrapper[4743]: I0122 14:44:26.568860 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7gtw4/must-gather-lvsbh" Jan 22 14:44:27 crc kubenswrapper[4743]: I0122 14:44:27.072683 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7gtw4/must-gather-lvsbh"] Jan 22 14:44:27 crc kubenswrapper[4743]: W0122 14:44:27.079471 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda9762560_529f_41e5_8a82_5840434a3d10.slice/crio-31f4c3bf72425a857d2618c1afd3d41795e91f9b125f1aa5417d6e7ba63e1e7d WatchSource:0}: Error finding container 31f4c3bf72425a857d2618c1afd3d41795e91f9b125f1aa5417d6e7ba63e1e7d: Status 404 returned error can't find the container with id 31f4c3bf72425a857d2618c1afd3d41795e91f9b125f1aa5417d6e7ba63e1e7d Jan 22 14:44:27 crc kubenswrapper[4743]: I0122 14:44:27.358748 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7gtw4/must-gather-lvsbh" event={"ID":"a9762560-529f-41e5-8a82-5840434a3d10","Type":"ContainerStarted","Data":"31f4c3bf72425a857d2618c1afd3d41795e91f9b125f1aa5417d6e7ba63e1e7d"} Jan 22 14:44:30 crc kubenswrapper[4743]: I0122 14:44:30.049340 4743 patch_prober.go:28] interesting pod/machine-config-daemon-hqgk7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 14:44:30 crc kubenswrapper[4743]: I0122 14:44:30.049752 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 14:44:34 crc kubenswrapper[4743]: I0122 14:44:34.425403 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7gtw4/must-gather-lvsbh" event={"ID":"a9762560-529f-41e5-8a82-5840434a3d10","Type":"ContainerStarted","Data":"a1527e5aa4fd33ff720a648f23d4e1cd5e3a5f119506cbd680b8478fc3572a8c"} Jan 22 14:44:34 crc kubenswrapper[4743]: I0122 14:44:34.426017 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7gtw4/must-gather-lvsbh" event={"ID":"a9762560-529f-41e5-8a82-5840434a3d10","Type":"ContainerStarted","Data":"7fd173bb1a16d7e9bb1e809168eb6dd56f8c7c2ca5f512296adf040263f85761"} Jan 22 14:44:34 crc kubenswrapper[4743]: I0122 14:44:34.450171 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7gtw4/must-gather-lvsbh" podStartSLOduration=1.8363207049999999 podStartE2EDuration="8.450148232s" podCreationTimestamp="2026-01-22 14:44:26 +0000 UTC" firstStartedPulling="2026-01-22 14:44:27.081901635 +0000 UTC m=+3503.636944808" lastFinishedPulling="2026-01-22 14:44:33.695729182 +0000 UTC m=+3510.250772335" observedRunningTime="2026-01-22 14:44:34.448951881 +0000 UTC m=+3511.003995054" watchObservedRunningTime="2026-01-22 14:44:34.450148232 +0000 UTC m=+3511.005191415" Jan 22 14:44:36 crc kubenswrapper[4743]: E0122 14:44:36.000917 4743 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.53:48970->38.102.83.53:40709: write tcp 38.102.83.53:48970->38.102.83.53:40709: write: broken pipe Jan 22 14:44:37 crc kubenswrapper[4743]: I0122 14:44:37.490299 4743 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-7gtw4/crc-debug-hvcbs"] Jan 22 14:44:37 crc kubenswrapper[4743]: I0122 14:44:37.492109 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7gtw4/crc-debug-hvcbs" Jan 22 14:44:37 crc kubenswrapper[4743]: I0122 14:44:37.527432 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f0139920-2bb3-4f8a-8d3a-db4294122005-host\") pod \"crc-debug-hvcbs\" (UID: \"f0139920-2bb3-4f8a-8d3a-db4294122005\") " pod="openshift-must-gather-7gtw4/crc-debug-hvcbs" Jan 22 14:44:37 crc kubenswrapper[4743]: I0122 14:44:37.527829 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tf25z\" (UniqueName: \"kubernetes.io/projected/f0139920-2bb3-4f8a-8d3a-db4294122005-kube-api-access-tf25z\") pod \"crc-debug-hvcbs\" (UID: \"f0139920-2bb3-4f8a-8d3a-db4294122005\") " pod="openshift-must-gather-7gtw4/crc-debug-hvcbs" Jan 22 14:44:37 crc kubenswrapper[4743]: I0122 14:44:37.630511 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f0139920-2bb3-4f8a-8d3a-db4294122005-host\") pod \"crc-debug-hvcbs\" (UID: \"f0139920-2bb3-4f8a-8d3a-db4294122005\") " pod="openshift-must-gather-7gtw4/crc-debug-hvcbs" Jan 22 14:44:37 crc kubenswrapper[4743]: I0122 14:44:37.630934 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tf25z\" (UniqueName: \"kubernetes.io/projected/f0139920-2bb3-4f8a-8d3a-db4294122005-kube-api-access-tf25z\") pod \"crc-debug-hvcbs\" (UID: \"f0139920-2bb3-4f8a-8d3a-db4294122005\") " pod="openshift-must-gather-7gtw4/crc-debug-hvcbs" Jan 22 14:44:37 crc kubenswrapper[4743]: I0122 14:44:37.630670 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f0139920-2bb3-4f8a-8d3a-db4294122005-host\") pod \"crc-debug-hvcbs\" (UID: \"f0139920-2bb3-4f8a-8d3a-db4294122005\") " pod="openshift-must-gather-7gtw4/crc-debug-hvcbs" Jan 22 14:44:37 crc kubenswrapper[4743]: I0122 14:44:37.655623 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tf25z\" (UniqueName: \"kubernetes.io/projected/f0139920-2bb3-4f8a-8d3a-db4294122005-kube-api-access-tf25z\") pod \"crc-debug-hvcbs\" (UID: \"f0139920-2bb3-4f8a-8d3a-db4294122005\") " pod="openshift-must-gather-7gtw4/crc-debug-hvcbs" Jan 22 14:44:37 crc kubenswrapper[4743]: I0122 14:44:37.811901 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7gtw4/crc-debug-hvcbs" Jan 22 14:44:37 crc kubenswrapper[4743]: I0122 14:44:37.855593 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 22 14:44:38 crc kubenswrapper[4743]: I0122 14:44:38.462966 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7gtw4/crc-debug-hvcbs" event={"ID":"f0139920-2bb3-4f8a-8d3a-db4294122005","Type":"ContainerStarted","Data":"97f3fde350f130dfb5bfab4a94fa83eb0f303ba58e5b0a8e17259cc8b44ee276"} Jan 22 14:44:50 crc kubenswrapper[4743]: I0122 14:44:50.572549 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7gtw4/crc-debug-hvcbs" event={"ID":"f0139920-2bb3-4f8a-8d3a-db4294122005","Type":"ContainerStarted","Data":"73cf73c3dbc23986bf942cf4d091103df7f52bf2501631a7b1239a402710f91f"} Jan 22 14:44:50 crc kubenswrapper[4743]: I0122 14:44:50.593014 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7gtw4/crc-debug-hvcbs" podStartSLOduration=1.7086185189999998 podStartE2EDuration="13.592985041s" podCreationTimestamp="2026-01-22 14:44:37 +0000 UTC" firstStartedPulling="2026-01-22 14:44:37.8553741 +0000 UTC m=+3514.410417263" lastFinishedPulling="2026-01-22 14:44:49.739740622 +0000 UTC m=+3526.294783785" observedRunningTime="2026-01-22 14:44:50.586943391 +0000 UTC m=+3527.141986554" watchObservedRunningTime="2026-01-22 14:44:50.592985041 +0000 UTC m=+3527.148028204" Jan 22 14:45:00 crc kubenswrapper[4743]: I0122 14:45:00.049467 4743 patch_prober.go:28] interesting pod/machine-config-daemon-hqgk7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 14:45:00 crc kubenswrapper[4743]: I0122 14:45:00.049960 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 14:45:00 crc kubenswrapper[4743]: I0122 14:45:00.050016 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" Jan 22 14:45:00 crc kubenswrapper[4743]: I0122 14:45:00.050876 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"25b51fa71f51dca1867d5b32f16ec36e148821f095337178c440ebff508e0294"} pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 22 14:45:00 crc kubenswrapper[4743]: I0122 14:45:00.050953 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerName="machine-config-daemon" containerID="cri-o://25b51fa71f51dca1867d5b32f16ec36e148821f095337178c440ebff508e0294" gracePeriod=600 Jan 22 14:45:00 crc kubenswrapper[4743]: I0122 14:45:00.145576 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29484885-9sgf9"] Jan 22 14:45:00 crc kubenswrapper[4743]: I0122 14:45:00.147179 
4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29484885-9sgf9" Jan 22 14:45:00 crc kubenswrapper[4743]: I0122 14:45:00.149388 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 22 14:45:00 crc kubenswrapper[4743]: I0122 14:45:00.149582 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 22 14:45:00 crc kubenswrapper[4743]: I0122 14:45:00.154388 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29484885-9sgf9"] Jan 22 14:45:00 crc kubenswrapper[4743]: E0122 14:45:00.192100 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:45:00 crc kubenswrapper[4743]: I0122 14:45:00.290085 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c0e6f258-5399-4e7a-a35a-7a8c73832854-config-volume\") pod \"collect-profiles-29484885-9sgf9\" (UID: \"c0e6f258-5399-4e7a-a35a-7a8c73832854\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484885-9sgf9" Jan 22 14:45:00 crc kubenswrapper[4743]: I0122 14:45:00.290521 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c0e6f258-5399-4e7a-a35a-7a8c73832854-secret-volume\") pod \"collect-profiles-29484885-9sgf9\" (UID: \"c0e6f258-5399-4e7a-a35a-7a8c73832854\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484885-9sgf9" Jan 22 14:45:00 crc kubenswrapper[4743]: I0122 14:45:00.290656 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2ns8\" (UniqueName: \"kubernetes.io/projected/c0e6f258-5399-4e7a-a35a-7a8c73832854-kube-api-access-j2ns8\") pod \"collect-profiles-29484885-9sgf9\" (UID: \"c0e6f258-5399-4e7a-a35a-7a8c73832854\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484885-9sgf9" Jan 22 14:45:00 crc kubenswrapper[4743]: I0122 14:45:00.392543 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c0e6f258-5399-4e7a-a35a-7a8c73832854-secret-volume\") pod \"collect-profiles-29484885-9sgf9\" (UID: \"c0e6f258-5399-4e7a-a35a-7a8c73832854\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484885-9sgf9" Jan 22 14:45:00 crc kubenswrapper[4743]: I0122 14:45:00.392913 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2ns8\" (UniqueName: \"kubernetes.io/projected/c0e6f258-5399-4e7a-a35a-7a8c73832854-kube-api-access-j2ns8\") pod \"collect-profiles-29484885-9sgf9\" (UID: \"c0e6f258-5399-4e7a-a35a-7a8c73832854\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484885-9sgf9" Jan 22 14:45:00 crc kubenswrapper[4743]: I0122 14:45:00.393117 4743 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c0e6f258-5399-4e7a-a35a-7a8c73832854-config-volume\") pod \"collect-profiles-29484885-9sgf9\" (UID: \"c0e6f258-5399-4e7a-a35a-7a8c73832854\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484885-9sgf9" Jan 22 14:45:00 crc kubenswrapper[4743]: I0122 14:45:00.393933 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c0e6f258-5399-4e7a-a35a-7a8c73832854-config-volume\") pod \"collect-profiles-29484885-9sgf9\" (UID: \"c0e6f258-5399-4e7a-a35a-7a8c73832854\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484885-9sgf9" Jan 22 14:45:00 crc kubenswrapper[4743]: I0122 14:45:00.404143 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c0e6f258-5399-4e7a-a35a-7a8c73832854-secret-volume\") pod \"collect-profiles-29484885-9sgf9\" (UID: \"c0e6f258-5399-4e7a-a35a-7a8c73832854\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484885-9sgf9" Jan 22 14:45:00 crc kubenswrapper[4743]: I0122 14:45:00.419766 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2ns8\" (UniqueName: \"kubernetes.io/projected/c0e6f258-5399-4e7a-a35a-7a8c73832854-kube-api-access-j2ns8\") pod \"collect-profiles-29484885-9sgf9\" (UID: \"c0e6f258-5399-4e7a-a35a-7a8c73832854\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484885-9sgf9" Jan 22 14:45:00 crc kubenswrapper[4743]: I0122 14:45:00.659225 4743 generic.go:334] "Generic (PLEG): container finished" podID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerID="25b51fa71f51dca1867d5b32f16ec36e148821f095337178c440ebff508e0294" exitCode=0 Jan 22 14:45:00 crc kubenswrapper[4743]: I0122 14:45:00.659290 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" event={"ID":"3aeba9ba-3a5a-4885-8540-d295aadb311b","Type":"ContainerDied","Data":"25b51fa71f51dca1867d5b32f16ec36e148821f095337178c440ebff508e0294"} Jan 22 14:45:00 crc kubenswrapper[4743]: I0122 14:45:00.659349 4743 scope.go:117] "RemoveContainer" containerID="265e0dfd980847ed3c8f21827c507228156c0b4a6c42a6244b5fa0f74403c2fe" Jan 22 14:45:00 crc kubenswrapper[4743]: I0122 14:45:00.660100 4743 scope.go:117] "RemoveContainer" containerID="25b51fa71f51dca1867d5b32f16ec36e148821f095337178c440ebff508e0294" Jan 22 14:45:00 crc kubenswrapper[4743]: E0122 14:45:00.660525 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:45:01 crc kubenswrapper[4743]: I0122 14:45:01.137733 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29484885-9sgf9" Jan 22 14:45:01 crc kubenswrapper[4743]: I0122 14:45:01.596724 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29484885-9sgf9"] Jan 22 14:45:01 crc kubenswrapper[4743]: I0122 14:45:01.673757 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29484885-9sgf9" event={"ID":"c0e6f258-5399-4e7a-a35a-7a8c73832854","Type":"ContainerStarted","Data":"4dd6bd1a7f3a239743a137fac36c8027d7a731d54637655ac4c3657de90e5bdb"} Jan 22 14:45:02 crc kubenswrapper[4743]: I0122 14:45:02.683824 4743 generic.go:334] "Generic (PLEG): container finished" podID="c0e6f258-5399-4e7a-a35a-7a8c73832854" containerID="f839b0e630a8d251af40f47cb6cd423f84c4b21ab42c1ab9b481b02bd49aa4b7" exitCode=0 Jan 22 14:45:02 crc kubenswrapper[4743]: I0122 14:45:02.683928 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29484885-9sgf9" event={"ID":"c0e6f258-5399-4e7a-a35a-7a8c73832854","Type":"ContainerDied","Data":"f839b0e630a8d251af40f47cb6cd423f84c4b21ab42c1ab9b481b02bd49aa4b7"} Jan 22 14:45:04 crc kubenswrapper[4743]: I0122 14:45:04.014125 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29484885-9sgf9" Jan 22 14:45:04 crc kubenswrapper[4743]: I0122 14:45:04.186103 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c0e6f258-5399-4e7a-a35a-7a8c73832854-secret-volume\") pod \"c0e6f258-5399-4e7a-a35a-7a8c73832854\" (UID: \"c0e6f258-5399-4e7a-a35a-7a8c73832854\") " Jan 22 14:45:04 crc kubenswrapper[4743]: I0122 14:45:04.186721 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2ns8\" (UniqueName: \"kubernetes.io/projected/c0e6f258-5399-4e7a-a35a-7a8c73832854-kube-api-access-j2ns8\") pod \"c0e6f258-5399-4e7a-a35a-7a8c73832854\" (UID: \"c0e6f258-5399-4e7a-a35a-7a8c73832854\") " Jan 22 14:45:04 crc kubenswrapper[4743]: I0122 14:45:04.186765 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c0e6f258-5399-4e7a-a35a-7a8c73832854-config-volume\") pod \"c0e6f258-5399-4e7a-a35a-7a8c73832854\" (UID: \"c0e6f258-5399-4e7a-a35a-7a8c73832854\") " Jan 22 14:45:04 crc kubenswrapper[4743]: I0122 14:45:04.187681 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0e6f258-5399-4e7a-a35a-7a8c73832854-config-volume" (OuterVolumeSpecName: "config-volume") pod "c0e6f258-5399-4e7a-a35a-7a8c73832854" (UID: "c0e6f258-5399-4e7a-a35a-7a8c73832854"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:45:04 crc kubenswrapper[4743]: I0122 14:45:04.203503 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0e6f258-5399-4e7a-a35a-7a8c73832854-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c0e6f258-5399-4e7a-a35a-7a8c73832854" (UID: "c0e6f258-5399-4e7a-a35a-7a8c73832854"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:45:04 crc kubenswrapper[4743]: I0122 14:45:04.204009 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0e6f258-5399-4e7a-a35a-7a8c73832854-kube-api-access-j2ns8" (OuterVolumeSpecName: "kube-api-access-j2ns8") pod "c0e6f258-5399-4e7a-a35a-7a8c73832854" (UID: "c0e6f258-5399-4e7a-a35a-7a8c73832854"). InnerVolumeSpecName "kube-api-access-j2ns8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:45:04 crc kubenswrapper[4743]: I0122 14:45:04.289193 4743 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c0e6f258-5399-4e7a-a35a-7a8c73832854-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 22 14:45:04 crc kubenswrapper[4743]: I0122 14:45:04.289229 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2ns8\" (UniqueName: \"kubernetes.io/projected/c0e6f258-5399-4e7a-a35a-7a8c73832854-kube-api-access-j2ns8\") on node \"crc\" DevicePath \"\"" Jan 22 14:45:04 crc kubenswrapper[4743]: I0122 14:45:04.289239 4743 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c0e6f258-5399-4e7a-a35a-7a8c73832854-config-volume\") on node \"crc\" DevicePath \"\"" Jan 22 14:45:04 crc kubenswrapper[4743]: I0122 14:45:04.702598 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29484885-9sgf9" event={"ID":"c0e6f258-5399-4e7a-a35a-7a8c73832854","Type":"ContainerDied","Data":"4dd6bd1a7f3a239743a137fac36c8027d7a731d54637655ac4c3657de90e5bdb"} Jan 22 14:45:04 crc kubenswrapper[4743]: I0122 14:45:04.702664 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4dd6bd1a7f3a239743a137fac36c8027d7a731d54637655ac4c3657de90e5bdb" Jan 22 14:45:04 crc kubenswrapper[4743]: I0122 14:45:04.702758 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29484885-9sgf9" Jan 22 14:45:05 crc kubenswrapper[4743]: I0122 14:45:05.101358 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29484840-d7hn6"] Jan 22 14:45:05 crc kubenswrapper[4743]: I0122 14:45:05.113191 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29484840-d7hn6"] Jan 22 14:45:05 crc kubenswrapper[4743]: I0122 14:45:05.759827 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e831f897-bb92-4058-98de-256be3386b9f" path="/var/lib/kubelet/pods/e831f897-bb92-4058-98de-256be3386b9f/volumes" Jan 22 14:45:12 crc kubenswrapper[4743]: I0122 14:45:12.747850 4743 scope.go:117] "RemoveContainer" containerID="25b51fa71f51dca1867d5b32f16ec36e148821f095337178c440ebff508e0294" Jan 22 14:45:12 crc kubenswrapper[4743]: E0122 14:45:12.748777 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:45:24 crc kubenswrapper[4743]: I0122 14:45:24.748149 4743 scope.go:117] "RemoveContainer" containerID="25b51fa71f51dca1867d5b32f16ec36e148821f095337178c440ebff508e0294" Jan 22 14:45:24 crc kubenswrapper[4743]: E0122 14:45:24.749243 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:45:24 crc kubenswrapper[4743]: I0122 14:45:24.899816 4743 generic.go:334] "Generic (PLEG): container finished" podID="f0139920-2bb3-4f8a-8d3a-db4294122005" containerID="73cf73c3dbc23986bf942cf4d091103df7f52bf2501631a7b1239a402710f91f" exitCode=0 Jan 22 14:45:24 crc kubenswrapper[4743]: I0122 14:45:24.899901 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7gtw4/crc-debug-hvcbs" event={"ID":"f0139920-2bb3-4f8a-8d3a-db4294122005","Type":"ContainerDied","Data":"73cf73c3dbc23986bf942cf4d091103df7f52bf2501631a7b1239a402710f91f"} Jan 22 14:45:26 crc kubenswrapper[4743]: I0122 14:45:26.011369 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7gtw4/crc-debug-hvcbs" Jan 22 14:45:26 crc kubenswrapper[4743]: I0122 14:45:26.056899 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7gtw4/crc-debug-hvcbs"] Jan 22 14:45:26 crc kubenswrapper[4743]: I0122 14:45:26.065335 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7gtw4/crc-debug-hvcbs"] Jan 22 14:45:26 crc kubenswrapper[4743]: I0122 14:45:26.110499 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tf25z\" (UniqueName: \"kubernetes.io/projected/f0139920-2bb3-4f8a-8d3a-db4294122005-kube-api-access-tf25z\") pod \"f0139920-2bb3-4f8a-8d3a-db4294122005\" (UID: \"f0139920-2bb3-4f8a-8d3a-db4294122005\") " Jan 22 14:45:26 crc kubenswrapper[4743]: I0122 14:45:26.110629 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f0139920-2bb3-4f8a-8d3a-db4294122005-host\") pod \"f0139920-2bb3-4f8a-8d3a-db4294122005\" (UID: \"f0139920-2bb3-4f8a-8d3a-db4294122005\") " Jan 22 14:45:26 crc kubenswrapper[4743]: I0122 14:45:26.110740 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f0139920-2bb3-4f8a-8d3a-db4294122005-host" (OuterVolumeSpecName: "host") pod "f0139920-2bb3-4f8a-8d3a-db4294122005" (UID: "f0139920-2bb3-4f8a-8d3a-db4294122005"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 14:45:26 crc kubenswrapper[4743]: I0122 14:45:26.111161 4743 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f0139920-2bb3-4f8a-8d3a-db4294122005-host\") on node \"crc\" DevicePath \"\"" Jan 22 14:45:26 crc kubenswrapper[4743]: I0122 14:45:26.117629 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0139920-2bb3-4f8a-8d3a-db4294122005-kube-api-access-tf25z" (OuterVolumeSpecName: "kube-api-access-tf25z") pod "f0139920-2bb3-4f8a-8d3a-db4294122005" (UID: "f0139920-2bb3-4f8a-8d3a-db4294122005"). InnerVolumeSpecName "kube-api-access-tf25z". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:45:26 crc kubenswrapper[4743]: I0122 14:45:26.212639 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tf25z\" (UniqueName: \"kubernetes.io/projected/f0139920-2bb3-4f8a-8d3a-db4294122005-kube-api-access-tf25z\") on node \"crc\" DevicePath \"\"" Jan 22 14:45:26 crc kubenswrapper[4743]: I0122 14:45:26.935102 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97f3fde350f130dfb5bfab4a94fa83eb0f303ba58e5b0a8e17259cc8b44ee276" Jan 22 14:45:26 crc kubenswrapper[4743]: I0122 14:45:26.935521 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7gtw4/crc-debug-hvcbs" Jan 22 14:45:27 crc kubenswrapper[4743]: I0122 14:45:27.276539 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7gtw4/crc-debug-sph9h"] Jan 22 14:45:27 crc kubenswrapper[4743]: E0122 14:45:27.277435 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0e6f258-5399-4e7a-a35a-7a8c73832854" containerName="collect-profiles" Jan 22 14:45:27 crc kubenswrapper[4743]: I0122 14:45:27.277455 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0e6f258-5399-4e7a-a35a-7a8c73832854" containerName="collect-profiles" Jan 22 14:45:27 crc kubenswrapper[4743]: E0122 14:45:27.277488 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0139920-2bb3-4f8a-8d3a-db4294122005" containerName="container-00" Jan 22 14:45:27 crc kubenswrapper[4743]: I0122 14:45:27.277497 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0139920-2bb3-4f8a-8d3a-db4294122005" containerName="container-00" Jan 22 14:45:27 crc kubenswrapper[4743]: I0122 14:45:27.277751 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0139920-2bb3-4f8a-8d3a-db4294122005" containerName="container-00" Jan 22 14:45:27 crc kubenswrapper[4743]: I0122 14:45:27.277778 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0e6f258-5399-4e7a-a35a-7a8c73832854" containerName="collect-profiles" Jan 22 14:45:27 crc kubenswrapper[4743]: I0122 14:45:27.278490 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7gtw4/crc-debug-sph9h" Jan 22 14:45:27 crc kubenswrapper[4743]: I0122 14:45:27.340570 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8b8nc\" (UniqueName: \"kubernetes.io/projected/eebaa97e-85bb-4314-a4ff-68dcc50a3fa8-kube-api-access-8b8nc\") pod \"crc-debug-sph9h\" (UID: \"eebaa97e-85bb-4314-a4ff-68dcc50a3fa8\") " pod="openshift-must-gather-7gtw4/crc-debug-sph9h" Jan 22 14:45:27 crc kubenswrapper[4743]: I0122 14:45:27.340639 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/eebaa97e-85bb-4314-a4ff-68dcc50a3fa8-host\") pod \"crc-debug-sph9h\" (UID: \"eebaa97e-85bb-4314-a4ff-68dcc50a3fa8\") " pod="openshift-must-gather-7gtw4/crc-debug-sph9h" Jan 22 14:45:27 crc kubenswrapper[4743]: I0122 14:45:27.442331 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8b8nc\" (UniqueName: \"kubernetes.io/projected/eebaa97e-85bb-4314-a4ff-68dcc50a3fa8-kube-api-access-8b8nc\") pod \"crc-debug-sph9h\" (UID: \"eebaa97e-85bb-4314-a4ff-68dcc50a3fa8\") " pod="openshift-must-gather-7gtw4/crc-debug-sph9h" Jan 22 14:45:27 crc kubenswrapper[4743]: I0122 14:45:27.442411 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/eebaa97e-85bb-4314-a4ff-68dcc50a3fa8-host\") pod \"crc-debug-sph9h\" (UID: \"eebaa97e-85bb-4314-a4ff-68dcc50a3fa8\") " pod="openshift-must-gather-7gtw4/crc-debug-sph9h" Jan 22 14:45:27 crc kubenswrapper[4743]: I0122 14:45:27.442581 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/eebaa97e-85bb-4314-a4ff-68dcc50a3fa8-host\") pod \"crc-debug-sph9h\" (UID: \"eebaa97e-85bb-4314-a4ff-68dcc50a3fa8\") " pod="openshift-must-gather-7gtw4/crc-debug-sph9h" Jan 22 14:45:27 crc 
kubenswrapper[4743]: I0122 14:45:27.472669 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8b8nc\" (UniqueName: \"kubernetes.io/projected/eebaa97e-85bb-4314-a4ff-68dcc50a3fa8-kube-api-access-8b8nc\") pod \"crc-debug-sph9h\" (UID: \"eebaa97e-85bb-4314-a4ff-68dcc50a3fa8\") " pod="openshift-must-gather-7gtw4/crc-debug-sph9h" Jan 22 14:45:27 crc kubenswrapper[4743]: I0122 14:45:27.601832 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7gtw4/crc-debug-sph9h" Jan 22 14:45:27 crc kubenswrapper[4743]: I0122 14:45:27.758182 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0139920-2bb3-4f8a-8d3a-db4294122005" path="/var/lib/kubelet/pods/f0139920-2bb3-4f8a-8d3a-db4294122005/volumes" Jan 22 14:45:27 crc kubenswrapper[4743]: I0122 14:45:27.950068 4743 generic.go:334] "Generic (PLEG): container finished" podID="eebaa97e-85bb-4314-a4ff-68dcc50a3fa8" containerID="a3654838f12bef1190cf951bc8e84d38134eec5d23fe0a7b3c4d50cd337a9543" exitCode=0 Jan 22 14:45:27 crc kubenswrapper[4743]: I0122 14:45:27.950133 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7gtw4/crc-debug-sph9h" event={"ID":"eebaa97e-85bb-4314-a4ff-68dcc50a3fa8","Type":"ContainerDied","Data":"a3654838f12bef1190cf951bc8e84d38134eec5d23fe0a7b3c4d50cd337a9543"} Jan 22 14:45:27 crc kubenswrapper[4743]: I0122 14:45:27.950162 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7gtw4/crc-debug-sph9h" event={"ID":"eebaa97e-85bb-4314-a4ff-68dcc50a3fa8","Type":"ContainerStarted","Data":"b3c2d14945eb9beea54f6156ac1317f62a6a6d3c76cf0c088687ed4771d04c8b"} Jan 22 14:45:28 crc kubenswrapper[4743]: I0122 14:45:28.401892 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7gtw4/crc-debug-sph9h"] Jan 22 14:45:28 crc kubenswrapper[4743]: I0122 14:45:28.410028 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7gtw4/crc-debug-sph9h"] Jan 22 14:45:29 crc kubenswrapper[4743]: I0122 14:45:29.059449 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7gtw4/crc-debug-sph9h" Jan 22 14:45:29 crc kubenswrapper[4743]: I0122 14:45:29.177983 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/eebaa97e-85bb-4314-a4ff-68dcc50a3fa8-host\") pod \"eebaa97e-85bb-4314-a4ff-68dcc50a3fa8\" (UID: \"eebaa97e-85bb-4314-a4ff-68dcc50a3fa8\") " Jan 22 14:45:29 crc kubenswrapper[4743]: I0122 14:45:29.178158 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8b8nc\" (UniqueName: \"kubernetes.io/projected/eebaa97e-85bb-4314-a4ff-68dcc50a3fa8-kube-api-access-8b8nc\") pod \"eebaa97e-85bb-4314-a4ff-68dcc50a3fa8\" (UID: \"eebaa97e-85bb-4314-a4ff-68dcc50a3fa8\") " Jan 22 14:45:29 crc kubenswrapper[4743]: I0122 14:45:29.178386 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eebaa97e-85bb-4314-a4ff-68dcc50a3fa8-host" (OuterVolumeSpecName: "host") pod "eebaa97e-85bb-4314-a4ff-68dcc50a3fa8" (UID: "eebaa97e-85bb-4314-a4ff-68dcc50a3fa8"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 14:45:29 crc kubenswrapper[4743]: I0122 14:45:29.178919 4743 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/eebaa97e-85bb-4314-a4ff-68dcc50a3fa8-host\") on node \"crc\" DevicePath \"\"" Jan 22 14:45:29 crc kubenswrapper[4743]: I0122 14:45:29.185109 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eebaa97e-85bb-4314-a4ff-68dcc50a3fa8-kube-api-access-8b8nc" (OuterVolumeSpecName: "kube-api-access-8b8nc") pod "eebaa97e-85bb-4314-a4ff-68dcc50a3fa8" (UID: "eebaa97e-85bb-4314-a4ff-68dcc50a3fa8"). InnerVolumeSpecName "kube-api-access-8b8nc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:45:29 crc kubenswrapper[4743]: I0122 14:45:29.281016 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8b8nc\" (UniqueName: \"kubernetes.io/projected/eebaa97e-85bb-4314-a4ff-68dcc50a3fa8-kube-api-access-8b8nc\") on node \"crc\" DevicePath \"\"" Jan 22 14:45:29 crc kubenswrapper[4743]: I0122 14:45:29.571941 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7gtw4/crc-debug-jbbfs"] Jan 22 14:45:29 crc kubenswrapper[4743]: E0122 14:45:29.572308 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eebaa97e-85bb-4314-a4ff-68dcc50a3fa8" containerName="container-00" Jan 22 14:45:29 crc kubenswrapper[4743]: I0122 14:45:29.572320 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="eebaa97e-85bb-4314-a4ff-68dcc50a3fa8" containerName="container-00" Jan 22 14:45:29 crc kubenswrapper[4743]: I0122 14:45:29.572490 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="eebaa97e-85bb-4314-a4ff-68dcc50a3fa8" containerName="container-00" Jan 22 14:45:29 crc kubenswrapper[4743]: I0122 14:45:29.573077 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7gtw4/crc-debug-jbbfs" Jan 22 14:45:29 crc kubenswrapper[4743]: I0122 14:45:29.687695 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ktlh\" (UniqueName: \"kubernetes.io/projected/8cde97ff-b7cb-453b-8bb2-94d955b5034f-kube-api-access-4ktlh\") pod \"crc-debug-jbbfs\" (UID: \"8cde97ff-b7cb-453b-8bb2-94d955b5034f\") " pod="openshift-must-gather-7gtw4/crc-debug-jbbfs" Jan 22 14:45:29 crc kubenswrapper[4743]: I0122 14:45:29.687754 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8cde97ff-b7cb-453b-8bb2-94d955b5034f-host\") pod \"crc-debug-jbbfs\" (UID: \"8cde97ff-b7cb-453b-8bb2-94d955b5034f\") " pod="openshift-must-gather-7gtw4/crc-debug-jbbfs" Jan 22 14:45:29 crc kubenswrapper[4743]: I0122 14:45:29.759991 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eebaa97e-85bb-4314-a4ff-68dcc50a3fa8" path="/var/lib/kubelet/pods/eebaa97e-85bb-4314-a4ff-68dcc50a3fa8/volumes" Jan 22 14:45:29 crc kubenswrapper[4743]: I0122 14:45:29.790405 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ktlh\" (UniqueName: \"kubernetes.io/projected/8cde97ff-b7cb-453b-8bb2-94d955b5034f-kube-api-access-4ktlh\") pod \"crc-debug-jbbfs\" (UID: \"8cde97ff-b7cb-453b-8bb2-94d955b5034f\") " pod="openshift-must-gather-7gtw4/crc-debug-jbbfs" Jan 22 14:45:29 crc kubenswrapper[4743]: I0122 14:45:29.791240 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8cde97ff-b7cb-453b-8bb2-94d955b5034f-host\") pod \"crc-debug-jbbfs\" (UID: \"8cde97ff-b7cb-453b-8bb2-94d955b5034f\") " pod="openshift-must-gather-7gtw4/crc-debug-jbbfs" Jan 22 14:45:29 crc kubenswrapper[4743]: I0122 14:45:29.791328 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8cde97ff-b7cb-453b-8bb2-94d955b5034f-host\") pod \"crc-debug-jbbfs\" (UID: \"8cde97ff-b7cb-453b-8bb2-94d955b5034f\") " pod="openshift-must-gather-7gtw4/crc-debug-jbbfs" Jan 22 14:45:29 crc kubenswrapper[4743]: I0122 14:45:29.817954 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ktlh\" (UniqueName: \"kubernetes.io/projected/8cde97ff-b7cb-453b-8bb2-94d955b5034f-kube-api-access-4ktlh\") pod \"crc-debug-jbbfs\" (UID: \"8cde97ff-b7cb-453b-8bb2-94d955b5034f\") " pod="openshift-must-gather-7gtw4/crc-debug-jbbfs" Jan 22 14:45:29 crc kubenswrapper[4743]: I0122 14:45:29.888184 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7gtw4/crc-debug-jbbfs" Jan 22 14:45:29 crc kubenswrapper[4743]: W0122 14:45:29.934690 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8cde97ff_b7cb_453b_8bb2_94d955b5034f.slice/crio-9a86f07e24a25caba77f64acff27f4ca98f980f08a4470b81645eb84ad767a22 WatchSource:0}: Error finding container 9a86f07e24a25caba77f64acff27f4ca98f980f08a4470b81645eb84ad767a22: Status 404 returned error can't find the container with id 9a86f07e24a25caba77f64acff27f4ca98f980f08a4470b81645eb84ad767a22 Jan 22 14:45:29 crc kubenswrapper[4743]: I0122 14:45:29.970694 4743 scope.go:117] "RemoveContainer" containerID="a3654838f12bef1190cf951bc8e84d38134eec5d23fe0a7b3c4d50cd337a9543" Jan 22 14:45:29 crc kubenswrapper[4743]: I0122 14:45:29.970737 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7gtw4/crc-debug-sph9h" Jan 22 14:45:29 crc kubenswrapper[4743]: I0122 14:45:29.975614 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7gtw4/crc-debug-jbbfs" event={"ID":"8cde97ff-b7cb-453b-8bb2-94d955b5034f","Type":"ContainerStarted","Data":"9a86f07e24a25caba77f64acff27f4ca98f980f08a4470b81645eb84ad767a22"} Jan 22 14:45:30 crc kubenswrapper[4743]: I0122 14:45:30.987143 4743 generic.go:334] "Generic (PLEG): container finished" podID="8cde97ff-b7cb-453b-8bb2-94d955b5034f" containerID="073f5802ad2e942bfeab8cd3a529f061b078a8595f17ffdb0c513849167f7642" exitCode=0 Jan 22 14:45:30 crc kubenswrapper[4743]: I0122 14:45:30.987250 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7gtw4/crc-debug-jbbfs" event={"ID":"8cde97ff-b7cb-453b-8bb2-94d955b5034f","Type":"ContainerDied","Data":"073f5802ad2e942bfeab8cd3a529f061b078a8595f17ffdb0c513849167f7642"} Jan 22 14:45:31 crc kubenswrapper[4743]: I0122 14:45:31.031896 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7gtw4/crc-debug-jbbfs"] Jan 22 14:45:31 crc kubenswrapper[4743]: I0122 14:45:31.041283 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7gtw4/crc-debug-jbbfs"] Jan 22 14:45:32 crc kubenswrapper[4743]: I0122 14:45:32.106367 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7gtw4/crc-debug-jbbfs" Jan 22 14:45:32 crc kubenswrapper[4743]: I0122 14:45:32.235042 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ktlh\" (UniqueName: \"kubernetes.io/projected/8cde97ff-b7cb-453b-8bb2-94d955b5034f-kube-api-access-4ktlh\") pod \"8cde97ff-b7cb-453b-8bb2-94d955b5034f\" (UID: \"8cde97ff-b7cb-453b-8bb2-94d955b5034f\") " Jan 22 14:45:32 crc kubenswrapper[4743]: I0122 14:45:32.235193 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8cde97ff-b7cb-453b-8bb2-94d955b5034f-host\") pod \"8cde97ff-b7cb-453b-8bb2-94d955b5034f\" (UID: \"8cde97ff-b7cb-453b-8bb2-94d955b5034f\") " Jan 22 14:45:32 crc kubenswrapper[4743]: I0122 14:45:32.235308 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8cde97ff-b7cb-453b-8bb2-94d955b5034f-host" (OuterVolumeSpecName: "host") pod "8cde97ff-b7cb-453b-8bb2-94d955b5034f" (UID: "8cde97ff-b7cb-453b-8bb2-94d955b5034f"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 14:45:32 crc kubenswrapper[4743]: I0122 14:45:32.235686 4743 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8cde97ff-b7cb-453b-8bb2-94d955b5034f-host\") on node \"crc\" DevicePath \"\"" Jan 22 14:45:32 crc kubenswrapper[4743]: I0122 14:45:32.260652 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cde97ff-b7cb-453b-8bb2-94d955b5034f-kube-api-access-4ktlh" (OuterVolumeSpecName: "kube-api-access-4ktlh") pod "8cde97ff-b7cb-453b-8bb2-94d955b5034f" (UID: "8cde97ff-b7cb-453b-8bb2-94d955b5034f"). InnerVolumeSpecName "kube-api-access-4ktlh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:45:32 crc kubenswrapper[4743]: I0122 14:45:32.336963 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ktlh\" (UniqueName: \"kubernetes.io/projected/8cde97ff-b7cb-453b-8bb2-94d955b5034f-kube-api-access-4ktlh\") on node \"crc\" DevicePath \"\"" Jan 22 14:45:33 crc kubenswrapper[4743]: I0122 14:45:33.014668 4743 scope.go:117] "RemoveContainer" containerID="073f5802ad2e942bfeab8cd3a529f061b078a8595f17ffdb0c513849167f7642" Jan 22 14:45:33 crc kubenswrapper[4743]: I0122 14:45:33.014739 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7gtw4/crc-debug-jbbfs" Jan 22 14:45:33 crc kubenswrapper[4743]: I0122 14:45:33.761528 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cde97ff-b7cb-453b-8bb2-94d955b5034f" path="/var/lib/kubelet/pods/8cde97ff-b7cb-453b-8bb2-94d955b5034f/volumes" Jan 22 14:45:35 crc kubenswrapper[4743]: I0122 14:45:35.463080 4743 scope.go:117] "RemoveContainer" containerID="3df41680a87ee87067e2b60fbe46d5e4249f256f97129553264ba47462e1e5fa" Jan 22 14:45:35 crc kubenswrapper[4743]: I0122 14:45:35.747610 4743 scope.go:117] "RemoveContainer" containerID="25b51fa71f51dca1867d5b32f16ec36e148821f095337178c440ebff508e0294" Jan 22 14:45:35 crc kubenswrapper[4743]: E0122 14:45:35.748292 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:45:45 crc kubenswrapper[4743]: I0122 14:45:45.682188 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-64fcd75458-9rzfr_c4db7649-d1b0-47c2-b5e4-34a552ccee79/barbican-api/0.log" Jan 22 14:45:45 crc kubenswrapper[4743]: I0122 14:45:45.713874 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-64fcd75458-9rzfr_c4db7649-d1b0-47c2-b5e4-34a552ccee79/barbican-api-log/0.log" Jan 22 14:45:45 crc kubenswrapper[4743]: I0122 14:45:45.858930 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6c88b6769d-nzzc6_a84fcd7a-0eac-4d23-832e-e632bd4f971f/barbican-keystone-listener/0.log" Jan 22 14:45:45 crc kubenswrapper[4743]: I0122 14:45:45.922933 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6c88b6769d-nzzc6_a84fcd7a-0eac-4d23-832e-e632bd4f971f/barbican-keystone-listener-log/0.log" Jan 22 14:45:45 crc kubenswrapper[4743]: I0122 14:45:45.981762 4743 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-8448f7b79-pndf8_f254cb75-db18-488e-886f-544f0b8a8516/barbican-worker/0.log" Jan 22 14:45:46 crc kubenswrapper[4743]: I0122 14:45:46.071774 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-8448f7b79-pndf8_f254cb75-db18-488e-886f-544f0b8a8516/barbican-worker-log/0.log" Jan 22 14:45:46 crc kubenswrapper[4743]: I0122 14:45:46.290298 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-v4vc6_33d8b498-a76a-4549-96c2-f32877beaa30/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Jan 22 14:45:46 crc kubenswrapper[4743]: I0122 14:45:46.410206 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_46f75016-697b-4cac-bc9e-3e2f5e60da77/ceilometer-central-agent/0.log" Jan 22 14:45:46 crc kubenswrapper[4743]: I0122 14:45:46.479399 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_46f75016-697b-4cac-bc9e-3e2f5e60da77/ceilometer-notification-agent/0.log" Jan 22 14:45:46 crc kubenswrapper[4743]: I0122 14:45:46.489160 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_46f75016-697b-4cac-bc9e-3e2f5e60da77/proxy-httpd/0.log" Jan 22 14:45:46 crc kubenswrapper[4743]: I0122 14:45:46.515530 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_46f75016-697b-4cac-bc9e-3e2f5e60da77/sg-core/0.log" Jan 22 14:45:46 crc kubenswrapper[4743]: I0122 14:45:46.632298 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_d5aa29c0-68de-446c-aafd-50080e4adb51/cinder-api-log/0.log" Jan 22 14:45:46 crc kubenswrapper[4743]: I0122 14:45:46.692384 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_d5aa29c0-68de-446c-aafd-50080e4adb51/cinder-api/0.log" Jan 22 14:45:46 crc kubenswrapper[4743]: I0122 14:45:46.747570 4743 scope.go:117] "RemoveContainer" containerID="25b51fa71f51dca1867d5b32f16ec36e148821f095337178c440ebff508e0294" Jan 22 14:45:46 crc kubenswrapper[4743]: E0122 14:45:46.747902 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:45:46 crc kubenswrapper[4743]: I0122 14:45:46.863639 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_ec6e51f6-2808-404d-8cf0-8c8b44c86cb9/probe/0.log" Jan 22 14:45:46 crc kubenswrapper[4743]: I0122 14:45:46.878054 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_ec6e51f6-2808-404d-8cf0-8c8b44c86cb9/cinder-scheduler/0.log" Jan 22 14:45:46 crc kubenswrapper[4743]: I0122 14:45:46.915394 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-xph97_89048557-6c94-40a8-aa26-c9d940743be9/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Jan 22 14:45:47 crc kubenswrapper[4743]: I0122 14:45:47.026665 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-7pmbs_6cf93e6b-adef-48fb-844b-a420be87fd2e/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 22 14:45:47 crc kubenswrapper[4743]: I0122 14:45:47.110919 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-nbpk8_c52cf8e4-1ecd-4882-b076-bacb37f3569e/init/0.log" Jan 22 14:45:47 crc kubenswrapper[4743]: I0122 14:45:47.312667 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-nbpk8_c52cf8e4-1ecd-4882-b076-bacb37f3569e/init/0.log" Jan 22 14:45:47 crc kubenswrapper[4743]: I0122 14:45:47.350635 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-nbpk8_c52cf8e4-1ecd-4882-b076-bacb37f3569e/dnsmasq-dns/0.log" Jan 22 14:45:47 crc kubenswrapper[4743]: I0122 14:45:47.365810 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-w8zbz_6772da1b-97c0-4b18-af50-7723f5dc39b6/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Jan 22 14:45:47 crc kubenswrapper[4743]: I0122 14:45:47.515406 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_5247bc1b-998e-4275-9f4a-d3c30ff488b9/glance-log/0.log" Jan 22 14:45:47 crc kubenswrapper[4743]: I0122 14:45:47.582988 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_5247bc1b-998e-4275-9f4a-d3c30ff488b9/glance-httpd/0.log" Jan 22 14:45:47 crc kubenswrapper[4743]: I0122 14:45:47.703146 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_6ed5a9e1-b17c-46c0-9d94-b3ae86e73acb/glance-httpd/0.log" Jan 22 14:45:47 crc kubenswrapper[4743]: I0122 14:45:47.728528 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_6ed5a9e1-b17c-46c0-9d94-b3ae86e73acb/glance-log/0.log" Jan 22 14:45:47 crc kubenswrapper[4743]: I0122 14:45:47.862810 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-b7fb54dc6-5q9jf_e452af10-fc11-4854-bf38-8a90856331d3/horizon/0.log" Jan 22 14:45:47 crc kubenswrapper[4743]: I0122 14:45:47.980205 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt_beb6ea36-21b3-4658-a609-7ecace4d6efc/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Jan 22 14:45:48 crc kubenswrapper[4743]: I0122 14:45:48.241605 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-b7fb54dc6-5q9jf_e452af10-fc11-4854-bf38-8a90856331d3/horizon-log/0.log" Jan 22 14:45:48 crc kubenswrapper[4743]: I0122 14:45:48.269457 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-z8mwg_9b71a0c2-2cf5-4ed8-a0f9-4966d9095eeb/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 22 14:45:48 crc kubenswrapper[4743]: I0122 14:45:48.465426 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_492b4d6f-25ef-41b4-9aa8-876d9baaaf13/kube-state-metrics/0.log" Jan 22 14:45:48 crc kubenswrapper[4743]: I0122 14:45:48.495805 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-5f47b7b66b-mfhcg_cd3df106-ec34-42ad-bf5d-f963b9bb0871/keystone-api/0.log" Jan 22 14:45:48 crc kubenswrapper[4743]: I0122 14:45:48.664603 4743 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-cvxq2_5dca488a-cb84-4610-bf38-0f4c65c8b94a/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Jan 22 14:45:49 crc kubenswrapper[4743]: I0122 14:45:49.030624 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7dd566fb89-mgkw8_36675c4f-99e7-4cfb-a9c4-22519e8e7d4c/neutron-api/0.log" Jan 22 14:45:49 crc kubenswrapper[4743]: I0122 14:45:49.034730 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7dd566fb89-mgkw8_36675c4f-99e7-4cfb-a9c4-22519e8e7d4c/neutron-httpd/0.log" Jan 22 14:45:49 crc kubenswrapper[4743]: I0122 14:45:49.279414 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-j94s9_eec02bb6-2380-4910-8e5a-1fe3196760a4/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Jan 22 14:45:49 crc kubenswrapper[4743]: I0122 14:45:49.757977 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_188bdbf9-2ed2-427b-99c1-7c435a25a3c6/nova-cell0-conductor-conductor/0.log" Jan 22 14:45:49 crc kubenswrapper[4743]: I0122 14:45:49.845510 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_5e08ea55-209f-4956-b9cb-c261280252ad/nova-api-log/0.log" Jan 22 14:45:50 crc kubenswrapper[4743]: I0122 14:45:49.940320 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_5e08ea55-209f-4956-b9cb-c261280252ad/nova-api-api/0.log" Jan 22 14:45:50 crc kubenswrapper[4743]: I0122 14:45:50.254126 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_3b26ff8e-b36c-47d8-8d74-da49485ec363/nova-cell1-conductor-conductor/0.log" Jan 22 14:45:50 crc kubenswrapper[4743]: I0122 14:45:50.271777 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_41d94ed2-d3ad-4fbc-9ece-a7fb65eab7ac/nova-cell1-novncproxy-novncproxy/0.log" Jan 22 14:45:50 crc kubenswrapper[4743]: I0122 14:45:50.346725 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-jjllf_a546459d-e713-453e-adbd-c3b9f8c7b961/nova-edpm-deployment-openstack-edpm-ipam/0.log" Jan 22 14:45:50 crc kubenswrapper[4743]: I0122 14:45:50.531770 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_fc42f0d6-9224-404d-8584-2c0fec4f3edd/nova-metadata-log/0.log" Jan 22 14:45:50 crc kubenswrapper[4743]: I0122 14:45:50.854647 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_2644f1c9-b50c-4666-a099-ddb8912a53ff/mysql-bootstrap/0.log" Jan 22 14:45:50 crc kubenswrapper[4743]: I0122 14:45:50.885936 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_529c10d9-fb76-4b45-8b08-3d9656bfdcd5/nova-scheduler-scheduler/0.log" Jan 22 14:45:51 crc kubenswrapper[4743]: I0122 14:45:51.030096 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_2644f1c9-b50c-4666-a099-ddb8912a53ff/mysql-bootstrap/0.log" Jan 22 14:45:51 crc kubenswrapper[4743]: I0122 14:45:51.070751 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_2644f1c9-b50c-4666-a099-ddb8912a53ff/galera/0.log" Jan 22 14:45:51 crc kubenswrapper[4743]: I0122 14:45:51.291650 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-galera-0_ee0fd7f6-d02b-4139-9306-8f9c9a1a8dd1/mysql-bootstrap/0.log" Jan 22 14:45:51 crc kubenswrapper[4743]: I0122 14:45:51.463668 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_ee0fd7f6-d02b-4139-9306-8f9c9a1a8dd1/mysql-bootstrap/0.log" Jan 22 14:45:51 crc kubenswrapper[4743]: I0122 14:45:51.522603 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_ee0fd7f6-d02b-4139-9306-8f9c9a1a8dd1/galera/0.log" Jan 22 14:45:51 crc kubenswrapper[4743]: I0122 14:45:51.565287 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-657r9"] Jan 22 14:45:51 crc kubenswrapper[4743]: E0122 14:45:51.565704 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cde97ff-b7cb-453b-8bb2-94d955b5034f" containerName="container-00" Jan 22 14:45:51 crc kubenswrapper[4743]: I0122 14:45:51.565718 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cde97ff-b7cb-453b-8bb2-94d955b5034f" containerName="container-00" Jan 22 14:45:51 crc kubenswrapper[4743]: I0122 14:45:51.566023 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cde97ff-b7cb-453b-8bb2-94d955b5034f" containerName="container-00" Jan 22 14:45:51 crc kubenswrapper[4743]: I0122 14:45:51.567319 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-657r9" Jan 22 14:45:51 crc kubenswrapper[4743]: I0122 14:45:51.578224 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-657r9"] Jan 22 14:45:51 crc kubenswrapper[4743]: I0122 14:45:51.599462 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_fc42f0d6-9224-404d-8584-2c0fec4f3edd/nova-metadata-metadata/0.log" Jan 22 14:45:51 crc kubenswrapper[4743]: I0122 14:45:51.671974 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_41abc04c-e711-4e34-a0b0-085b7b09d94d/openstackclient/0.log" Jan 22 14:45:51 crc kubenswrapper[4743]: I0122 14:45:51.744042 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9f82082-5738-4558-90f0-19031c102c11-catalog-content\") pod \"certified-operators-657r9\" (UID: \"e9f82082-5738-4558-90f0-19031c102c11\") " pod="openshift-marketplace/certified-operators-657r9" Jan 22 14:45:51 crc kubenswrapper[4743]: I0122 14:45:51.744417 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srq5f\" (UniqueName: \"kubernetes.io/projected/e9f82082-5738-4558-90f0-19031c102c11-kube-api-access-srq5f\") pod \"certified-operators-657r9\" (UID: \"e9f82082-5738-4558-90f0-19031c102c11\") " pod="openshift-marketplace/certified-operators-657r9" Jan 22 14:45:51 crc kubenswrapper[4743]: I0122 14:45:51.744558 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9f82082-5738-4558-90f0-19031c102c11-utilities\") pod \"certified-operators-657r9\" (UID: \"e9f82082-5738-4558-90f0-19031c102c11\") " pod="openshift-marketplace/certified-operators-657r9" Jan 22 14:45:51 crc kubenswrapper[4743]: I0122 14:45:51.793077 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-m22h5_f3551792-b862-492e-8c36-e0a63cd4468f/ovn-controller/0.log" Jan 22 14:45:51 crc kubenswrapper[4743]: I0122 14:45:51.845838 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9f82082-5738-4558-90f0-19031c102c11-catalog-content\") pod \"certified-operators-657r9\" (UID: \"e9f82082-5738-4558-90f0-19031c102c11\") " pod="openshift-marketplace/certified-operators-657r9" Jan 22 14:45:51 crc kubenswrapper[4743]: I0122 14:45:51.845945 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srq5f\" (UniqueName: \"kubernetes.io/projected/e9f82082-5738-4558-90f0-19031c102c11-kube-api-access-srq5f\") pod \"certified-operators-657r9\" (UID: \"e9f82082-5738-4558-90f0-19031c102c11\") " pod="openshift-marketplace/certified-operators-657r9" Jan 22 14:45:51 crc kubenswrapper[4743]: I0122 14:45:51.845993 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9f82082-5738-4558-90f0-19031c102c11-utilities\") pod \"certified-operators-657r9\" (UID: \"e9f82082-5738-4558-90f0-19031c102c11\") " pod="openshift-marketplace/certified-operators-657r9" Jan 22 14:45:51 crc kubenswrapper[4743]: I0122 14:45:51.847250 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9f82082-5738-4558-90f0-19031c102c11-catalog-content\") pod \"certified-operators-657r9\" (UID: \"e9f82082-5738-4558-90f0-19031c102c11\") " pod="openshift-marketplace/certified-operators-657r9" Jan 22 14:45:51 crc kubenswrapper[4743]: I0122 14:45:51.848235 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9f82082-5738-4558-90f0-19031c102c11-utilities\") pod \"certified-operators-657r9\" (UID: \"e9f82082-5738-4558-90f0-19031c102c11\") " pod="openshift-marketplace/certified-operators-657r9" Jan 22 14:45:51 crc kubenswrapper[4743]: I0122 14:45:51.882691 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srq5f\" (UniqueName: \"kubernetes.io/projected/e9f82082-5738-4558-90f0-19031c102c11-kube-api-access-srq5f\") pod \"certified-operators-657r9\" (UID: \"e9f82082-5738-4558-90f0-19031c102c11\") " pod="openshift-marketplace/certified-operators-657r9" Jan 22 14:45:51 crc kubenswrapper[4743]: I0122 14:45:51.902440 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-657r9" Jan 22 14:45:52 crc kubenswrapper[4743]: I0122 14:45:52.015564 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-rzsrt_3450abf2-6cd6-4090-b26f-4d83e2a6ea2b/openstack-network-exporter/0.log" Jan 22 14:45:52 crc kubenswrapper[4743]: I0122 14:45:52.350588 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-rmfgh_60598cb3-9d09-4b83-9b5c-893f5ebf44eb/ovsdb-server-init/0.log" Jan 22 14:45:52 crc kubenswrapper[4743]: I0122 14:45:52.468226 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-rmfgh_60598cb3-9d09-4b83-9b5c-893f5ebf44eb/ovsdb-server-init/0.log" Jan 22 14:45:52 crc kubenswrapper[4743]: I0122 14:45:52.485319 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-657r9"] Jan 22 14:45:52 crc kubenswrapper[4743]: I0122 14:45:52.546693 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-rmfgh_60598cb3-9d09-4b83-9b5c-893f5ebf44eb/ovsdb-server/0.log" Jan 22 14:45:52 crc kubenswrapper[4743]: I0122 14:45:52.555886 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-rmfgh_60598cb3-9d09-4b83-9b5c-893f5ebf44eb/ovs-vswitchd/0.log" Jan 22 14:45:52 crc kubenswrapper[4743]: I0122 14:45:52.769886 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-nm84h_99d2edf2-043a-4066-9d64-36be28d2197d/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Jan 22 14:45:52 crc kubenswrapper[4743]: I0122 14:45:52.784716 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_5c926afa-42b3-4fc2-bc38-8ee725cd113b/openstack-network-exporter/0.log" Jan 22 14:45:52 crc kubenswrapper[4743]: I0122 14:45:52.866024 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_5c926afa-42b3-4fc2-bc38-8ee725cd113b/ovn-northd/0.log" Jan 22 14:45:52 crc kubenswrapper[4743]: I0122 14:45:52.974155 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_3b07a577-785f-4720-919c-ef619448284a/openstack-network-exporter/0.log" Jan 22 14:45:53 crc kubenswrapper[4743]: I0122 14:45:53.018305 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_3b07a577-785f-4720-919c-ef619448284a/ovsdbserver-nb/0.log" Jan 22 14:45:53 crc kubenswrapper[4743]: I0122 14:45:53.145607 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_98d7b7d3-f576-4b98-912f-6e7aab2d295a/openstack-network-exporter/0.log" Jan 22 14:45:53 crc kubenswrapper[4743]: I0122 14:45:53.189313 4743 generic.go:334] "Generic (PLEG): container finished" podID="e9f82082-5738-4558-90f0-19031c102c11" containerID="766d593f0d333066944436f10081773ec0c5b1fc73f7742a0790cfdca2160459" exitCode=0 Jan 22 14:45:53 crc kubenswrapper[4743]: I0122 14:45:53.189366 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-657r9" event={"ID":"e9f82082-5738-4558-90f0-19031c102c11","Type":"ContainerDied","Data":"766d593f0d333066944436f10081773ec0c5b1fc73f7742a0790cfdca2160459"} Jan 22 14:45:53 crc kubenswrapper[4743]: I0122 14:45:53.189400 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-657r9" 
event={"ID":"e9f82082-5738-4558-90f0-19031c102c11","Type":"ContainerStarted","Data":"04851a98608ba76d7502b7ae82bb714b0892daacf2dfe00c136c911ff11953a8"} Jan 22 14:45:53 crc kubenswrapper[4743]: I0122 14:45:53.257339 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_98d7b7d3-f576-4b98-912f-6e7aab2d295a/ovsdbserver-sb/0.log" Jan 22 14:45:53 crc kubenswrapper[4743]: I0122 14:45:53.362381 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-77bd86cd86-kqp9m_9315e9cf-2a73-482e-810e-8fd19202915f/placement-api/0.log" Jan 22 14:45:53 crc kubenswrapper[4743]: I0122 14:45:53.452002 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-77bd86cd86-kqp9m_9315e9cf-2a73-482e-810e-8fd19202915f/placement-log/0.log" Jan 22 14:45:53 crc kubenswrapper[4743]: I0122 14:45:53.575119 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_42446198-84f4-4bee-b50c-1bb5dad2e380/setup-container/0.log" Jan 22 14:45:53 crc kubenswrapper[4743]: I0122 14:45:53.779835 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_600136f3-db1d-49a2-92a8-0c03aaadc963/setup-container/0.log" Jan 22 14:45:53 crc kubenswrapper[4743]: I0122 14:45:53.855627 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_42446198-84f4-4bee-b50c-1bb5dad2e380/setup-container/0.log" Jan 22 14:45:53 crc kubenswrapper[4743]: I0122 14:45:53.871303 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_42446198-84f4-4bee-b50c-1bb5dad2e380/rabbitmq/0.log" Jan 22 14:45:54 crc kubenswrapper[4743]: I0122 14:45:54.063040 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_600136f3-db1d-49a2-92a8-0c03aaadc963/setup-container/0.log" Jan 22 14:45:54 crc kubenswrapper[4743]: I0122 14:45:54.134893 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_600136f3-db1d-49a2-92a8-0c03aaadc963/rabbitmq/0.log" Jan 22 14:45:54 crc kubenswrapper[4743]: I0122 14:45:54.158185 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-dxthp_010d8c84-1843-4e5c-85b8-b39df20a58fd/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 22 14:45:54 crc kubenswrapper[4743]: I0122 14:45:54.197266 4743 generic.go:334] "Generic (PLEG): container finished" podID="e9f82082-5738-4558-90f0-19031c102c11" containerID="551398bc952af11113b8e832bd9b399ad43c182b1f7fda37144d44c007d9a6b1" exitCode=0 Jan 22 14:45:54 crc kubenswrapper[4743]: I0122 14:45:54.197319 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-657r9" event={"ID":"e9f82082-5738-4558-90f0-19031c102c11","Type":"ContainerDied","Data":"551398bc952af11113b8e832bd9b399ad43c182b1f7fda37144d44c007d9a6b1"} Jan 22 14:45:54 crc kubenswrapper[4743]: I0122 14:45:54.311590 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-nr24d_92bb5b08-555d-4d1b-b105-e7cf240f190b/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Jan 22 14:45:54 crc kubenswrapper[4743]: I0122 14:45:54.454129 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-8v7v2_305fa257-7d41-4a05-ae4e-1b945894aa09/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Jan 22 14:45:54 crc kubenswrapper[4743]: 
I0122 14:45:54.586325 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-f6dtw_62ef8bcc-609a-4fe6-a41d-48200e08b72f/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 22 14:45:54 crc kubenswrapper[4743]: I0122 14:45:54.716868 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-jvd8s_c41f0818-52ad-4c25-82fa-61a14a9825a1/ssh-known-hosts-edpm-deployment/0.log" Jan 22 14:45:54 crc kubenswrapper[4743]: I0122 14:45:54.877239 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6d6df4ffc5-49vw4_33cc55b1-3375-4b12-9cd9-c8f34ed7c0f5/proxy-server/0.log" Jan 22 14:45:55 crc kubenswrapper[4743]: I0122 14:45:55.060641 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6d6df4ffc5-49vw4_33cc55b1-3375-4b12-9cd9-c8f34ed7c0f5/proxy-httpd/0.log" Jan 22 14:45:55 crc kubenswrapper[4743]: I0122 14:45:55.107294 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-kg2fn_56dff5fb-e22c-4045-b3c4-c75e018df046/swift-ring-rebalance/0.log" Jan 22 14:45:55 crc kubenswrapper[4743]: I0122 14:45:55.206667 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-657r9" event={"ID":"e9f82082-5738-4558-90f0-19031c102c11","Type":"ContainerStarted","Data":"92922e5ed5e05ee4deea8ad47eeba77c35903309a6b7367a9d47d0fd55fedc11"} Jan 22 14:45:55 crc kubenswrapper[4743]: I0122 14:45:55.231826 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-657r9" podStartSLOduration=2.717024259 podStartE2EDuration="4.231783096s" podCreationTimestamp="2026-01-22 14:45:51 +0000 UTC" firstStartedPulling="2026-01-22 14:45:53.191300714 +0000 UTC m=+3589.746343877" lastFinishedPulling="2026-01-22 14:45:54.706059551 +0000 UTC m=+3591.261102714" observedRunningTime="2026-01-22 14:45:55.229960968 +0000 UTC m=+3591.785004131" watchObservedRunningTime="2026-01-22 14:45:55.231783096 +0000 UTC m=+3591.786826259" Jan 22 14:45:55 crc kubenswrapper[4743]: I0122 14:45:55.291150 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_338e196f-7c64-4cbd-b058-768ccb4c5df9/account-auditor/0.log" Jan 22 14:45:55 crc kubenswrapper[4743]: I0122 14:45:55.320986 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_338e196f-7c64-4cbd-b058-768ccb4c5df9/account-reaper/0.log" Jan 22 14:45:55 crc kubenswrapper[4743]: I0122 14:45:55.351671 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_338e196f-7c64-4cbd-b058-768ccb4c5df9/account-replicator/0.log" Jan 22 14:45:55 crc kubenswrapper[4743]: I0122 14:45:55.397146 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_338e196f-7c64-4cbd-b058-768ccb4c5df9/account-server/0.log" Jan 22 14:45:55 crc kubenswrapper[4743]: I0122 14:45:55.566049 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_338e196f-7c64-4cbd-b058-768ccb4c5df9/container-auditor/0.log" Jan 22 14:45:55 crc kubenswrapper[4743]: I0122 14:45:55.630672 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_338e196f-7c64-4cbd-b058-768ccb4c5df9/container-server/0.log" Jan 22 14:45:55 crc kubenswrapper[4743]: I0122 14:45:55.632915 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_338e196f-7c64-4cbd-b058-768ccb4c5df9/container-replicator/0.log" Jan 22 14:45:55 crc kubenswrapper[4743]: I0122 14:45:55.652276 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_338e196f-7c64-4cbd-b058-768ccb4c5df9/container-updater/0.log" Jan 22 14:45:55 crc kubenswrapper[4743]: I0122 14:45:55.857736 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_338e196f-7c64-4cbd-b058-768ccb4c5df9/object-auditor/0.log" Jan 22 14:45:55 crc kubenswrapper[4743]: I0122 14:45:55.948981 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_338e196f-7c64-4cbd-b058-768ccb4c5df9/object-replicator/0.log" Jan 22 14:45:55 crc kubenswrapper[4743]: I0122 14:45:55.986639 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_338e196f-7c64-4cbd-b058-768ccb4c5df9/object-expirer/0.log" Jan 22 14:45:56 crc kubenswrapper[4743]: I0122 14:45:56.024452 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_338e196f-7c64-4cbd-b058-768ccb4c5df9/object-server/0.log" Jan 22 14:45:56 crc kubenswrapper[4743]: I0122 14:45:56.152139 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_338e196f-7c64-4cbd-b058-768ccb4c5df9/object-updater/0.log" Jan 22 14:45:56 crc kubenswrapper[4743]: I0122 14:45:56.224001 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_338e196f-7c64-4cbd-b058-768ccb4c5df9/rsync/0.log" Jan 22 14:45:56 crc kubenswrapper[4743]: I0122 14:45:56.305841 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_338e196f-7c64-4cbd-b058-768ccb4c5df9/swift-recon-cron/0.log" Jan 22 14:45:56 crc kubenswrapper[4743]: I0122 14:45:56.480811 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-5v5f6_65113c72-73df-4a17-b923-60f9da824feb/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Jan 22 14:45:56 crc kubenswrapper[4743]: I0122 14:45:56.568397 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_dca0d9c1-5628-4b93-9696-f9d455c70f31/tempest-tests-tempest-tests-runner/0.log" Jan 22 14:45:56 crc kubenswrapper[4743]: I0122 14:45:56.672817 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_058ed9a9-b1d8-4e1f-b6b7-ca4bd89b79d7/test-operator-logs-container/0.log" Jan 22 14:45:56 crc kubenswrapper[4743]: I0122 14:45:56.845089 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-288lh_600f8b94-291e-4c03-b5d8-75f43de51d1d/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Jan 22 14:46:00 crc kubenswrapper[4743]: I0122 14:46:00.746780 4743 scope.go:117] "RemoveContainer" containerID="25b51fa71f51dca1867d5b32f16ec36e148821f095337178c440ebff508e0294" Jan 22 14:46:00 crc kubenswrapper[4743]: E0122 14:46:00.747684 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 
22 14:46:01 crc kubenswrapper[4743]: I0122 14:46:01.903193 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-657r9" Jan 22 14:46:01 crc kubenswrapper[4743]: I0122 14:46:01.903565 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-657r9" Jan 22 14:46:01 crc kubenswrapper[4743]: I0122 14:46:01.952050 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-657r9" Jan 22 14:46:02 crc kubenswrapper[4743]: I0122 14:46:02.333089 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-657r9" Jan 22 14:46:02 crc kubenswrapper[4743]: I0122 14:46:02.398758 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-657r9"] Jan 22 14:46:04 crc kubenswrapper[4743]: I0122 14:46:04.296287 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-657r9" podUID="e9f82082-5738-4558-90f0-19031c102c11" containerName="registry-server" containerID="cri-o://92922e5ed5e05ee4deea8ad47eeba77c35903309a6b7367a9d47d0fd55fedc11" gracePeriod=2 Jan 22 14:46:05 crc kubenswrapper[4743]: I0122 14:46:05.280490 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_63d64b7b-89b2-468c-86e2-fe9de4338c0c/memcached/0.log" Jan 22 14:46:05 crc kubenswrapper[4743]: I0122 14:46:05.307684 4743 generic.go:334] "Generic (PLEG): container finished" podID="e9f82082-5738-4558-90f0-19031c102c11" containerID="92922e5ed5e05ee4deea8ad47eeba77c35903309a6b7367a9d47d0fd55fedc11" exitCode=0 Jan 22 14:46:05 crc kubenswrapper[4743]: I0122 14:46:05.307747 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-657r9" event={"ID":"e9f82082-5738-4558-90f0-19031c102c11","Type":"ContainerDied","Data":"92922e5ed5e05ee4deea8ad47eeba77c35903309a6b7367a9d47d0fd55fedc11"} Jan 22 14:46:05 crc kubenswrapper[4743]: I0122 14:46:05.864131 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-657r9" Jan 22 14:46:05 crc kubenswrapper[4743]: I0122 14:46:05.990962 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9f82082-5738-4558-90f0-19031c102c11-catalog-content\") pod \"e9f82082-5738-4558-90f0-19031c102c11\" (UID: \"e9f82082-5738-4558-90f0-19031c102c11\") " Jan 22 14:46:05 crc kubenswrapper[4743]: I0122 14:46:05.991291 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srq5f\" (UniqueName: \"kubernetes.io/projected/e9f82082-5738-4558-90f0-19031c102c11-kube-api-access-srq5f\") pod \"e9f82082-5738-4558-90f0-19031c102c11\" (UID: \"e9f82082-5738-4558-90f0-19031c102c11\") " Jan 22 14:46:05 crc kubenswrapper[4743]: I0122 14:46:05.991321 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9f82082-5738-4558-90f0-19031c102c11-utilities\") pod \"e9f82082-5738-4558-90f0-19031c102c11\" (UID: \"e9f82082-5738-4558-90f0-19031c102c11\") " Jan 22 14:46:05 crc kubenswrapper[4743]: I0122 14:46:05.992180 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9f82082-5738-4558-90f0-19031c102c11-utilities" (OuterVolumeSpecName: "utilities") pod "e9f82082-5738-4558-90f0-19031c102c11" (UID: "e9f82082-5738-4558-90f0-19031c102c11"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:46:06 crc kubenswrapper[4743]: I0122 14:46:06.015130 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9f82082-5738-4558-90f0-19031c102c11-kube-api-access-srq5f" (OuterVolumeSpecName: "kube-api-access-srq5f") pod "e9f82082-5738-4558-90f0-19031c102c11" (UID: "e9f82082-5738-4558-90f0-19031c102c11"). InnerVolumeSpecName "kube-api-access-srq5f". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:46:06 crc kubenswrapper[4743]: I0122 14:46:06.033783 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9f82082-5738-4558-90f0-19031c102c11-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e9f82082-5738-4558-90f0-19031c102c11" (UID: "e9f82082-5738-4558-90f0-19031c102c11"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:46:06 crc kubenswrapper[4743]: I0122 14:46:06.098207 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srq5f\" (UniqueName: \"kubernetes.io/projected/e9f82082-5738-4558-90f0-19031c102c11-kube-api-access-srq5f\") on node \"crc\" DevicePath \"\"" Jan 22 14:46:06 crc kubenswrapper[4743]: I0122 14:46:06.098248 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9f82082-5738-4558-90f0-19031c102c11-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 14:46:06 crc kubenswrapper[4743]: I0122 14:46:06.098260 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9f82082-5738-4558-90f0-19031c102c11-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 14:46:06 crc kubenswrapper[4743]: I0122 14:46:06.323263 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-657r9" event={"ID":"e9f82082-5738-4558-90f0-19031c102c11","Type":"ContainerDied","Data":"04851a98608ba76d7502b7ae82bb714b0892daacf2dfe00c136c911ff11953a8"} Jan 22 14:46:06 crc kubenswrapper[4743]: I0122 14:46:06.323335 4743 scope.go:117] "RemoveContainer" containerID="92922e5ed5e05ee4deea8ad47eeba77c35903309a6b7367a9d47d0fd55fedc11" Jan 22 14:46:06 crc kubenswrapper[4743]: I0122 14:46:06.323368 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-657r9" Jan 22 14:46:06 crc kubenswrapper[4743]: I0122 14:46:06.350490 4743 scope.go:117] "RemoveContainer" containerID="551398bc952af11113b8e832bd9b399ad43c182b1f7fda37144d44c007d9a6b1" Jan 22 14:46:06 crc kubenswrapper[4743]: I0122 14:46:06.373117 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-657r9"] Jan 22 14:46:06 crc kubenswrapper[4743]: I0122 14:46:06.377202 4743 scope.go:117] "RemoveContainer" containerID="766d593f0d333066944436f10081773ec0c5b1fc73f7742a0790cfdca2160459" Jan 22 14:46:06 crc kubenswrapper[4743]: I0122 14:46:06.383491 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-657r9"] Jan 22 14:46:07 crc kubenswrapper[4743]: I0122 14:46:07.760143 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9f82082-5738-4558-90f0-19031c102c11" path="/var/lib/kubelet/pods/e9f82082-5738-4558-90f0-19031c102c11/volumes" Jan 22 14:46:13 crc kubenswrapper[4743]: I0122 14:46:13.753373 4743 scope.go:117] "RemoveContainer" containerID="25b51fa71f51dca1867d5b32f16ec36e148821f095337178c440ebff508e0294" Jan 22 14:46:13 crc kubenswrapper[4743]: E0122 14:46:13.754379 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:46:21 crc kubenswrapper[4743]: I0122 14:46:21.878344 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-59dd8b7cbf-6kvwx_4eb53c43-8c71-4c15-862a-134fa6eb85d6/manager/0.log" Jan 22 14:46:22 crc kubenswrapper[4743]: I0122 14:46:22.035806 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-69cf5d4557-dn2mv_f6b9f418-b721-4fce-881e-791eceb6b0ef/manager/0.log" Jan 22 14:46:22 crc kubenswrapper[4743]: I0122 14:46:22.081710 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-b45d7bf98-w8s2s_aff36600-9c00-4a26-b311-a3d743333b0e/manager/0.log" Jan 22 14:46:22 crc kubenswrapper[4743]: I0122 14:46:22.269884 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_fb228b644d9a880fdedde3cfb113973472d7d7039cbb0f6e8a8e58fb345dhsd_3c8fbd52-c53e-4cb9-9087-483276a7c607/util/0.log" Jan 22 14:46:22 crc kubenswrapper[4743]: I0122 14:46:22.382574 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_fb228b644d9a880fdedde3cfb113973472d7d7039cbb0f6e8a8e58fb345dhsd_3c8fbd52-c53e-4cb9-9087-483276a7c607/util/0.log" Jan 22 14:46:22 crc kubenswrapper[4743]: I0122 14:46:22.407898 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_fb228b644d9a880fdedde3cfb113973472d7d7039cbb0f6e8a8e58fb345dhsd_3c8fbd52-c53e-4cb9-9087-483276a7c607/pull/0.log" Jan 22 14:46:22 crc kubenswrapper[4743]: I0122 14:46:22.428158 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_fb228b644d9a880fdedde3cfb113973472d7d7039cbb0f6e8a8e58fb345dhsd_3c8fbd52-c53e-4cb9-9087-483276a7c607/pull/0.log" Jan 22 14:46:22 crc kubenswrapper[4743]: I0122 14:46:22.641525 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_fb228b644d9a880fdedde3cfb113973472d7d7039cbb0f6e8a8e58fb345dhsd_3c8fbd52-c53e-4cb9-9087-483276a7c607/util/0.log" Jan 22 14:46:22 crc kubenswrapper[4743]: I0122 14:46:22.643960 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_fb228b644d9a880fdedde3cfb113973472d7d7039cbb0f6e8a8e58fb345dhsd_3c8fbd52-c53e-4cb9-9087-483276a7c607/pull/0.log" Jan 22 14:46:22 crc kubenswrapper[4743]: I0122 14:46:22.683035 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_fb228b644d9a880fdedde3cfb113973472d7d7039cbb0f6e8a8e58fb345dhsd_3c8fbd52-c53e-4cb9-9087-483276a7c607/extract/0.log" Jan 22 14:46:22 crc kubenswrapper[4743]: I0122 14:46:22.874444 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-78fdd796fd-mr8bn_3454a999-851a-47d1-ba12-64f77de4bd6a/manager/0.log" Jan 22 14:46:22 crc kubenswrapper[4743]: I0122 14:46:22.900831 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-594c8c9d5d-mxdkr_bad16498-5eda-4791-8577-6cf6ef07ca2a/manager/0.log" Jan 22 14:46:23 crc kubenswrapper[4743]: I0122 14:46:23.073692 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-77d5c5b54f-f7dhp_1fcc87bc-de60-44e2-b8b9-88c97eb2aec4/manager/0.log" Jan 22 14:46:23 crc kubenswrapper[4743]: I0122 14:46:23.289763 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-69d6c9f5b8-nqr46_6221bb17-765b-4d72-8a74-70cdbc3447d9/manager/0.log" Jan 22 14:46:23 crc kubenswrapper[4743]: I0122 14:46:23.354356 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-54ccf4f85d-8xxtr_7df36228-9543-4bb1-a0a7-d2ca51ac35a5/manager/0.log" Jan 22 14:46:23 crc kubenswrapper[4743]: I0122 14:46:23.389931 4743 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-b8b6d4659-24n5k_3e262e2d-6d13-4c04-9826-14ed89dde8ea/manager/0.log" Jan 22 14:46:23 crc kubenswrapper[4743]: I0122 14:46:23.532199 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-78c6999f6f-b77x5_96859e4c-bbb4-424b-bc02-2bd6e3b03484/manager/0.log" Jan 22 14:46:23 crc kubenswrapper[4743]: I0122 14:46:23.635182 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-c87fff755-jtwg6_924b89fa-b3de-46d6-b9c8-5be5e6d4795c/manager/0.log" Jan 22 14:46:23 crc kubenswrapper[4743]: I0122 14:46:23.762447 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5d8f59fb49-mhq65_064ac5cb-7d15-4502-b174-54236cdd0d51/manager/0.log" Jan 22 14:46:23 crc kubenswrapper[4743]: I0122 14:46:23.889967 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-6b8bc8d87d-mppfg_d1e52325-1801-4650-86bd-c1eb8f076714/manager/0.log" Jan 22 14:46:23 crc kubenswrapper[4743]: I0122 14:46:23.987008 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7bd9774b6-qfb2q_3b59a905-2607-4445-abee-ba43a1bdf41c/manager/0.log" Jan 22 14:46:24 crc kubenswrapper[4743]: I0122 14:46:24.110405 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6b68b8b854q5znv_4a00e91a-fbd8-496e-96e0-4fb25d7841fe/manager/0.log" Jan 22 14:46:24 crc kubenswrapper[4743]: I0122 14:46:24.245122 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-6ddb855d8-zmpc6_80c8233a-0396-4d24-8212-53346af8d405/operator/0.log" Jan 22 14:46:24 crc kubenswrapper[4743]: I0122 14:46:24.420740 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-dsm72_2ac18d53-2c89-4de9-8665-29d227f67a09/registry-server/0.log" Jan 22 14:46:24 crc kubenswrapper[4743]: I0122 14:46:24.880261 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5d646b7d76-wsnt4_ee1a4864-faf3-49da-ac1a-ab864c677803/manager/0.log" Jan 22 14:46:24 crc kubenswrapper[4743]: I0122 14:46:24.885677 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-55db956ddc-995fd_6b4ae3c8-6f7f-4b76-91c0-3652f86422a6/manager/0.log" Jan 22 14:46:25 crc kubenswrapper[4743]: I0122 14:46:25.218001 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-547cbdb99f-nck8q_cf7c633b-f013-4be6-a794-888a816a2ec2/manager/0.log" Jan 22 14:46:25 crc kubenswrapper[4743]: I0122 14:46:25.256619 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-jzmrn_36be36d1-45a3-4e18-ba83-e4ae61363409/operator/0.log" Jan 22 14:46:25 crc kubenswrapper[4743]: I0122 14:46:25.473954 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-69797bbcbd-b9bq2_d70fab64-d6ec-42ab-93ef-e882fc4d3f84/manager/0.log" Jan 22 14:46:25 crc kubenswrapper[4743]: I0122 14:46:25.478830 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-85cd9769bb-srnbc_9d13aa32-eef2-427d-9398-507957b4c81c/manager/0.log" Jan 22 14:46:25 crc kubenswrapper[4743]: I0122 14:46:25.547528 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-cdc5d4c7b-hk8dd_0855131d-976e-4cb5-83bb-9e47417d78f5/manager/0.log" Jan 22 14:46:25 crc kubenswrapper[4743]: I0122 14:46:25.616867 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-5ffb9c6597-w4cch_1cd779f7-75c7-4a5b-82f1-15a26703ed29/manager/0.log" Jan 22 14:46:28 crc kubenswrapper[4743]: I0122 14:46:28.748138 4743 scope.go:117] "RemoveContainer" containerID="25b51fa71f51dca1867d5b32f16ec36e148821f095337178c440ebff508e0294" Jan 22 14:46:28 crc kubenswrapper[4743]: E0122 14:46:28.748799 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:46:40 crc kubenswrapper[4743]: I0122 14:46:40.747032 4743 scope.go:117] "RemoveContainer" containerID="25b51fa71f51dca1867d5b32f16ec36e148821f095337178c440ebff508e0294" Jan 22 14:46:40 crc kubenswrapper[4743]: E0122 14:46:40.747948 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:46:44 crc kubenswrapper[4743]: I0122 14:46:44.687095 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-25vhb_3a5126ab-15b9-4b80-ab92-de1b1af3d4a7/control-plane-machine-set-operator/0.log" Jan 22 14:46:44 crc kubenswrapper[4743]: I0122 14:46:44.860152 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-g2ptk_b46225f4-dd80-45ae-9ffa-310527d770fc/kube-rbac-proxy/0.log" Jan 22 14:46:44 crc kubenswrapper[4743]: I0122 14:46:44.879119 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-g2ptk_b46225f4-dd80-45ae-9ffa-310527d770fc/machine-api-operator/0.log" Jan 22 14:46:52 crc kubenswrapper[4743]: I0122 14:46:52.747473 4743 scope.go:117] "RemoveContainer" containerID="25b51fa71f51dca1867d5b32f16ec36e148821f095337178c440ebff508e0294" Jan 22 14:46:52 crc kubenswrapper[4743]: E0122 14:46:52.748395 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:46:58 crc kubenswrapper[4743]: I0122 14:46:58.555872 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-858654f9db-db7qx_47fd113c-6de2-4ad1-b307-2c9bcbdff0b8/cert-manager-controller/0.log" Jan 22 14:46:58 crc kubenswrapper[4743]: I0122 14:46:58.694530 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-s2ln7_760af996-e4d2-4507-9e19-a50aa50ceb8a/cert-manager-cainjector/0.log" Jan 22 14:46:58 crc kubenswrapper[4743]: I0122 14:46:58.739108 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-zdf5x_aaccb907-25f5-4992-8a7e-cc8d5bdf3bb1/cert-manager-webhook/0.log" Jan 22 14:47:04 crc kubenswrapper[4743]: I0122 14:47:04.747111 4743 scope.go:117] "RemoveContainer" containerID="25b51fa71f51dca1867d5b32f16ec36e148821f095337178c440ebff508e0294" Jan 22 14:47:04 crc kubenswrapper[4743]: E0122 14:47:04.748093 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:47:11 crc kubenswrapper[4743]: I0122 14:47:11.833462 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7754f76f8b-pvbjf_27e7180c-e024-4412-9840-ddeb074d70c8/nmstate-console-plugin/0.log" Jan 22 14:47:12 crc kubenswrapper[4743]: I0122 14:47:12.009490 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-wpxjd_07e19a00-064c-401a-9c0c-4acd067e4e9e/nmstate-handler/0.log" Jan 22 14:47:12 crc kubenswrapper[4743]: I0122 14:47:12.082922 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-5kgz5_f50a3b80-43ad-46a0-b124-0249185f922b/kube-rbac-proxy/0.log" Jan 22 14:47:12 crc kubenswrapper[4743]: I0122 14:47:12.131010 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-5kgz5_f50a3b80-43ad-46a0-b124-0249185f922b/nmstate-metrics/0.log" Jan 22 14:47:12 crc kubenswrapper[4743]: I0122 14:47:12.276302 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-646758c888-f88rf_60d8bf9b-3641-4cd7-a809-0f77d3fae035/nmstate-operator/0.log" Jan 22 14:47:12 crc kubenswrapper[4743]: I0122 14:47:12.294927 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-8474b5b9d8-mtdch_33f98d3b-f0ea-45dd-8fca-d942067e31ad/nmstate-webhook/0.log" Jan 22 14:47:15 crc kubenswrapper[4743]: I0122 14:47:15.747593 4743 scope.go:117] "RemoveContainer" containerID="25b51fa71f51dca1867d5b32f16ec36e148821f095337178c440ebff508e0294" Jan 22 14:47:15 crc kubenswrapper[4743]: E0122 14:47:15.748616 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:47:28 crc kubenswrapper[4743]: I0122 14:47:28.747225 4743 scope.go:117] "RemoveContainer" 
containerID="25b51fa71f51dca1867d5b32f16ec36e148821f095337178c440ebff508e0294" Jan 22 14:47:28 crc kubenswrapper[4743]: E0122 14:47:28.748221 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:47:39 crc kubenswrapper[4743]: I0122 14:47:39.065178 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-4kwcx_9a99dab2-57e0-4830-8dc7-1bf40627f408/kube-rbac-proxy/0.log" Jan 22 14:47:39 crc kubenswrapper[4743]: I0122 14:47:39.126776 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-4kwcx_9a99dab2-57e0-4830-8dc7-1bf40627f408/controller/0.log" Jan 22 14:47:39 crc kubenswrapper[4743]: I0122 14:47:39.280017 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-st65g_2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b/cp-frr-files/0.log" Jan 22 14:47:39 crc kubenswrapper[4743]: I0122 14:47:39.406827 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-st65g_2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b/cp-frr-files/0.log" Jan 22 14:47:39 crc kubenswrapper[4743]: I0122 14:47:39.418210 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-st65g_2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b/cp-reloader/0.log" Jan 22 14:47:39 crc kubenswrapper[4743]: I0122 14:47:39.456453 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-st65g_2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b/cp-metrics/0.log" Jan 22 14:47:39 crc kubenswrapper[4743]: I0122 14:47:39.491317 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-st65g_2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b/cp-reloader/0.log" Jan 22 14:47:39 crc kubenswrapper[4743]: I0122 14:47:39.651528 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-st65g_2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b/cp-frr-files/0.log" Jan 22 14:47:39 crc kubenswrapper[4743]: I0122 14:47:39.694099 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-st65g_2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b/cp-metrics/0.log" Jan 22 14:47:39 crc kubenswrapper[4743]: I0122 14:47:39.703186 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-st65g_2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b/cp-reloader/0.log" Jan 22 14:47:39 crc kubenswrapper[4743]: I0122 14:47:39.753676 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-st65g_2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b/cp-metrics/0.log" Jan 22 14:47:39 crc kubenswrapper[4743]: I0122 14:47:39.874957 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-st65g_2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b/cp-frr-files/0.log" Jan 22 14:47:39 crc kubenswrapper[4743]: I0122 14:47:39.892742 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-st65g_2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b/cp-metrics/0.log" Jan 22 14:47:39 crc kubenswrapper[4743]: I0122 14:47:39.918945 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-st65g_2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b/cp-reloader/0.log" Jan 22 14:47:39 crc kubenswrapper[4743]: I0122 14:47:39.954742 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-st65g_2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b/controller/0.log" Jan 22 14:47:40 crc kubenswrapper[4743]: I0122 14:47:40.045380 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-st65g_2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b/frr-metrics/0.log" Jan 22 14:47:40 crc kubenswrapper[4743]: I0122 14:47:40.124223 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-st65g_2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b/kube-rbac-proxy/0.log" Jan 22 14:47:40 crc kubenswrapper[4743]: I0122 14:47:40.169009 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-st65g_2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b/kube-rbac-proxy-frr/0.log" Jan 22 14:47:40 crc kubenswrapper[4743]: I0122 14:47:40.305328 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-st65g_2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b/reloader/0.log" Jan 22 14:47:40 crc kubenswrapper[4743]: I0122 14:47:40.369187 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7df86c4f6c-hmp5v_fa2773b4-4a56-40e4-a2a9-6188bb40964f/frr-k8s-webhook-server/0.log" Jan 22 14:47:40 crc kubenswrapper[4743]: I0122 14:47:40.604333 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6494f4f8f8-zbvgv_dcd68957-0356-4eda-a65f-77e770aae844/manager/0.log" Jan 22 14:47:40 crc kubenswrapper[4743]: I0122 14:47:40.758406 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-65d5b677d7-mdls4_fe771c71-01ce-4513-bfa8-2393f3f055f2/webhook-server/0.log" Jan 22 14:47:40 crc kubenswrapper[4743]: I0122 14:47:40.870977 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-tl9sw_f17b4fff-f244-477f-912d-c2e93321094e/kube-rbac-proxy/0.log" Jan 22 14:47:41 crc kubenswrapper[4743]: I0122 14:47:41.439528 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-tl9sw_f17b4fff-f244-477f-912d-c2e93321094e/speaker/0.log" Jan 22 14:47:41 crc kubenswrapper[4743]: I0122 14:47:41.475390 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-st65g_2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b/frr/0.log" Jan 22 14:47:41 crc kubenswrapper[4743]: I0122 14:47:41.747139 4743 scope.go:117] "RemoveContainer" containerID="25b51fa71f51dca1867d5b32f16ec36e148821f095337178c440ebff508e0294" Jan 22 14:47:41 crc kubenswrapper[4743]: E0122 14:47:41.747829 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:47:53 crc kubenswrapper[4743]: I0122 14:47:53.475310 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchtrhf_f8539c26-c29d-4ff4-91f8-c77fb1aa6a2a/util/0.log" Jan 22 14:47:53 crc kubenswrapper[4743]: I0122 14:47:53.703762 4743 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchtrhf_f8539c26-c29d-4ff4-91f8-c77fb1aa6a2a/util/0.log" Jan 22 14:47:53 crc kubenswrapper[4743]: I0122 14:47:53.755808 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchtrhf_f8539c26-c29d-4ff4-91f8-c77fb1aa6a2a/pull/0.log" Jan 22 14:47:53 crc kubenswrapper[4743]: I0122 14:47:53.809213 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchtrhf_f8539c26-c29d-4ff4-91f8-c77fb1aa6a2a/pull/0.log" Jan 22 14:47:53 crc kubenswrapper[4743]: I0122 14:47:53.944505 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchtrhf_f8539c26-c29d-4ff4-91f8-c77fb1aa6a2a/util/0.log" Jan 22 14:47:53 crc kubenswrapper[4743]: I0122 14:47:53.946100 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchtrhf_f8539c26-c29d-4ff4-91f8-c77fb1aa6a2a/pull/0.log" Jan 22 14:47:53 crc kubenswrapper[4743]: I0122 14:47:53.990973 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchtrhf_f8539c26-c29d-4ff4-91f8-c77fb1aa6a2a/extract/0.log" Jan 22 14:47:54 crc kubenswrapper[4743]: I0122 14:47:54.153390 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713k4rbh_f39512c9-c53b-472c-a7ed-0760797a8601/util/0.log" Jan 22 14:47:54 crc kubenswrapper[4743]: I0122 14:47:54.263261 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713k4rbh_f39512c9-c53b-472c-a7ed-0760797a8601/pull/0.log" Jan 22 14:47:54 crc kubenswrapper[4743]: I0122 14:47:54.287596 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713k4rbh_f39512c9-c53b-472c-a7ed-0760797a8601/util/0.log" Jan 22 14:47:54 crc kubenswrapper[4743]: I0122 14:47:54.309974 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713k4rbh_f39512c9-c53b-472c-a7ed-0760797a8601/pull/0.log" Jan 22 14:47:54 crc kubenswrapper[4743]: I0122 14:47:54.469219 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713k4rbh_f39512c9-c53b-472c-a7ed-0760797a8601/pull/0.log" Jan 22 14:47:54 crc kubenswrapper[4743]: I0122 14:47:54.509720 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713k4rbh_f39512c9-c53b-472c-a7ed-0760797a8601/extract/0.log" Jan 22 14:47:54 crc kubenswrapper[4743]: I0122 14:47:54.514477 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713k4rbh_f39512c9-c53b-472c-a7ed-0760797a8601/util/0.log" Jan 22 14:47:54 crc kubenswrapper[4743]: I0122 14:47:54.628751 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-vsg7v_edde6004-a1f7-4818-b66a-02137d1f3749/extract-utilities/0.log" Jan 22 14:47:54 crc kubenswrapper[4743]: I0122 14:47:54.747521 4743 scope.go:117] "RemoveContainer" containerID="25b51fa71f51dca1867d5b32f16ec36e148821f095337178c440ebff508e0294" Jan 22 14:47:54 crc kubenswrapper[4743]: E0122 14:47:54.747778 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:47:54 crc kubenswrapper[4743]: I0122 14:47:54.775353 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vsg7v_edde6004-a1f7-4818-b66a-02137d1f3749/extract-utilities/0.log" Jan 22 14:47:54 crc kubenswrapper[4743]: I0122 14:47:54.796829 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vsg7v_edde6004-a1f7-4818-b66a-02137d1f3749/extract-content/0.log" Jan 22 14:47:54 crc kubenswrapper[4743]: I0122 14:47:54.809621 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vsg7v_edde6004-a1f7-4818-b66a-02137d1f3749/extract-content/0.log" Jan 22 14:47:54 crc kubenswrapper[4743]: I0122 14:47:54.966934 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vsg7v_edde6004-a1f7-4818-b66a-02137d1f3749/extract-content/0.log" Jan 22 14:47:54 crc kubenswrapper[4743]: I0122 14:47:54.974926 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vsg7v_edde6004-a1f7-4818-b66a-02137d1f3749/extract-utilities/0.log" Jan 22 14:47:55 crc kubenswrapper[4743]: I0122 14:47:55.169718 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-slrnr_d37a61b7-0edd-4c4d-8fe5-f1cc4f9ba472/extract-utilities/0.log" Jan 22 14:47:55 crc kubenswrapper[4743]: I0122 14:47:55.381498 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-slrnr_d37a61b7-0edd-4c4d-8fe5-f1cc4f9ba472/extract-utilities/0.log" Jan 22 14:47:55 crc kubenswrapper[4743]: I0122 14:47:55.423475 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-slrnr_d37a61b7-0edd-4c4d-8fe5-f1cc4f9ba472/extract-content/0.log" Jan 22 14:47:55 crc kubenswrapper[4743]: I0122 14:47:55.425491 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-slrnr_d37a61b7-0edd-4c4d-8fe5-f1cc4f9ba472/extract-content/0.log" Jan 22 14:47:55 crc kubenswrapper[4743]: I0122 14:47:55.536490 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vsg7v_edde6004-a1f7-4818-b66a-02137d1f3749/registry-server/0.log" Jan 22 14:47:55 crc kubenswrapper[4743]: I0122 14:47:55.619161 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-slrnr_d37a61b7-0edd-4c4d-8fe5-f1cc4f9ba472/extract-utilities/0.log" Jan 22 14:47:55 crc kubenswrapper[4743]: I0122 14:47:55.690296 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-slrnr_d37a61b7-0edd-4c4d-8fe5-f1cc4f9ba472/extract-content/0.log" Jan 22 14:47:55 crc kubenswrapper[4743]: I0122 14:47:55.869188 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-6sdp5_ac4d223b-b4ca-485a-aa22-1fbdb0a3228e/marketplace-operator/0.log" Jan 22 14:47:56 crc kubenswrapper[4743]: I0122 14:47:56.025811 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-lzgcl_474b968c-8a07-4c97-b0a8-1595e2e91317/extract-utilities/0.log" Jan 22 14:47:56 crc kubenswrapper[4743]: I0122 14:47:56.197626 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-slrnr_d37a61b7-0edd-4c4d-8fe5-f1cc4f9ba472/registry-server/0.log" Jan 22 14:47:56 crc kubenswrapper[4743]: I0122 14:47:56.237462 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-lzgcl_474b968c-8a07-4c97-b0a8-1595e2e91317/extract-content/0.log" Jan 22 14:47:56 crc kubenswrapper[4743]: I0122 14:47:56.246325 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-lzgcl_474b968c-8a07-4c97-b0a8-1595e2e91317/extract-content/0.log" Jan 22 14:47:56 crc kubenswrapper[4743]: I0122 14:47:56.251194 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-lzgcl_474b968c-8a07-4c97-b0a8-1595e2e91317/extract-utilities/0.log" Jan 22 14:47:56 crc kubenswrapper[4743]: I0122 14:47:56.421601 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-lzgcl_474b968c-8a07-4c97-b0a8-1595e2e91317/extract-utilities/0.log" Jan 22 14:47:56 crc kubenswrapper[4743]: I0122 14:47:56.421932 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-lzgcl_474b968c-8a07-4c97-b0a8-1595e2e91317/extract-content/0.log" Jan 22 14:47:56 crc kubenswrapper[4743]: I0122 14:47:56.548687 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-lzgcl_474b968c-8a07-4c97-b0a8-1595e2e91317/registry-server/0.log" Jan 22 14:47:56 crc kubenswrapper[4743]: I0122 14:47:56.640915 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lmrjc_49986793-49f7-49ae-bcb8-699016d9d894/extract-utilities/0.log" Jan 22 14:47:56 crc kubenswrapper[4743]: I0122 14:47:56.793989 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lmrjc_49986793-49f7-49ae-bcb8-699016d9d894/extract-content/0.log" Jan 22 14:47:56 crc kubenswrapper[4743]: I0122 14:47:56.795077 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lmrjc_49986793-49f7-49ae-bcb8-699016d9d894/extract-utilities/0.log" Jan 22 14:47:56 crc kubenswrapper[4743]: I0122 14:47:56.803126 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lmrjc_49986793-49f7-49ae-bcb8-699016d9d894/extract-content/0.log" Jan 22 14:47:56 crc kubenswrapper[4743]: I0122 14:47:56.998605 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lmrjc_49986793-49f7-49ae-bcb8-699016d9d894/extract-content/0.log" Jan 22 14:47:57 crc kubenswrapper[4743]: I0122 14:47:57.001533 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-lmrjc_49986793-49f7-49ae-bcb8-699016d9d894/extract-utilities/0.log" Jan 22 14:47:57 crc kubenswrapper[4743]: I0122 14:47:57.396130 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lmrjc_49986793-49f7-49ae-bcb8-699016d9d894/registry-server/0.log" Jan 22 14:48:08 crc kubenswrapper[4743]: I0122 14:48:08.747377 4743 scope.go:117] "RemoveContainer" containerID="25b51fa71f51dca1867d5b32f16ec36e148821f095337178c440ebff508e0294" Jan 22 14:48:08 crc kubenswrapper[4743]: E0122 14:48:08.748284 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:48:22 crc kubenswrapper[4743]: I0122 14:48:22.747432 4743 scope.go:117] "RemoveContainer" containerID="25b51fa71f51dca1867d5b32f16ec36e148821f095337178c440ebff508e0294" Jan 22 14:48:22 crc kubenswrapper[4743]: E0122 14:48:22.749170 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:48:35 crc kubenswrapper[4743]: I0122 14:48:35.747220 4743 scope.go:117] "RemoveContainer" containerID="25b51fa71f51dca1867d5b32f16ec36e148821f095337178c440ebff508e0294" Jan 22 14:48:35 crc kubenswrapper[4743]: E0122 14:48:35.748323 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:48:50 crc kubenswrapper[4743]: I0122 14:48:50.747244 4743 scope.go:117] "RemoveContainer" containerID="25b51fa71f51dca1867d5b32f16ec36e148821f095337178c440ebff508e0294" Jan 22 14:48:50 crc kubenswrapper[4743]: E0122 14:48:50.749711 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:49:03 crc kubenswrapper[4743]: I0122 14:49:03.766762 4743 scope.go:117] "RemoveContainer" containerID="25b51fa71f51dca1867d5b32f16ec36e148821f095337178c440ebff508e0294" Jan 22 14:49:03 crc kubenswrapper[4743]: E0122 14:49:03.767537 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:49:15 crc kubenswrapper[4743]: I0122 14:49:15.748060 4743 scope.go:117] "RemoveContainer" containerID="25b51fa71f51dca1867d5b32f16ec36e148821f095337178c440ebff508e0294" Jan 22 14:49:15 crc kubenswrapper[4743]: E0122 14:49:15.749312 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:49:26 crc kubenswrapper[4743]: I0122 14:49:26.747385 4743 scope.go:117] "RemoveContainer" containerID="25b51fa71f51dca1867d5b32f16ec36e148821f095337178c440ebff508e0294" Jan 22 14:49:26 crc kubenswrapper[4743]: E0122 14:49:26.748216 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:49:38 crc kubenswrapper[4743]: I0122 14:49:38.748296 4743 scope.go:117] "RemoveContainer" containerID="25b51fa71f51dca1867d5b32f16ec36e148821f095337178c440ebff508e0294" Jan 22 14:49:38 crc kubenswrapper[4743]: E0122 14:49:38.749432 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:49:42 crc kubenswrapper[4743]: I0122 14:49:42.663561 4743 generic.go:334] "Generic (PLEG): container finished" podID="a9762560-529f-41e5-8a82-5840434a3d10" containerID="7fd173bb1a16d7e9bb1e809168eb6dd56f8c7c2ca5f512296adf040263f85761" exitCode=0 Jan 22 14:49:42 crc kubenswrapper[4743]: I0122 14:49:42.663658 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7gtw4/must-gather-lvsbh" event={"ID":"a9762560-529f-41e5-8a82-5840434a3d10","Type":"ContainerDied","Data":"7fd173bb1a16d7e9bb1e809168eb6dd56f8c7c2ca5f512296adf040263f85761"} Jan 22 14:49:42 crc kubenswrapper[4743]: I0122 14:49:42.665169 4743 scope.go:117] "RemoveContainer" containerID="7fd173bb1a16d7e9bb1e809168eb6dd56f8c7c2ca5f512296adf040263f85761" Jan 22 14:49:43 crc kubenswrapper[4743]: I0122 14:49:43.624016 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7gtw4_must-gather-lvsbh_a9762560-529f-41e5-8a82-5840434a3d10/gather/0.log" Jan 22 14:49:52 crc kubenswrapper[4743]: I0122 14:49:52.748176 4743 scope.go:117] "RemoveContainer" containerID="25b51fa71f51dca1867d5b32f16ec36e148821f095337178c440ebff508e0294" Jan 22 14:49:52 crc kubenswrapper[4743]: E0122 14:49:52.749209 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:49:53 crc kubenswrapper[4743]: I0122 14:49:53.075650 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7gtw4/must-gather-lvsbh"] Jan 22 14:49:53 crc kubenswrapper[4743]: I0122 14:49:53.076186 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-7gtw4/must-gather-lvsbh" podUID="a9762560-529f-41e5-8a82-5840434a3d10" containerName="copy" containerID="cri-o://a1527e5aa4fd33ff720a648f23d4e1cd5e3a5f119506cbd680b8478fc3572a8c" gracePeriod=2 Jan 22 14:49:53 crc kubenswrapper[4743]: I0122 14:49:53.091078 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7gtw4/must-gather-lvsbh"] Jan 22 14:49:53 crc kubenswrapper[4743]: I0122 14:49:53.581746 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7gtw4_must-gather-lvsbh_a9762560-529f-41e5-8a82-5840434a3d10/copy/0.log" Jan 22 14:49:53 crc kubenswrapper[4743]: I0122 14:49:53.582736 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7gtw4/must-gather-lvsbh" Jan 22 14:49:53 crc kubenswrapper[4743]: I0122 14:49:53.681616 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krv4c\" (UniqueName: \"kubernetes.io/projected/a9762560-529f-41e5-8a82-5840434a3d10-kube-api-access-krv4c\") pod \"a9762560-529f-41e5-8a82-5840434a3d10\" (UID: \"a9762560-529f-41e5-8a82-5840434a3d10\") " Jan 22 14:49:53 crc kubenswrapper[4743]: I0122 14:49:53.681837 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a9762560-529f-41e5-8a82-5840434a3d10-must-gather-output\") pod \"a9762560-529f-41e5-8a82-5840434a3d10\" (UID: \"a9762560-529f-41e5-8a82-5840434a3d10\") " Jan 22 14:49:53 crc kubenswrapper[4743]: I0122 14:49:53.691901 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9762560-529f-41e5-8a82-5840434a3d10-kube-api-access-krv4c" (OuterVolumeSpecName: "kube-api-access-krv4c") pod "a9762560-529f-41e5-8a82-5840434a3d10" (UID: "a9762560-529f-41e5-8a82-5840434a3d10"). InnerVolumeSpecName "kube-api-access-krv4c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:49:53 crc kubenswrapper[4743]: I0122 14:49:53.777259 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7gtw4_must-gather-lvsbh_a9762560-529f-41e5-8a82-5840434a3d10/copy/0.log" Jan 22 14:49:53 crc kubenswrapper[4743]: I0122 14:49:53.778275 4743 generic.go:334] "Generic (PLEG): container finished" podID="a9762560-529f-41e5-8a82-5840434a3d10" containerID="a1527e5aa4fd33ff720a648f23d4e1cd5e3a5f119506cbd680b8478fc3572a8c" exitCode=143 Jan 22 14:49:53 crc kubenswrapper[4743]: I0122 14:49:53.778306 4743 scope.go:117] "RemoveContainer" containerID="a1527e5aa4fd33ff720a648f23d4e1cd5e3a5f119506cbd680b8478fc3572a8c" Jan 22 14:49:53 crc kubenswrapper[4743]: I0122 14:49:53.778921 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7gtw4/must-gather-lvsbh" Jan 22 14:49:53 crc kubenswrapper[4743]: I0122 14:49:53.786564 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krv4c\" (UniqueName: \"kubernetes.io/projected/a9762560-529f-41e5-8a82-5840434a3d10-kube-api-access-krv4c\") on node \"crc\" DevicePath \"\"" Jan 22 14:49:53 crc kubenswrapper[4743]: I0122 14:49:53.796417 4743 scope.go:117] "RemoveContainer" containerID="7fd173bb1a16d7e9bb1e809168eb6dd56f8c7c2ca5f512296adf040263f85761" Jan 22 14:49:53 crc kubenswrapper[4743]: I0122 14:49:53.837078 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9762560-529f-41e5-8a82-5840434a3d10-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "a9762560-529f-41e5-8a82-5840434a3d10" (UID: "a9762560-529f-41e5-8a82-5840434a3d10"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:49:53 crc kubenswrapper[4743]: I0122 14:49:53.877433 4743 scope.go:117] "RemoveContainer" containerID="a1527e5aa4fd33ff720a648f23d4e1cd5e3a5f119506cbd680b8478fc3572a8c" Jan 22 14:49:53 crc kubenswrapper[4743]: E0122 14:49:53.877960 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1527e5aa4fd33ff720a648f23d4e1cd5e3a5f119506cbd680b8478fc3572a8c\": container with ID starting with a1527e5aa4fd33ff720a648f23d4e1cd5e3a5f119506cbd680b8478fc3572a8c not found: ID does not exist" containerID="a1527e5aa4fd33ff720a648f23d4e1cd5e3a5f119506cbd680b8478fc3572a8c" Jan 22 14:49:53 crc kubenswrapper[4743]: I0122 14:49:53.878005 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1527e5aa4fd33ff720a648f23d4e1cd5e3a5f119506cbd680b8478fc3572a8c"} err="failed to get container status \"a1527e5aa4fd33ff720a648f23d4e1cd5e3a5f119506cbd680b8478fc3572a8c\": rpc error: code = NotFound desc = could not find container \"a1527e5aa4fd33ff720a648f23d4e1cd5e3a5f119506cbd680b8478fc3572a8c\": container with ID starting with a1527e5aa4fd33ff720a648f23d4e1cd5e3a5f119506cbd680b8478fc3572a8c not found: ID does not exist" Jan 22 14:49:53 crc kubenswrapper[4743]: I0122 14:49:53.878037 4743 scope.go:117] "RemoveContainer" containerID="7fd173bb1a16d7e9bb1e809168eb6dd56f8c7c2ca5f512296adf040263f85761" Jan 22 14:49:53 crc kubenswrapper[4743]: E0122 14:49:53.878432 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fd173bb1a16d7e9bb1e809168eb6dd56f8c7c2ca5f512296adf040263f85761\": container with ID starting with 7fd173bb1a16d7e9bb1e809168eb6dd56f8c7c2ca5f512296adf040263f85761 not found: ID does not exist" containerID="7fd173bb1a16d7e9bb1e809168eb6dd56f8c7c2ca5f512296adf040263f85761" Jan 22 14:49:53 crc kubenswrapper[4743]: I0122 14:49:53.878466 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fd173bb1a16d7e9bb1e809168eb6dd56f8c7c2ca5f512296adf040263f85761"} err="failed to get container status \"7fd173bb1a16d7e9bb1e809168eb6dd56f8c7c2ca5f512296adf040263f85761\": rpc error: code = NotFound desc = could not find container \"7fd173bb1a16d7e9bb1e809168eb6dd56f8c7c2ca5f512296adf040263f85761\": container with ID starting with 7fd173bb1a16d7e9bb1e809168eb6dd56f8c7c2ca5f512296adf040263f85761 not found: ID does not exist" Jan 22 14:49:53 crc kubenswrapper[4743]: I0122 14:49:53.888838 4743 
reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a9762560-529f-41e5-8a82-5840434a3d10-must-gather-output\") on node \"crc\" DevicePath \"\"" Jan 22 14:49:55 crc kubenswrapper[4743]: I0122 14:49:55.759012 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9762560-529f-41e5-8a82-5840434a3d10" path="/var/lib/kubelet/pods/a9762560-529f-41e5-8a82-5840434a3d10/volumes" Jan 22 14:50:06 crc kubenswrapper[4743]: I0122 14:50:06.748561 4743 scope.go:117] "RemoveContainer" containerID="25b51fa71f51dca1867d5b32f16ec36e148821f095337178c440ebff508e0294" Jan 22 14:50:07 crc kubenswrapper[4743]: I0122 14:50:07.928742 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" event={"ID":"3aeba9ba-3a5a-4885-8540-d295aadb311b","Type":"ContainerStarted","Data":"54b33301e0c1269e5b25b06e84317615d57f5c4dc992d00dcb2ed5ecfdfa7773"} Jan 22 14:51:35 crc kubenswrapper[4743]: I0122 14:51:35.665915 4743 scope.go:117] "RemoveContainer" containerID="73cf73c3dbc23986bf942cf4d091103df7f52bf2501631a7b1239a402710f91f" Jan 22 14:51:57 crc kubenswrapper[4743]: I0122 14:51:57.071374 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4kmlc"] Jan 22 14:51:57 crc kubenswrapper[4743]: E0122 14:51:57.072468 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9762560-529f-41e5-8a82-5840434a3d10" containerName="gather" Jan 22 14:51:57 crc kubenswrapper[4743]: I0122 14:51:57.072488 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9762560-529f-41e5-8a82-5840434a3d10" containerName="gather" Jan 22 14:51:57 crc kubenswrapper[4743]: E0122 14:51:57.072512 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9f82082-5738-4558-90f0-19031c102c11" containerName="registry-server" Jan 22 14:51:57 crc kubenswrapper[4743]: I0122 14:51:57.072523 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9f82082-5738-4558-90f0-19031c102c11" containerName="registry-server" Jan 22 14:51:57 crc kubenswrapper[4743]: E0122 14:51:57.072539 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9762560-529f-41e5-8a82-5840434a3d10" containerName="copy" Jan 22 14:51:57 crc kubenswrapper[4743]: I0122 14:51:57.072549 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9762560-529f-41e5-8a82-5840434a3d10" containerName="copy" Jan 22 14:51:57 crc kubenswrapper[4743]: E0122 14:51:57.072586 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9f82082-5738-4558-90f0-19031c102c11" containerName="extract-content" Jan 22 14:51:57 crc kubenswrapper[4743]: I0122 14:51:57.072598 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9f82082-5738-4558-90f0-19031c102c11" containerName="extract-content" Jan 22 14:51:57 crc kubenswrapper[4743]: E0122 14:51:57.072626 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9f82082-5738-4558-90f0-19031c102c11" containerName="extract-utilities" Jan 22 14:51:57 crc kubenswrapper[4743]: I0122 14:51:57.072637 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9f82082-5738-4558-90f0-19031c102c11" containerName="extract-utilities" Jan 22 14:51:57 crc kubenswrapper[4743]: I0122 14:51:57.072981 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9f82082-5738-4558-90f0-19031c102c11" containerName="registry-server" Jan 22 14:51:57 crc kubenswrapper[4743]: I0122 14:51:57.073013 4743 
memory_manager.go:354] "RemoveStaleState removing state" podUID="a9762560-529f-41e5-8a82-5840434a3d10" containerName="copy" Jan 22 14:51:57 crc kubenswrapper[4743]: I0122 14:51:57.073037 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9762560-529f-41e5-8a82-5840434a3d10" containerName="gather" Jan 22 14:51:57 crc kubenswrapper[4743]: I0122 14:51:57.077519 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4kmlc" Jan 22 14:51:57 crc kubenswrapper[4743]: I0122 14:51:57.093539 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4kmlc"] Jan 22 14:51:57 crc kubenswrapper[4743]: I0122 14:51:57.213381 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e614181-bcd8-450c-95be-5e082cb0b67c-catalog-content\") pod \"redhat-operators-4kmlc\" (UID: \"1e614181-bcd8-450c-95be-5e082cb0b67c\") " pod="openshift-marketplace/redhat-operators-4kmlc" Jan 22 14:51:57 crc kubenswrapper[4743]: I0122 14:51:57.213574 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmsqt\" (UniqueName: \"kubernetes.io/projected/1e614181-bcd8-450c-95be-5e082cb0b67c-kube-api-access-pmsqt\") pod \"redhat-operators-4kmlc\" (UID: \"1e614181-bcd8-450c-95be-5e082cb0b67c\") " pod="openshift-marketplace/redhat-operators-4kmlc" Jan 22 14:51:57 crc kubenswrapper[4743]: I0122 14:51:57.214078 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e614181-bcd8-450c-95be-5e082cb0b67c-utilities\") pod \"redhat-operators-4kmlc\" (UID: \"1e614181-bcd8-450c-95be-5e082cb0b67c\") " pod="openshift-marketplace/redhat-operators-4kmlc" Jan 22 14:51:57 crc kubenswrapper[4743]: I0122 14:51:57.315753 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e614181-bcd8-450c-95be-5e082cb0b67c-utilities\") pod \"redhat-operators-4kmlc\" (UID: \"1e614181-bcd8-450c-95be-5e082cb0b67c\") " pod="openshift-marketplace/redhat-operators-4kmlc" Jan 22 14:51:57 crc kubenswrapper[4743]: I0122 14:51:57.315860 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e614181-bcd8-450c-95be-5e082cb0b67c-catalog-content\") pod \"redhat-operators-4kmlc\" (UID: \"1e614181-bcd8-450c-95be-5e082cb0b67c\") " pod="openshift-marketplace/redhat-operators-4kmlc" Jan 22 14:51:57 crc kubenswrapper[4743]: I0122 14:51:57.315921 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmsqt\" (UniqueName: \"kubernetes.io/projected/1e614181-bcd8-450c-95be-5e082cb0b67c-kube-api-access-pmsqt\") pod \"redhat-operators-4kmlc\" (UID: \"1e614181-bcd8-450c-95be-5e082cb0b67c\") " pod="openshift-marketplace/redhat-operators-4kmlc" Jan 22 14:51:57 crc kubenswrapper[4743]: I0122 14:51:57.316277 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e614181-bcd8-450c-95be-5e082cb0b67c-utilities\") pod \"redhat-operators-4kmlc\" (UID: \"1e614181-bcd8-450c-95be-5e082cb0b67c\") " pod="openshift-marketplace/redhat-operators-4kmlc" Jan 22 14:51:57 crc kubenswrapper[4743]: I0122 14:51:57.316481 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e614181-bcd8-450c-95be-5e082cb0b67c-catalog-content\") pod \"redhat-operators-4kmlc\" (UID: \"1e614181-bcd8-450c-95be-5e082cb0b67c\") " pod="openshift-marketplace/redhat-operators-4kmlc" Jan 22 14:51:57 crc kubenswrapper[4743]: I0122 14:51:57.340729 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmsqt\" (UniqueName: \"kubernetes.io/projected/1e614181-bcd8-450c-95be-5e082cb0b67c-kube-api-access-pmsqt\") pod \"redhat-operators-4kmlc\" (UID: \"1e614181-bcd8-450c-95be-5e082cb0b67c\") " pod="openshift-marketplace/redhat-operators-4kmlc" Jan 22 14:51:57 crc kubenswrapper[4743]: I0122 14:51:57.397935 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4kmlc" Jan 22 14:51:57 crc kubenswrapper[4743]: I0122 14:51:57.856753 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4kmlc"] Jan 22 14:51:57 crc kubenswrapper[4743]: I0122 14:51:57.944258 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4kmlc" event={"ID":"1e614181-bcd8-450c-95be-5e082cb0b67c","Type":"ContainerStarted","Data":"81699dd1e91b6db5f28a1fc4d5bafe5e8296d8bafe1d443ebfebc1aba7d3eb1a"} Jan 22 14:51:58 crc kubenswrapper[4743]: I0122 14:51:58.955100 4743 generic.go:334] "Generic (PLEG): container finished" podID="1e614181-bcd8-450c-95be-5e082cb0b67c" containerID="fe7d0498102b0b7fd8dbbf9a430539d0dcf99cff8acafd3ef458135b7dab1571" exitCode=0 Jan 22 14:51:58 crc kubenswrapper[4743]: I0122 14:51:58.955181 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4kmlc" event={"ID":"1e614181-bcd8-450c-95be-5e082cb0b67c","Type":"ContainerDied","Data":"fe7d0498102b0b7fd8dbbf9a430539d0dcf99cff8acafd3ef458135b7dab1571"} Jan 22 14:51:58 crc kubenswrapper[4743]: I0122 14:51:58.957957 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 22 14:52:00 crc kubenswrapper[4743]: I0122 14:52:00.974553 4743 generic.go:334] "Generic (PLEG): container finished" podID="1e614181-bcd8-450c-95be-5e082cb0b67c" containerID="e4d3c03d133e9d20cbc685e416cbec4459606c6ef122f2c3e996162c19492420" exitCode=0 Jan 22 14:52:00 crc kubenswrapper[4743]: I0122 14:52:00.974634 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4kmlc" event={"ID":"1e614181-bcd8-450c-95be-5e082cb0b67c","Type":"ContainerDied","Data":"e4d3c03d133e9d20cbc685e416cbec4459606c6ef122f2c3e996162c19492420"} Jan 22 14:52:01 crc kubenswrapper[4743]: I0122 14:52:01.986596 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4kmlc" event={"ID":"1e614181-bcd8-450c-95be-5e082cb0b67c","Type":"ContainerStarted","Data":"e692ec2502355420b5c6abbc625b5ece59d866aad2999380de128641c1d0c0f6"} Jan 22 14:52:02 crc kubenswrapper[4743]: I0122 14:52:02.010925 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4kmlc" podStartSLOduration=2.548493186 podStartE2EDuration="5.010906908s" podCreationTimestamp="2026-01-22 14:51:57 +0000 UTC" firstStartedPulling="2026-01-22 14:51:58.957717968 +0000 UTC m=+3955.512761131" lastFinishedPulling="2026-01-22 14:52:01.42013169 +0000 UTC m=+3957.975174853" observedRunningTime="2026-01-22 14:52:02.008028911 +0000 UTC m=+3958.563072084" 
watchObservedRunningTime="2026-01-22 14:52:02.010906908 +0000 UTC m=+3958.565950071" Jan 22 14:52:07 crc kubenswrapper[4743]: I0122 14:52:07.398518 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4kmlc" Jan 22 14:52:07 crc kubenswrapper[4743]: I0122 14:52:07.398966 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4kmlc" Jan 22 14:52:07 crc kubenswrapper[4743]: I0122 14:52:07.448059 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4kmlc" Jan 22 14:52:08 crc kubenswrapper[4743]: I0122 14:52:08.112562 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4kmlc" Jan 22 14:52:08 crc kubenswrapper[4743]: I0122 14:52:08.158671 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4kmlc"] Jan 22 14:52:10 crc kubenswrapper[4743]: I0122 14:52:10.082525 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4kmlc" podUID="1e614181-bcd8-450c-95be-5e082cb0b67c" containerName="registry-server" containerID="cri-o://e692ec2502355420b5c6abbc625b5ece59d866aad2999380de128641c1d0c0f6" gracePeriod=2 Jan 22 14:52:11 crc kubenswrapper[4743]: I0122 14:52:11.093662 4743 generic.go:334] "Generic (PLEG): container finished" podID="1e614181-bcd8-450c-95be-5e082cb0b67c" containerID="e692ec2502355420b5c6abbc625b5ece59d866aad2999380de128641c1d0c0f6" exitCode=0 Jan 22 14:52:11 crc kubenswrapper[4743]: I0122 14:52:11.093722 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4kmlc" event={"ID":"1e614181-bcd8-450c-95be-5e082cb0b67c","Type":"ContainerDied","Data":"e692ec2502355420b5c6abbc625b5ece59d866aad2999380de128641c1d0c0f6"} Jan 22 14:52:11 crc kubenswrapper[4743]: I0122 14:52:11.094240 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4kmlc" event={"ID":"1e614181-bcd8-450c-95be-5e082cb0b67c","Type":"ContainerDied","Data":"81699dd1e91b6db5f28a1fc4d5bafe5e8296d8bafe1d443ebfebc1aba7d3eb1a"} Jan 22 14:52:11 crc kubenswrapper[4743]: I0122 14:52:11.094273 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81699dd1e91b6db5f28a1fc4d5bafe5e8296d8bafe1d443ebfebc1aba7d3eb1a" Jan 22 14:52:11 crc kubenswrapper[4743]: I0122 14:52:11.118674 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4kmlc" Jan 22 14:52:11 crc kubenswrapper[4743]: I0122 14:52:11.294555 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e614181-bcd8-450c-95be-5e082cb0b67c-utilities\") pod \"1e614181-bcd8-450c-95be-5e082cb0b67c\" (UID: \"1e614181-bcd8-450c-95be-5e082cb0b67c\") " Jan 22 14:52:11 crc kubenswrapper[4743]: I0122 14:52:11.294862 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e614181-bcd8-450c-95be-5e082cb0b67c-catalog-content\") pod \"1e614181-bcd8-450c-95be-5e082cb0b67c\" (UID: \"1e614181-bcd8-450c-95be-5e082cb0b67c\") " Jan 22 14:52:11 crc kubenswrapper[4743]: I0122 14:52:11.294892 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmsqt\" (UniqueName: \"kubernetes.io/projected/1e614181-bcd8-450c-95be-5e082cb0b67c-kube-api-access-pmsqt\") pod \"1e614181-bcd8-450c-95be-5e082cb0b67c\" (UID: \"1e614181-bcd8-450c-95be-5e082cb0b67c\") " Jan 22 14:52:11 crc kubenswrapper[4743]: I0122 14:52:11.296355 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e614181-bcd8-450c-95be-5e082cb0b67c-utilities" (OuterVolumeSpecName: "utilities") pod "1e614181-bcd8-450c-95be-5e082cb0b67c" (UID: "1e614181-bcd8-450c-95be-5e082cb0b67c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:52:11 crc kubenswrapper[4743]: I0122 14:52:11.300740 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e614181-bcd8-450c-95be-5e082cb0b67c-kube-api-access-pmsqt" (OuterVolumeSpecName: "kube-api-access-pmsqt") pod "1e614181-bcd8-450c-95be-5e082cb0b67c" (UID: "1e614181-bcd8-450c-95be-5e082cb0b67c"). InnerVolumeSpecName "kube-api-access-pmsqt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:52:11 crc kubenswrapper[4743]: I0122 14:52:11.396906 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmsqt\" (UniqueName: \"kubernetes.io/projected/1e614181-bcd8-450c-95be-5e082cb0b67c-kube-api-access-pmsqt\") on node \"crc\" DevicePath \"\"" Jan 22 14:52:11 crc kubenswrapper[4743]: I0122 14:52:11.396940 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e614181-bcd8-450c-95be-5e082cb0b67c-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 14:52:12 crc kubenswrapper[4743]: I0122 14:52:12.100352 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4kmlc" Jan 22 14:52:12 crc kubenswrapper[4743]: I0122 14:52:12.364819 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e614181-bcd8-450c-95be-5e082cb0b67c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1e614181-bcd8-450c-95be-5e082cb0b67c" (UID: "1e614181-bcd8-450c-95be-5e082cb0b67c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:52:12 crc kubenswrapper[4743]: I0122 14:52:12.413016 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e614181-bcd8-450c-95be-5e082cb0b67c-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 14:52:12 crc kubenswrapper[4743]: I0122 14:52:12.435858 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4kmlc"] Jan 22 14:52:12 crc kubenswrapper[4743]: I0122 14:52:12.445521 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4kmlc"] Jan 22 14:52:13 crc kubenswrapper[4743]: I0122 14:52:13.786424 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e614181-bcd8-450c-95be-5e082cb0b67c" path="/var/lib/kubelet/pods/1e614181-bcd8-450c-95be-5e082cb0b67c/volumes" Jan 22 14:52:30 crc kubenswrapper[4743]: I0122 14:52:30.049357 4743 patch_prober.go:28] interesting pod/machine-config-daemon-hqgk7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 14:52:30 crc kubenswrapper[4743]: I0122 14:52:30.050389 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 14:52:46 crc kubenswrapper[4743]: I0122 14:52:46.482846 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-t8dcr/must-gather-krzft"] Jan 22 14:52:46 crc kubenswrapper[4743]: E0122 14:52:46.484011 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e614181-bcd8-450c-95be-5e082cb0b67c" containerName="extract-content" Jan 22 14:52:46 crc kubenswrapper[4743]: I0122 14:52:46.484036 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e614181-bcd8-450c-95be-5e082cb0b67c" containerName="extract-content" Jan 22 14:52:46 crc kubenswrapper[4743]: E0122 14:52:46.484095 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e614181-bcd8-450c-95be-5e082cb0b67c" containerName="extract-utilities" Jan 22 14:52:46 crc kubenswrapper[4743]: I0122 14:52:46.484108 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e614181-bcd8-450c-95be-5e082cb0b67c" containerName="extract-utilities" Jan 22 14:52:46 crc kubenswrapper[4743]: E0122 14:52:46.484183 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e614181-bcd8-450c-95be-5e082cb0b67c" containerName="registry-server" Jan 22 14:52:46 crc kubenswrapper[4743]: I0122 14:52:46.484226 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e614181-bcd8-450c-95be-5e082cb0b67c" containerName="registry-server" Jan 22 14:52:46 crc kubenswrapper[4743]: I0122 14:52:46.484648 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e614181-bcd8-450c-95be-5e082cb0b67c" containerName="registry-server" Jan 22 14:52:46 crc kubenswrapper[4743]: I0122 14:52:46.485667 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-t8dcr/must-gather-krzft" Jan 22 14:52:46 crc kubenswrapper[4743]: I0122 14:52:46.488982 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-t8dcr"/"openshift-service-ca.crt" Jan 22 14:52:46 crc kubenswrapper[4743]: I0122 14:52:46.491353 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-t8dcr"/"kube-root-ca.crt" Jan 22 14:52:46 crc kubenswrapper[4743]: I0122 14:52:46.509034 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-t8dcr/must-gather-krzft"] Jan 22 14:52:46 crc kubenswrapper[4743]: I0122 14:52:46.534607 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/599b58e0-a9f5-49ff-ae29-7ab848cd6f88-must-gather-output\") pod \"must-gather-krzft\" (UID: \"599b58e0-a9f5-49ff-ae29-7ab848cd6f88\") " pod="openshift-must-gather-t8dcr/must-gather-krzft" Jan 22 14:52:46 crc kubenswrapper[4743]: I0122 14:52:46.534678 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wb9br\" (UniqueName: \"kubernetes.io/projected/599b58e0-a9f5-49ff-ae29-7ab848cd6f88-kube-api-access-wb9br\") pod \"must-gather-krzft\" (UID: \"599b58e0-a9f5-49ff-ae29-7ab848cd6f88\") " pod="openshift-must-gather-t8dcr/must-gather-krzft" Jan 22 14:52:46 crc kubenswrapper[4743]: I0122 14:52:46.636853 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/599b58e0-a9f5-49ff-ae29-7ab848cd6f88-must-gather-output\") pod \"must-gather-krzft\" (UID: \"599b58e0-a9f5-49ff-ae29-7ab848cd6f88\") " pod="openshift-must-gather-t8dcr/must-gather-krzft" Jan 22 14:52:46 crc kubenswrapper[4743]: I0122 14:52:46.636900 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wb9br\" (UniqueName: \"kubernetes.io/projected/599b58e0-a9f5-49ff-ae29-7ab848cd6f88-kube-api-access-wb9br\") pod \"must-gather-krzft\" (UID: \"599b58e0-a9f5-49ff-ae29-7ab848cd6f88\") " pod="openshift-must-gather-t8dcr/must-gather-krzft" Jan 22 14:52:46 crc kubenswrapper[4743]: I0122 14:52:46.637284 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/599b58e0-a9f5-49ff-ae29-7ab848cd6f88-must-gather-output\") pod \"must-gather-krzft\" (UID: \"599b58e0-a9f5-49ff-ae29-7ab848cd6f88\") " pod="openshift-must-gather-t8dcr/must-gather-krzft" Jan 22 14:52:46 crc kubenswrapper[4743]: I0122 14:52:46.659628 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wb9br\" (UniqueName: \"kubernetes.io/projected/599b58e0-a9f5-49ff-ae29-7ab848cd6f88-kube-api-access-wb9br\") pod \"must-gather-krzft\" (UID: \"599b58e0-a9f5-49ff-ae29-7ab848cd6f88\") " pod="openshift-must-gather-t8dcr/must-gather-krzft" Jan 22 14:52:46 crc kubenswrapper[4743]: I0122 14:52:46.804533 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-t8dcr/must-gather-krzft" Jan 22 14:52:47 crc kubenswrapper[4743]: I0122 14:52:47.242619 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-t8dcr/must-gather-krzft"] Jan 22 14:52:47 crc kubenswrapper[4743]: I0122 14:52:47.457909 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t8dcr/must-gather-krzft" event={"ID":"599b58e0-a9f5-49ff-ae29-7ab848cd6f88","Type":"ContainerStarted","Data":"d61f5969cf8261b6fc26153694492d906148bb215a6ac561d53ba3650c4e8b00"} Jan 22 14:52:48 crc kubenswrapper[4743]: I0122 14:52:48.482507 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t8dcr/must-gather-krzft" event={"ID":"599b58e0-a9f5-49ff-ae29-7ab848cd6f88","Type":"ContainerStarted","Data":"73d9e6ce220e18d7f1909cea0bfe0cac91d7e5997047cf39943386557a852b24"} Jan 22 14:52:48 crc kubenswrapper[4743]: I0122 14:52:48.483302 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t8dcr/must-gather-krzft" event={"ID":"599b58e0-a9f5-49ff-ae29-7ab848cd6f88","Type":"ContainerStarted","Data":"773c5445e3e50c0ffd920c13ba7ca5f4c704b081fa2b08b644996cdde94d8684"} Jan 22 14:52:48 crc kubenswrapper[4743]: I0122 14:52:48.522077 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-t8dcr/must-gather-krzft" podStartSLOduration=2.522030265 podStartE2EDuration="2.522030265s" podCreationTimestamp="2026-01-22 14:52:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:52:48.503565421 +0000 UTC m=+4005.058608584" watchObservedRunningTime="2026-01-22 14:52:48.522030265 +0000 UTC m=+4005.077073438" Jan 22 14:52:51 crc kubenswrapper[4743]: I0122 14:52:51.588037 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-t8dcr/crc-debug-w92tv"] Jan 22 14:52:51 crc kubenswrapper[4743]: I0122 14:52:51.589758 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-t8dcr/crc-debug-w92tv" Jan 22 14:52:51 crc kubenswrapper[4743]: I0122 14:52:51.591823 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-t8dcr"/"default-dockercfg-pvc5d" Jan 22 14:52:51 crc kubenswrapper[4743]: I0122 14:52:51.633956 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/53e6a2d4-7deb-42c8-91c0-eb72becbf1eb-host\") pod \"crc-debug-w92tv\" (UID: \"53e6a2d4-7deb-42c8-91c0-eb72becbf1eb\") " pod="openshift-must-gather-t8dcr/crc-debug-w92tv" Jan 22 14:52:51 crc kubenswrapper[4743]: I0122 14:52:51.634038 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hq49\" (UniqueName: \"kubernetes.io/projected/53e6a2d4-7deb-42c8-91c0-eb72becbf1eb-kube-api-access-9hq49\") pod \"crc-debug-w92tv\" (UID: \"53e6a2d4-7deb-42c8-91c0-eb72becbf1eb\") " pod="openshift-must-gather-t8dcr/crc-debug-w92tv" Jan 22 14:52:51 crc kubenswrapper[4743]: I0122 14:52:51.735573 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/53e6a2d4-7deb-42c8-91c0-eb72becbf1eb-host\") pod \"crc-debug-w92tv\" (UID: \"53e6a2d4-7deb-42c8-91c0-eb72becbf1eb\") " pod="openshift-must-gather-t8dcr/crc-debug-w92tv" Jan 22 14:52:51 crc kubenswrapper[4743]: I0122 14:52:51.735643 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hq49\" (UniqueName: \"kubernetes.io/projected/53e6a2d4-7deb-42c8-91c0-eb72becbf1eb-kube-api-access-9hq49\") pod \"crc-debug-w92tv\" (UID: \"53e6a2d4-7deb-42c8-91c0-eb72becbf1eb\") " pod="openshift-must-gather-t8dcr/crc-debug-w92tv" Jan 22 14:52:51 crc kubenswrapper[4743]: I0122 14:52:51.735704 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/53e6a2d4-7deb-42c8-91c0-eb72becbf1eb-host\") pod \"crc-debug-w92tv\" (UID: \"53e6a2d4-7deb-42c8-91c0-eb72becbf1eb\") " pod="openshift-must-gather-t8dcr/crc-debug-w92tv" Jan 22 14:52:51 crc kubenswrapper[4743]: I0122 14:52:51.767663 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hq49\" (UniqueName: \"kubernetes.io/projected/53e6a2d4-7deb-42c8-91c0-eb72becbf1eb-kube-api-access-9hq49\") pod \"crc-debug-w92tv\" (UID: \"53e6a2d4-7deb-42c8-91c0-eb72becbf1eb\") " pod="openshift-must-gather-t8dcr/crc-debug-w92tv" Jan 22 14:52:51 crc kubenswrapper[4743]: I0122 14:52:51.907244 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-t8dcr/crc-debug-w92tv" Jan 22 14:52:52 crc kubenswrapper[4743]: I0122 14:52:52.516761 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t8dcr/crc-debug-w92tv" event={"ID":"53e6a2d4-7deb-42c8-91c0-eb72becbf1eb","Type":"ContainerStarted","Data":"e782eaa9c90f52ed5030624c4fa569f8ced031e9746c42aeb8368271acf28142"} Jan 22 14:52:52 crc kubenswrapper[4743]: I0122 14:52:52.517464 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t8dcr/crc-debug-w92tv" event={"ID":"53e6a2d4-7deb-42c8-91c0-eb72becbf1eb","Type":"ContainerStarted","Data":"542f1af8935a8837fca61afa1184f6ffa6d013cb5a8110f21ce252eff73caee0"} Jan 22 14:52:52 crc kubenswrapper[4743]: I0122 14:52:52.544309 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-t8dcr/crc-debug-w92tv" podStartSLOduration=1.544292784 podStartE2EDuration="1.544292784s" podCreationTimestamp="2026-01-22 14:52:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:52:52.539025163 +0000 UTC m=+4009.094068336" watchObservedRunningTime="2026-01-22 14:52:52.544292784 +0000 UTC m=+4009.099335947" Jan 22 14:53:00 crc kubenswrapper[4743]: I0122 14:53:00.049275 4743 patch_prober.go:28] interesting pod/machine-config-daemon-hqgk7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 14:53:00 crc kubenswrapper[4743]: I0122 14:53:00.049861 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 14:53:12 crc kubenswrapper[4743]: I0122 14:53:12.071246 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jbwvj"] Jan 22 14:53:12 crc kubenswrapper[4743]: I0122 14:53:12.074108 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jbwvj" Jan 22 14:53:12 crc kubenswrapper[4743]: I0122 14:53:12.084004 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jbwvj"] Jan 22 14:53:12 crc kubenswrapper[4743]: I0122 14:53:12.135194 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6aeb2b00-6a7d-4bb3-ab20-a96079f19f63-catalog-content\") pod \"community-operators-jbwvj\" (UID: \"6aeb2b00-6a7d-4bb3-ab20-a96079f19f63\") " pod="openshift-marketplace/community-operators-jbwvj" Jan 22 14:53:12 crc kubenswrapper[4743]: I0122 14:53:12.135260 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6aeb2b00-6a7d-4bb3-ab20-a96079f19f63-utilities\") pod \"community-operators-jbwvj\" (UID: \"6aeb2b00-6a7d-4bb3-ab20-a96079f19f63\") " pod="openshift-marketplace/community-operators-jbwvj" Jan 22 14:53:12 crc kubenswrapper[4743]: I0122 14:53:12.135296 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4fbz\" (UniqueName: \"kubernetes.io/projected/6aeb2b00-6a7d-4bb3-ab20-a96079f19f63-kube-api-access-h4fbz\") pod \"community-operators-jbwvj\" (UID: \"6aeb2b00-6a7d-4bb3-ab20-a96079f19f63\") " pod="openshift-marketplace/community-operators-jbwvj" Jan 22 14:53:12 crc kubenswrapper[4743]: I0122 14:53:12.236580 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6aeb2b00-6a7d-4bb3-ab20-a96079f19f63-catalog-content\") pod \"community-operators-jbwvj\" (UID: \"6aeb2b00-6a7d-4bb3-ab20-a96079f19f63\") " pod="openshift-marketplace/community-operators-jbwvj" Jan 22 14:53:12 crc kubenswrapper[4743]: I0122 14:53:12.236653 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6aeb2b00-6a7d-4bb3-ab20-a96079f19f63-utilities\") pod \"community-operators-jbwvj\" (UID: \"6aeb2b00-6a7d-4bb3-ab20-a96079f19f63\") " pod="openshift-marketplace/community-operators-jbwvj" Jan 22 14:53:12 crc kubenswrapper[4743]: I0122 14:53:12.236697 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4fbz\" (UniqueName: \"kubernetes.io/projected/6aeb2b00-6a7d-4bb3-ab20-a96079f19f63-kube-api-access-h4fbz\") pod \"community-operators-jbwvj\" (UID: \"6aeb2b00-6a7d-4bb3-ab20-a96079f19f63\") " pod="openshift-marketplace/community-operators-jbwvj" Jan 22 14:53:12 crc kubenswrapper[4743]: I0122 14:53:12.237161 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6aeb2b00-6a7d-4bb3-ab20-a96079f19f63-catalog-content\") pod \"community-operators-jbwvj\" (UID: \"6aeb2b00-6a7d-4bb3-ab20-a96079f19f63\") " pod="openshift-marketplace/community-operators-jbwvj" Jan 22 14:53:12 crc kubenswrapper[4743]: I0122 14:53:12.237197 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6aeb2b00-6a7d-4bb3-ab20-a96079f19f63-utilities\") pod \"community-operators-jbwvj\" (UID: \"6aeb2b00-6a7d-4bb3-ab20-a96079f19f63\") " pod="openshift-marketplace/community-operators-jbwvj" Jan 22 14:53:12 crc kubenswrapper[4743]: I0122 14:53:12.257470 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-h4fbz\" (UniqueName: \"kubernetes.io/projected/6aeb2b00-6a7d-4bb3-ab20-a96079f19f63-kube-api-access-h4fbz\") pod \"community-operators-jbwvj\" (UID: \"6aeb2b00-6a7d-4bb3-ab20-a96079f19f63\") " pod="openshift-marketplace/community-operators-jbwvj" Jan 22 14:53:12 crc kubenswrapper[4743]: I0122 14:53:12.396638 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jbwvj" Jan 22 14:53:12 crc kubenswrapper[4743]: I0122 14:53:12.915597 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jbwvj"] Jan 22 14:53:13 crc kubenswrapper[4743]: I0122 14:53:13.714986 4743 generic.go:334] "Generic (PLEG): container finished" podID="6aeb2b00-6a7d-4bb3-ab20-a96079f19f63" containerID="15336cfa873a005847c5d4cae0860e3fe3a5e530edf23a21a609657d6bfc6bd7" exitCode=0 Jan 22 14:53:13 crc kubenswrapper[4743]: I0122 14:53:13.715272 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jbwvj" event={"ID":"6aeb2b00-6a7d-4bb3-ab20-a96079f19f63","Type":"ContainerDied","Data":"15336cfa873a005847c5d4cae0860e3fe3a5e530edf23a21a609657d6bfc6bd7"} Jan 22 14:53:13 crc kubenswrapper[4743]: I0122 14:53:13.715302 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jbwvj" event={"ID":"6aeb2b00-6a7d-4bb3-ab20-a96079f19f63","Type":"ContainerStarted","Data":"4f7faa402324246329a17bdc42caa6ce7f5973bf1721783fbd140ddcabd7dbb9"} Jan 22 14:53:15 crc kubenswrapper[4743]: I0122 14:53:15.735931 4743 generic.go:334] "Generic (PLEG): container finished" podID="6aeb2b00-6a7d-4bb3-ab20-a96079f19f63" containerID="9c066a728ac60a6375c3cb7d4403947e23fe9c44f2a2532e9435ccd10b02d278" exitCode=0 Jan 22 14:53:15 crc kubenswrapper[4743]: I0122 14:53:15.738717 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jbwvj" event={"ID":"6aeb2b00-6a7d-4bb3-ab20-a96079f19f63","Type":"ContainerDied","Data":"9c066a728ac60a6375c3cb7d4403947e23fe9c44f2a2532e9435ccd10b02d278"} Jan 22 14:53:16 crc kubenswrapper[4743]: I0122 14:53:16.746664 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jbwvj" event={"ID":"6aeb2b00-6a7d-4bb3-ab20-a96079f19f63","Type":"ContainerStarted","Data":"983486291c08ebf9916faac06bb3ea41a76a4ae6fea4f37a9581ebc8341cf05e"} Jan 22 14:53:16 crc kubenswrapper[4743]: I0122 14:53:16.771386 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jbwvj" podStartSLOduration=2.245252769 podStartE2EDuration="4.771366426s" podCreationTimestamp="2026-01-22 14:53:12 +0000 UTC" firstStartedPulling="2026-01-22 14:53:13.717245431 +0000 UTC m=+4030.272288594" lastFinishedPulling="2026-01-22 14:53:16.243359098 +0000 UTC m=+4032.798402251" observedRunningTime="2026-01-22 14:53:16.771171751 +0000 UTC m=+4033.326214924" watchObservedRunningTime="2026-01-22 14:53:16.771366426 +0000 UTC m=+4033.326409589" Jan 22 14:53:22 crc kubenswrapper[4743]: I0122 14:53:22.397161 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jbwvj" Jan 22 14:53:22 crc kubenswrapper[4743]: I0122 14:53:22.397733 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jbwvj" Jan 22 14:53:22 crc kubenswrapper[4743]: I0122 14:53:22.448217 4743 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jbwvj" Jan 22 14:53:22 crc kubenswrapper[4743]: I0122 14:53:22.845170 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jbwvj" Jan 22 14:53:22 crc kubenswrapper[4743]: I0122 14:53:22.909038 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jbwvj"] Jan 22 14:53:24 crc kubenswrapper[4743]: I0122 14:53:24.811577 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jbwvj" podUID="6aeb2b00-6a7d-4bb3-ab20-a96079f19f63" containerName="registry-server" containerID="cri-o://983486291c08ebf9916faac06bb3ea41a76a4ae6fea4f37a9581ebc8341cf05e" gracePeriod=2 Jan 22 14:53:25 crc kubenswrapper[4743]: I0122 14:53:25.294694 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jbwvj" Jan 22 14:53:25 crc kubenswrapper[4743]: I0122 14:53:25.391499 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6aeb2b00-6a7d-4bb3-ab20-a96079f19f63-catalog-content\") pod \"6aeb2b00-6a7d-4bb3-ab20-a96079f19f63\" (UID: \"6aeb2b00-6a7d-4bb3-ab20-a96079f19f63\") " Jan 22 14:53:25 crc kubenswrapper[4743]: I0122 14:53:25.391574 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4fbz\" (UniqueName: \"kubernetes.io/projected/6aeb2b00-6a7d-4bb3-ab20-a96079f19f63-kube-api-access-h4fbz\") pod \"6aeb2b00-6a7d-4bb3-ab20-a96079f19f63\" (UID: \"6aeb2b00-6a7d-4bb3-ab20-a96079f19f63\") " Jan 22 14:53:25 crc kubenswrapper[4743]: I0122 14:53:25.391616 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6aeb2b00-6a7d-4bb3-ab20-a96079f19f63-utilities\") pod \"6aeb2b00-6a7d-4bb3-ab20-a96079f19f63\" (UID: \"6aeb2b00-6a7d-4bb3-ab20-a96079f19f63\") " Jan 22 14:53:25 crc kubenswrapper[4743]: I0122 14:53:25.392935 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6aeb2b00-6a7d-4bb3-ab20-a96079f19f63-utilities" (OuterVolumeSpecName: "utilities") pod "6aeb2b00-6a7d-4bb3-ab20-a96079f19f63" (UID: "6aeb2b00-6a7d-4bb3-ab20-a96079f19f63"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:53:25 crc kubenswrapper[4743]: I0122 14:53:25.399025 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6aeb2b00-6a7d-4bb3-ab20-a96079f19f63-kube-api-access-h4fbz" (OuterVolumeSpecName: "kube-api-access-h4fbz") pod "6aeb2b00-6a7d-4bb3-ab20-a96079f19f63" (UID: "6aeb2b00-6a7d-4bb3-ab20-a96079f19f63"). InnerVolumeSpecName "kube-api-access-h4fbz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:53:25 crc kubenswrapper[4743]: I0122 14:53:25.458824 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6aeb2b00-6a7d-4bb3-ab20-a96079f19f63-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6aeb2b00-6a7d-4bb3-ab20-a96079f19f63" (UID: "6aeb2b00-6a7d-4bb3-ab20-a96079f19f63"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:53:25 crc kubenswrapper[4743]: I0122 14:53:25.493660 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6aeb2b00-6a7d-4bb3-ab20-a96079f19f63-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 14:53:25 crc kubenswrapper[4743]: I0122 14:53:25.493696 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4fbz\" (UniqueName: \"kubernetes.io/projected/6aeb2b00-6a7d-4bb3-ab20-a96079f19f63-kube-api-access-h4fbz\") on node \"crc\" DevicePath \"\"" Jan 22 14:53:25 crc kubenswrapper[4743]: I0122 14:53:25.493709 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6aeb2b00-6a7d-4bb3-ab20-a96079f19f63-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 14:53:25 crc kubenswrapper[4743]: I0122 14:53:25.823656 4743 generic.go:334] "Generic (PLEG): container finished" podID="6aeb2b00-6a7d-4bb3-ab20-a96079f19f63" containerID="983486291c08ebf9916faac06bb3ea41a76a4ae6fea4f37a9581ebc8341cf05e" exitCode=0 Jan 22 14:53:25 crc kubenswrapper[4743]: I0122 14:53:25.824565 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jbwvj" Jan 22 14:53:25 crc kubenswrapper[4743]: I0122 14:53:25.825311 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jbwvj" event={"ID":"6aeb2b00-6a7d-4bb3-ab20-a96079f19f63","Type":"ContainerDied","Data":"983486291c08ebf9916faac06bb3ea41a76a4ae6fea4f37a9581ebc8341cf05e"} Jan 22 14:53:25 crc kubenswrapper[4743]: I0122 14:53:25.825378 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jbwvj" event={"ID":"6aeb2b00-6a7d-4bb3-ab20-a96079f19f63","Type":"ContainerDied","Data":"4f7faa402324246329a17bdc42caa6ce7f5973bf1721783fbd140ddcabd7dbb9"} Jan 22 14:53:25 crc kubenswrapper[4743]: I0122 14:53:25.825425 4743 scope.go:117] "RemoveContainer" containerID="983486291c08ebf9916faac06bb3ea41a76a4ae6fea4f37a9581ebc8341cf05e" Jan 22 14:53:25 crc kubenswrapper[4743]: I0122 14:53:25.830750 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t8dcr/crc-debug-w92tv" event={"ID":"53e6a2d4-7deb-42c8-91c0-eb72becbf1eb","Type":"ContainerDied","Data":"e782eaa9c90f52ed5030624c4fa569f8ced031e9746c42aeb8368271acf28142"} Jan 22 14:53:25 crc kubenswrapper[4743]: I0122 14:53:25.830694 4743 generic.go:334] "Generic (PLEG): container finished" podID="53e6a2d4-7deb-42c8-91c0-eb72becbf1eb" containerID="e782eaa9c90f52ed5030624c4fa569f8ced031e9746c42aeb8368271acf28142" exitCode=0 Jan 22 14:53:25 crc kubenswrapper[4743]: I0122 14:53:25.866965 4743 scope.go:117] "RemoveContainer" containerID="9c066a728ac60a6375c3cb7d4403947e23fe9c44f2a2532e9435ccd10b02d278" Jan 22 14:53:25 crc kubenswrapper[4743]: I0122 14:53:25.872314 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jbwvj"] Jan 22 14:53:25 crc kubenswrapper[4743]: I0122 14:53:25.886942 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jbwvj"] Jan 22 14:53:25 crc kubenswrapper[4743]: I0122 14:53:25.906937 4743 scope.go:117] "RemoveContainer" containerID="15336cfa873a005847c5d4cae0860e3fe3a5e530edf23a21a609657d6bfc6bd7" Jan 22 14:53:25 crc kubenswrapper[4743]: I0122 14:53:25.949054 4743 scope.go:117] "RemoveContainer" 
containerID="983486291c08ebf9916faac06bb3ea41a76a4ae6fea4f37a9581ebc8341cf05e" Jan 22 14:53:25 crc kubenswrapper[4743]: E0122 14:53:25.949511 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"983486291c08ebf9916faac06bb3ea41a76a4ae6fea4f37a9581ebc8341cf05e\": container with ID starting with 983486291c08ebf9916faac06bb3ea41a76a4ae6fea4f37a9581ebc8341cf05e not found: ID does not exist" containerID="983486291c08ebf9916faac06bb3ea41a76a4ae6fea4f37a9581ebc8341cf05e" Jan 22 14:53:25 crc kubenswrapper[4743]: I0122 14:53:25.949558 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"983486291c08ebf9916faac06bb3ea41a76a4ae6fea4f37a9581ebc8341cf05e"} err="failed to get container status \"983486291c08ebf9916faac06bb3ea41a76a4ae6fea4f37a9581ebc8341cf05e\": rpc error: code = NotFound desc = could not find container \"983486291c08ebf9916faac06bb3ea41a76a4ae6fea4f37a9581ebc8341cf05e\": container with ID starting with 983486291c08ebf9916faac06bb3ea41a76a4ae6fea4f37a9581ebc8341cf05e not found: ID does not exist" Jan 22 14:53:25 crc kubenswrapper[4743]: I0122 14:53:25.949581 4743 scope.go:117] "RemoveContainer" containerID="9c066a728ac60a6375c3cb7d4403947e23fe9c44f2a2532e9435ccd10b02d278" Jan 22 14:53:25 crc kubenswrapper[4743]: E0122 14:53:25.950111 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c066a728ac60a6375c3cb7d4403947e23fe9c44f2a2532e9435ccd10b02d278\": container with ID starting with 9c066a728ac60a6375c3cb7d4403947e23fe9c44f2a2532e9435ccd10b02d278 not found: ID does not exist" containerID="9c066a728ac60a6375c3cb7d4403947e23fe9c44f2a2532e9435ccd10b02d278" Jan 22 14:53:25 crc kubenswrapper[4743]: I0122 14:53:25.950154 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c066a728ac60a6375c3cb7d4403947e23fe9c44f2a2532e9435ccd10b02d278"} err="failed to get container status \"9c066a728ac60a6375c3cb7d4403947e23fe9c44f2a2532e9435ccd10b02d278\": rpc error: code = NotFound desc = could not find container \"9c066a728ac60a6375c3cb7d4403947e23fe9c44f2a2532e9435ccd10b02d278\": container with ID starting with 9c066a728ac60a6375c3cb7d4403947e23fe9c44f2a2532e9435ccd10b02d278 not found: ID does not exist" Jan 22 14:53:25 crc kubenswrapper[4743]: I0122 14:53:25.950181 4743 scope.go:117] "RemoveContainer" containerID="15336cfa873a005847c5d4cae0860e3fe3a5e530edf23a21a609657d6bfc6bd7" Jan 22 14:53:25 crc kubenswrapper[4743]: E0122 14:53:25.950587 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15336cfa873a005847c5d4cae0860e3fe3a5e530edf23a21a609657d6bfc6bd7\": container with ID starting with 15336cfa873a005847c5d4cae0860e3fe3a5e530edf23a21a609657d6bfc6bd7 not found: ID does not exist" containerID="15336cfa873a005847c5d4cae0860e3fe3a5e530edf23a21a609657d6bfc6bd7" Jan 22 14:53:25 crc kubenswrapper[4743]: I0122 14:53:25.950615 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15336cfa873a005847c5d4cae0860e3fe3a5e530edf23a21a609657d6bfc6bd7"} err="failed to get container status \"15336cfa873a005847c5d4cae0860e3fe3a5e530edf23a21a609657d6bfc6bd7\": rpc error: code = NotFound desc = could not find container \"15336cfa873a005847c5d4cae0860e3fe3a5e530edf23a21a609657d6bfc6bd7\": container with ID starting with 
15336cfa873a005847c5d4cae0860e3fe3a5e530edf23a21a609657d6bfc6bd7 not found: ID does not exist" Jan 22 14:53:26 crc kubenswrapper[4743]: I0122 14:53:26.955619 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t8dcr/crc-debug-w92tv" Jan 22 14:53:26 crc kubenswrapper[4743]: I0122 14:53:26.989923 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-t8dcr/crc-debug-w92tv"] Jan 22 14:53:26 crc kubenswrapper[4743]: I0122 14:53:26.997834 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-t8dcr/crc-debug-w92tv"] Jan 22 14:53:27 crc kubenswrapper[4743]: I0122 14:53:27.022705 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/53e6a2d4-7deb-42c8-91c0-eb72becbf1eb-host\") pod \"53e6a2d4-7deb-42c8-91c0-eb72becbf1eb\" (UID: \"53e6a2d4-7deb-42c8-91c0-eb72becbf1eb\") " Jan 22 14:53:27 crc kubenswrapper[4743]: I0122 14:53:27.022945 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hq49\" (UniqueName: \"kubernetes.io/projected/53e6a2d4-7deb-42c8-91c0-eb72becbf1eb-kube-api-access-9hq49\") pod \"53e6a2d4-7deb-42c8-91c0-eb72becbf1eb\" (UID: \"53e6a2d4-7deb-42c8-91c0-eb72becbf1eb\") " Jan 22 14:53:27 crc kubenswrapper[4743]: I0122 14:53:27.022825 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/53e6a2d4-7deb-42c8-91c0-eb72becbf1eb-host" (OuterVolumeSpecName: "host") pod "53e6a2d4-7deb-42c8-91c0-eb72becbf1eb" (UID: "53e6a2d4-7deb-42c8-91c0-eb72becbf1eb"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 14:53:27 crc kubenswrapper[4743]: I0122 14:53:27.023428 4743 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/53e6a2d4-7deb-42c8-91c0-eb72becbf1eb-host\") on node \"crc\" DevicePath \"\"" Jan 22 14:53:27 crc kubenswrapper[4743]: I0122 14:53:27.040986 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53e6a2d4-7deb-42c8-91c0-eb72becbf1eb-kube-api-access-9hq49" (OuterVolumeSpecName: "kube-api-access-9hq49") pod "53e6a2d4-7deb-42c8-91c0-eb72becbf1eb" (UID: "53e6a2d4-7deb-42c8-91c0-eb72becbf1eb"). InnerVolumeSpecName "kube-api-access-9hq49". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:53:27 crc kubenswrapper[4743]: I0122 14:53:27.125690 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hq49\" (UniqueName: \"kubernetes.io/projected/53e6a2d4-7deb-42c8-91c0-eb72becbf1eb-kube-api-access-9hq49\") on node \"crc\" DevicePath \"\"" Jan 22 14:53:27 crc kubenswrapper[4743]: I0122 14:53:27.759455 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53e6a2d4-7deb-42c8-91c0-eb72becbf1eb" path="/var/lib/kubelet/pods/53e6a2d4-7deb-42c8-91c0-eb72becbf1eb/volumes" Jan 22 14:53:27 crc kubenswrapper[4743]: I0122 14:53:27.760445 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6aeb2b00-6a7d-4bb3-ab20-a96079f19f63" path="/var/lib/kubelet/pods/6aeb2b00-6a7d-4bb3-ab20-a96079f19f63/volumes" Jan 22 14:53:27 crc kubenswrapper[4743]: I0122 14:53:27.851692 4743 scope.go:117] "RemoveContainer" containerID="e782eaa9c90f52ed5030624c4fa569f8ced031e9746c42aeb8368271acf28142" Jan 22 14:53:27 crc kubenswrapper[4743]: I0122 14:53:27.851731 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-t8dcr/crc-debug-w92tv" Jan 22 14:53:28 crc kubenswrapper[4743]: I0122 14:53:28.159279 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-t8dcr/crc-debug-hbwrz"] Jan 22 14:53:28 crc kubenswrapper[4743]: E0122 14:53:28.159649 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53e6a2d4-7deb-42c8-91c0-eb72becbf1eb" containerName="container-00" Jan 22 14:53:28 crc kubenswrapper[4743]: I0122 14:53:28.159660 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="53e6a2d4-7deb-42c8-91c0-eb72becbf1eb" containerName="container-00" Jan 22 14:53:28 crc kubenswrapper[4743]: E0122 14:53:28.159670 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6aeb2b00-6a7d-4bb3-ab20-a96079f19f63" containerName="extract-content" Jan 22 14:53:28 crc kubenswrapper[4743]: I0122 14:53:28.159677 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="6aeb2b00-6a7d-4bb3-ab20-a96079f19f63" containerName="extract-content" Jan 22 14:53:28 crc kubenswrapper[4743]: E0122 14:53:28.159692 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6aeb2b00-6a7d-4bb3-ab20-a96079f19f63" containerName="extract-utilities" Jan 22 14:53:28 crc kubenswrapper[4743]: I0122 14:53:28.159698 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="6aeb2b00-6a7d-4bb3-ab20-a96079f19f63" containerName="extract-utilities" Jan 22 14:53:28 crc kubenswrapper[4743]: E0122 14:53:28.159712 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6aeb2b00-6a7d-4bb3-ab20-a96079f19f63" containerName="registry-server" Jan 22 14:53:28 crc kubenswrapper[4743]: I0122 14:53:28.159720 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="6aeb2b00-6a7d-4bb3-ab20-a96079f19f63" containerName="registry-server" Jan 22 14:53:28 crc kubenswrapper[4743]: I0122 14:53:28.159939 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="6aeb2b00-6a7d-4bb3-ab20-a96079f19f63" containerName="registry-server" Jan 22 14:53:28 crc kubenswrapper[4743]: I0122 14:53:28.159965 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="53e6a2d4-7deb-42c8-91c0-eb72becbf1eb" containerName="container-00" Jan 22 14:53:28 crc kubenswrapper[4743]: I0122 14:53:28.160592 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-t8dcr/crc-debug-hbwrz" Jan 22 14:53:28 crc kubenswrapper[4743]: I0122 14:53:28.164134 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-t8dcr"/"default-dockercfg-pvc5d" Jan 22 14:53:28 crc kubenswrapper[4743]: I0122 14:53:28.244439 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qg2rd\" (UniqueName: \"kubernetes.io/projected/d41913bd-d037-47ca-ad78-7811aea44b29-kube-api-access-qg2rd\") pod \"crc-debug-hbwrz\" (UID: \"d41913bd-d037-47ca-ad78-7811aea44b29\") " pod="openshift-must-gather-t8dcr/crc-debug-hbwrz" Jan 22 14:53:28 crc kubenswrapper[4743]: I0122 14:53:28.244520 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d41913bd-d037-47ca-ad78-7811aea44b29-host\") pod \"crc-debug-hbwrz\" (UID: \"d41913bd-d037-47ca-ad78-7811aea44b29\") " pod="openshift-must-gather-t8dcr/crc-debug-hbwrz" Jan 22 14:53:28 crc kubenswrapper[4743]: I0122 14:53:28.346145 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d41913bd-d037-47ca-ad78-7811aea44b29-host\") pod \"crc-debug-hbwrz\" (UID: \"d41913bd-d037-47ca-ad78-7811aea44b29\") " pod="openshift-must-gather-t8dcr/crc-debug-hbwrz" Jan 22 14:53:28 crc kubenswrapper[4743]: I0122 14:53:28.346308 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qg2rd\" (UniqueName: \"kubernetes.io/projected/d41913bd-d037-47ca-ad78-7811aea44b29-kube-api-access-qg2rd\") pod \"crc-debug-hbwrz\" (UID: \"d41913bd-d037-47ca-ad78-7811aea44b29\") " pod="openshift-must-gather-t8dcr/crc-debug-hbwrz" Jan 22 14:53:28 crc kubenswrapper[4743]: I0122 14:53:28.346317 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d41913bd-d037-47ca-ad78-7811aea44b29-host\") pod \"crc-debug-hbwrz\" (UID: \"d41913bd-d037-47ca-ad78-7811aea44b29\") " pod="openshift-must-gather-t8dcr/crc-debug-hbwrz" Jan 22 14:53:28 crc kubenswrapper[4743]: I0122 14:53:28.475087 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qg2rd\" (UniqueName: \"kubernetes.io/projected/d41913bd-d037-47ca-ad78-7811aea44b29-kube-api-access-qg2rd\") pod \"crc-debug-hbwrz\" (UID: \"d41913bd-d037-47ca-ad78-7811aea44b29\") " pod="openshift-must-gather-t8dcr/crc-debug-hbwrz" Jan 22 14:53:28 crc kubenswrapper[4743]: I0122 14:53:28.477544 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-t8dcr/crc-debug-hbwrz" Jan 22 14:53:28 crc kubenswrapper[4743]: I0122 14:53:28.862670 4743 generic.go:334] "Generic (PLEG): container finished" podID="d41913bd-d037-47ca-ad78-7811aea44b29" containerID="6c7aabd99f2c98dbff5cf34c7882b0c0496079e74ae887b3cfd70082e8328a70" exitCode=0 Jan 22 14:53:28 crc kubenswrapper[4743]: I0122 14:53:28.862781 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t8dcr/crc-debug-hbwrz" event={"ID":"d41913bd-d037-47ca-ad78-7811aea44b29","Type":"ContainerDied","Data":"6c7aabd99f2c98dbff5cf34c7882b0c0496079e74ae887b3cfd70082e8328a70"} Jan 22 14:53:28 crc kubenswrapper[4743]: I0122 14:53:28.862930 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t8dcr/crc-debug-hbwrz" event={"ID":"d41913bd-d037-47ca-ad78-7811aea44b29","Type":"ContainerStarted","Data":"b626c16da778ffb90a1091fda8b4ad097d7634cc82b9df959734c257895dbb0f"} Jan 22 14:53:29 crc kubenswrapper[4743]: I0122 14:53:29.293918 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-t8dcr/crc-debug-hbwrz"] Jan 22 14:53:29 crc kubenswrapper[4743]: I0122 14:53:29.302665 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-t8dcr/crc-debug-hbwrz"] Jan 22 14:53:29 crc kubenswrapper[4743]: I0122 14:53:29.997491 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t8dcr/crc-debug-hbwrz" Jan 22 14:53:30 crc kubenswrapper[4743]: I0122 14:53:30.049475 4743 patch_prober.go:28] interesting pod/machine-config-daemon-hqgk7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 14:53:30 crc kubenswrapper[4743]: I0122 14:53:30.049533 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 14:53:30 crc kubenswrapper[4743]: I0122 14:53:30.049592 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" Jan 22 14:53:30 crc kubenswrapper[4743]: I0122 14:53:30.050402 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"54b33301e0c1269e5b25b06e84317615d57f5c4dc992d00dcb2ed5ecfdfa7773"} pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 22 14:53:30 crc kubenswrapper[4743]: I0122 14:53:30.050472 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerName="machine-config-daemon" containerID="cri-o://54b33301e0c1269e5b25b06e84317615d57f5c4dc992d00dcb2ed5ecfdfa7773" gracePeriod=600 Jan 22 14:53:30 crc kubenswrapper[4743]: I0122 14:53:30.076306 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d41913bd-d037-47ca-ad78-7811aea44b29-host\") pod 
\"d41913bd-d037-47ca-ad78-7811aea44b29\" (UID: \"d41913bd-d037-47ca-ad78-7811aea44b29\") " Jan 22 14:53:30 crc kubenswrapper[4743]: I0122 14:53:30.076443 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d41913bd-d037-47ca-ad78-7811aea44b29-host" (OuterVolumeSpecName: "host") pod "d41913bd-d037-47ca-ad78-7811aea44b29" (UID: "d41913bd-d037-47ca-ad78-7811aea44b29"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 14:53:30 crc kubenswrapper[4743]: I0122 14:53:30.076501 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg2rd\" (UniqueName: \"kubernetes.io/projected/d41913bd-d037-47ca-ad78-7811aea44b29-kube-api-access-qg2rd\") pod \"d41913bd-d037-47ca-ad78-7811aea44b29\" (UID: \"d41913bd-d037-47ca-ad78-7811aea44b29\") " Jan 22 14:53:30 crc kubenswrapper[4743]: I0122 14:53:30.077857 4743 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d41913bd-d037-47ca-ad78-7811aea44b29-host\") on node \"crc\" DevicePath \"\"" Jan 22 14:53:30 crc kubenswrapper[4743]: I0122 14:53:30.098969 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d41913bd-d037-47ca-ad78-7811aea44b29-kube-api-access-qg2rd" (OuterVolumeSpecName: "kube-api-access-qg2rd") pod "d41913bd-d037-47ca-ad78-7811aea44b29" (UID: "d41913bd-d037-47ca-ad78-7811aea44b29"). InnerVolumeSpecName "kube-api-access-qg2rd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:53:30 crc kubenswrapper[4743]: I0122 14:53:30.180102 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg2rd\" (UniqueName: \"kubernetes.io/projected/d41913bd-d037-47ca-ad78-7811aea44b29-kube-api-access-qg2rd\") on node \"crc\" DevicePath \"\"" Jan 22 14:53:30 crc kubenswrapper[4743]: I0122 14:53:30.491356 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-t8dcr/crc-debug-z2kkd"] Jan 22 14:53:30 crc kubenswrapper[4743]: E0122 14:53:30.492169 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d41913bd-d037-47ca-ad78-7811aea44b29" containerName="container-00" Jan 22 14:53:30 crc kubenswrapper[4743]: I0122 14:53:30.492183 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d41913bd-d037-47ca-ad78-7811aea44b29" containerName="container-00" Jan 22 14:53:30 crc kubenswrapper[4743]: I0122 14:53:30.493841 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d41913bd-d037-47ca-ad78-7811aea44b29" containerName="container-00" Jan 22 14:53:30 crc kubenswrapper[4743]: I0122 14:53:30.494443 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-t8dcr/crc-debug-z2kkd" Jan 22 14:53:30 crc kubenswrapper[4743]: I0122 14:53:30.587196 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/af8f5b25-2472-4c5e-8e9b-3fd7d5b4e6ac-host\") pod \"crc-debug-z2kkd\" (UID: \"af8f5b25-2472-4c5e-8e9b-3fd7d5b4e6ac\") " pod="openshift-must-gather-t8dcr/crc-debug-z2kkd" Jan 22 14:53:30 crc kubenswrapper[4743]: I0122 14:53:30.587252 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mw5vw\" (UniqueName: \"kubernetes.io/projected/af8f5b25-2472-4c5e-8e9b-3fd7d5b4e6ac-kube-api-access-mw5vw\") pod \"crc-debug-z2kkd\" (UID: \"af8f5b25-2472-4c5e-8e9b-3fd7d5b4e6ac\") " pod="openshift-must-gather-t8dcr/crc-debug-z2kkd" Jan 22 14:53:30 crc kubenswrapper[4743]: I0122 14:53:30.689051 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/af8f5b25-2472-4c5e-8e9b-3fd7d5b4e6ac-host\") pod \"crc-debug-z2kkd\" (UID: \"af8f5b25-2472-4c5e-8e9b-3fd7d5b4e6ac\") " pod="openshift-must-gather-t8dcr/crc-debug-z2kkd" Jan 22 14:53:30 crc kubenswrapper[4743]: I0122 14:53:30.689104 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mw5vw\" (UniqueName: \"kubernetes.io/projected/af8f5b25-2472-4c5e-8e9b-3fd7d5b4e6ac-kube-api-access-mw5vw\") pod \"crc-debug-z2kkd\" (UID: \"af8f5b25-2472-4c5e-8e9b-3fd7d5b4e6ac\") " pod="openshift-must-gather-t8dcr/crc-debug-z2kkd" Jan 22 14:53:30 crc kubenswrapper[4743]: I0122 14:53:30.689175 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/af8f5b25-2472-4c5e-8e9b-3fd7d5b4e6ac-host\") pod \"crc-debug-z2kkd\" (UID: \"af8f5b25-2472-4c5e-8e9b-3fd7d5b4e6ac\") " pod="openshift-must-gather-t8dcr/crc-debug-z2kkd" Jan 22 14:53:30 crc kubenswrapper[4743]: I0122 14:53:30.714184 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mw5vw\" (UniqueName: \"kubernetes.io/projected/af8f5b25-2472-4c5e-8e9b-3fd7d5b4e6ac-kube-api-access-mw5vw\") pod \"crc-debug-z2kkd\" (UID: \"af8f5b25-2472-4c5e-8e9b-3fd7d5b4e6ac\") " pod="openshift-must-gather-t8dcr/crc-debug-z2kkd" Jan 22 14:53:30 crc kubenswrapper[4743]: I0122 14:53:30.813982 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-t8dcr/crc-debug-z2kkd" Jan 22 14:53:30 crc kubenswrapper[4743]: W0122 14:53:30.840864 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf8f5b25_2472_4c5e_8e9b_3fd7d5b4e6ac.slice/crio-c787fdc22f32690c809f07b79710ecc2e738730709a4d857c39afc20a9a72624 WatchSource:0}: Error finding container c787fdc22f32690c809f07b79710ecc2e738730709a4d857c39afc20a9a72624: Status 404 returned error can't find the container with id c787fdc22f32690c809f07b79710ecc2e738730709a4d857c39afc20a9a72624 Jan 22 14:53:30 crc kubenswrapper[4743]: I0122 14:53:30.907440 4743 generic.go:334] "Generic (PLEG): container finished" podID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerID="54b33301e0c1269e5b25b06e84317615d57f5c4dc992d00dcb2ed5ecfdfa7773" exitCode=0 Jan 22 14:53:30 crc kubenswrapper[4743]: I0122 14:53:30.907537 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" event={"ID":"3aeba9ba-3a5a-4885-8540-d295aadb311b","Type":"ContainerDied","Data":"54b33301e0c1269e5b25b06e84317615d57f5c4dc992d00dcb2ed5ecfdfa7773"} Jan 22 14:53:30 crc kubenswrapper[4743]: I0122 14:53:30.907568 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" event={"ID":"3aeba9ba-3a5a-4885-8540-d295aadb311b","Type":"ContainerStarted","Data":"56f8576199722ef7beb003193adeea0903840f65ddf8b35cb9e24ccb32f0ae7f"} Jan 22 14:53:30 crc kubenswrapper[4743]: I0122 14:53:30.907593 4743 scope.go:117] "RemoveContainer" containerID="25b51fa71f51dca1867d5b32f16ec36e148821f095337178c440ebff508e0294" Jan 22 14:53:30 crc kubenswrapper[4743]: I0122 14:53:30.913189 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-t8dcr/crc-debug-hbwrz" Jan 22 14:53:30 crc kubenswrapper[4743]: I0122 14:53:30.915537 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t8dcr/crc-debug-z2kkd" event={"ID":"af8f5b25-2472-4c5e-8e9b-3fd7d5b4e6ac","Type":"ContainerStarted","Data":"c787fdc22f32690c809f07b79710ecc2e738730709a4d857c39afc20a9a72624"} Jan 22 14:53:31 crc kubenswrapper[4743]: I0122 14:53:31.003654 4743 scope.go:117] "RemoveContainer" containerID="6c7aabd99f2c98dbff5cf34c7882b0c0496079e74ae887b3cfd70082e8328a70" Jan 22 14:53:31 crc kubenswrapper[4743]: I0122 14:53:31.757374 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d41913bd-d037-47ca-ad78-7811aea44b29" path="/var/lib/kubelet/pods/d41913bd-d037-47ca-ad78-7811aea44b29/volumes" Jan 22 14:53:31 crc kubenswrapper[4743]: I0122 14:53:31.925980 4743 generic.go:334] "Generic (PLEG): container finished" podID="af8f5b25-2472-4c5e-8e9b-3fd7d5b4e6ac" containerID="746c85c3ef396782018666afc5b0836d235ff3f1544e3040480a109dc8d1b362" exitCode=0 Jan 22 14:53:31 crc kubenswrapper[4743]: I0122 14:53:31.926112 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t8dcr/crc-debug-z2kkd" event={"ID":"af8f5b25-2472-4c5e-8e9b-3fd7d5b4e6ac","Type":"ContainerDied","Data":"746c85c3ef396782018666afc5b0836d235ff3f1544e3040480a109dc8d1b362"} Jan 22 14:53:31 crc kubenswrapper[4743]: I0122 14:53:31.985572 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-t8dcr/crc-debug-z2kkd"] Jan 22 14:53:31 crc kubenswrapper[4743]: I0122 14:53:31.994219 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-t8dcr/crc-debug-z2kkd"] Jan 22 14:53:33 crc kubenswrapper[4743]: I0122 14:53:33.028285 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t8dcr/crc-debug-z2kkd" Jan 22 14:53:33 crc kubenswrapper[4743]: I0122 14:53:33.039260 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/af8f5b25-2472-4c5e-8e9b-3fd7d5b4e6ac-host\") pod \"af8f5b25-2472-4c5e-8e9b-3fd7d5b4e6ac\" (UID: \"af8f5b25-2472-4c5e-8e9b-3fd7d5b4e6ac\") " Jan 22 14:53:33 crc kubenswrapper[4743]: I0122 14:53:33.039325 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mw5vw\" (UniqueName: \"kubernetes.io/projected/af8f5b25-2472-4c5e-8e9b-3fd7d5b4e6ac-kube-api-access-mw5vw\") pod \"af8f5b25-2472-4c5e-8e9b-3fd7d5b4e6ac\" (UID: \"af8f5b25-2472-4c5e-8e9b-3fd7d5b4e6ac\") " Jan 22 14:53:33 crc kubenswrapper[4743]: I0122 14:53:33.039392 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/af8f5b25-2472-4c5e-8e9b-3fd7d5b4e6ac-host" (OuterVolumeSpecName: "host") pod "af8f5b25-2472-4c5e-8e9b-3fd7d5b4e6ac" (UID: "af8f5b25-2472-4c5e-8e9b-3fd7d5b4e6ac"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 14:53:33 crc kubenswrapper[4743]: I0122 14:53:33.039753 4743 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/af8f5b25-2472-4c5e-8e9b-3fd7d5b4e6ac-host\") on node \"crc\" DevicePath \"\"" Jan 22 14:53:33 crc kubenswrapper[4743]: I0122 14:53:33.051278 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af8f5b25-2472-4c5e-8e9b-3fd7d5b4e6ac-kube-api-access-mw5vw" (OuterVolumeSpecName: "kube-api-access-mw5vw") pod "af8f5b25-2472-4c5e-8e9b-3fd7d5b4e6ac" (UID: "af8f5b25-2472-4c5e-8e9b-3fd7d5b4e6ac"). InnerVolumeSpecName "kube-api-access-mw5vw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:53:33 crc kubenswrapper[4743]: I0122 14:53:33.141667 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mw5vw\" (UniqueName: \"kubernetes.io/projected/af8f5b25-2472-4c5e-8e9b-3fd7d5b4e6ac-kube-api-access-mw5vw\") on node \"crc\" DevicePath \"\"" Jan 22 14:53:33 crc kubenswrapper[4743]: I0122 14:53:33.757873 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af8f5b25-2472-4c5e-8e9b-3fd7d5b4e6ac" path="/var/lib/kubelet/pods/af8f5b25-2472-4c5e-8e9b-3fd7d5b4e6ac/volumes" Jan 22 14:53:33 crc kubenswrapper[4743]: I0122 14:53:33.946940 4743 scope.go:117] "RemoveContainer" containerID="746c85c3ef396782018666afc5b0836d235ff3f1544e3040480a109dc8d1b362" Jan 22 14:53:33 crc kubenswrapper[4743]: I0122 14:53:33.946987 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t8dcr/crc-debug-z2kkd" Jan 22 14:54:00 crc kubenswrapper[4743]: I0122 14:54:00.936943 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vdc8d"] Jan 22 14:54:00 crc kubenswrapper[4743]: E0122 14:54:00.937811 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af8f5b25-2472-4c5e-8e9b-3fd7d5b4e6ac" containerName="container-00" Jan 22 14:54:00 crc kubenswrapper[4743]: I0122 14:54:00.937945 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="af8f5b25-2472-4c5e-8e9b-3fd7d5b4e6ac" containerName="container-00" Jan 22 14:54:00 crc kubenswrapper[4743]: I0122 14:54:00.938239 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="af8f5b25-2472-4c5e-8e9b-3fd7d5b4e6ac" containerName="container-00" Jan 22 14:54:00 crc kubenswrapper[4743]: I0122 14:54:00.941758 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vdc8d" Jan 22 14:54:00 crc kubenswrapper[4743]: I0122 14:54:00.947085 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vdc8d"] Jan 22 14:54:01 crc kubenswrapper[4743]: I0122 14:54:01.065939 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnpb5\" (UniqueName: \"kubernetes.io/projected/7533ea4b-59e4-45c7-9f52-93aa1f4baa47-kube-api-access-lnpb5\") pod \"redhat-marketplace-vdc8d\" (UID: \"7533ea4b-59e4-45c7-9f52-93aa1f4baa47\") " pod="openshift-marketplace/redhat-marketplace-vdc8d" Jan 22 14:54:01 crc kubenswrapper[4743]: I0122 14:54:01.066675 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7533ea4b-59e4-45c7-9f52-93aa1f4baa47-utilities\") pod \"redhat-marketplace-vdc8d\" (UID: \"7533ea4b-59e4-45c7-9f52-93aa1f4baa47\") " pod="openshift-marketplace/redhat-marketplace-vdc8d" Jan 22 14:54:01 crc kubenswrapper[4743]: I0122 14:54:01.066833 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7533ea4b-59e4-45c7-9f52-93aa1f4baa47-catalog-content\") pod \"redhat-marketplace-vdc8d\" (UID: \"7533ea4b-59e4-45c7-9f52-93aa1f4baa47\") " pod="openshift-marketplace/redhat-marketplace-vdc8d" Jan 22 14:54:01 crc kubenswrapper[4743]: I0122 14:54:01.168743 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7533ea4b-59e4-45c7-9f52-93aa1f4baa47-utilities\") pod \"redhat-marketplace-vdc8d\" (UID: \"7533ea4b-59e4-45c7-9f52-93aa1f4baa47\") " pod="openshift-marketplace/redhat-marketplace-vdc8d" Jan 22 14:54:01 crc kubenswrapper[4743]: I0122 14:54:01.168906 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7533ea4b-59e4-45c7-9f52-93aa1f4baa47-catalog-content\") pod \"redhat-marketplace-vdc8d\" (UID: \"7533ea4b-59e4-45c7-9f52-93aa1f4baa47\") " pod="openshift-marketplace/redhat-marketplace-vdc8d" Jan 22 14:54:01 crc kubenswrapper[4743]: I0122 14:54:01.168962 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnpb5\" (UniqueName: \"kubernetes.io/projected/7533ea4b-59e4-45c7-9f52-93aa1f4baa47-kube-api-access-lnpb5\") pod \"redhat-marketplace-vdc8d\" (UID: \"7533ea4b-59e4-45c7-9f52-93aa1f4baa47\") " pod="openshift-marketplace/redhat-marketplace-vdc8d" Jan 22 14:54:01 crc kubenswrapper[4743]: I0122 14:54:01.169226 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7533ea4b-59e4-45c7-9f52-93aa1f4baa47-utilities\") pod \"redhat-marketplace-vdc8d\" (UID: \"7533ea4b-59e4-45c7-9f52-93aa1f4baa47\") " pod="openshift-marketplace/redhat-marketplace-vdc8d" Jan 22 14:54:01 crc kubenswrapper[4743]: I0122 14:54:01.169343 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7533ea4b-59e4-45c7-9f52-93aa1f4baa47-catalog-content\") pod \"redhat-marketplace-vdc8d\" (UID: \"7533ea4b-59e4-45c7-9f52-93aa1f4baa47\") " pod="openshift-marketplace/redhat-marketplace-vdc8d" Jan 22 14:54:01 crc kubenswrapper[4743]: I0122 14:54:01.200055 4743 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-lnpb5\" (UniqueName: \"kubernetes.io/projected/7533ea4b-59e4-45c7-9f52-93aa1f4baa47-kube-api-access-lnpb5\") pod \"redhat-marketplace-vdc8d\" (UID: \"7533ea4b-59e4-45c7-9f52-93aa1f4baa47\") " pod="openshift-marketplace/redhat-marketplace-vdc8d" Jan 22 14:54:01 crc kubenswrapper[4743]: I0122 14:54:01.267271 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vdc8d" Jan 22 14:54:01 crc kubenswrapper[4743]: I0122 14:54:01.778294 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vdc8d"] Jan 22 14:54:02 crc kubenswrapper[4743]: I0122 14:54:02.188176 4743 generic.go:334] "Generic (PLEG): container finished" podID="7533ea4b-59e4-45c7-9f52-93aa1f4baa47" containerID="65a196e3ad770f9b17d97a5ea4aa07557d2318d19dc78c4dfe8179b3cf144468" exitCode=0 Jan 22 14:54:02 crc kubenswrapper[4743]: I0122 14:54:02.188235 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vdc8d" event={"ID":"7533ea4b-59e4-45c7-9f52-93aa1f4baa47","Type":"ContainerDied","Data":"65a196e3ad770f9b17d97a5ea4aa07557d2318d19dc78c4dfe8179b3cf144468"} Jan 22 14:54:02 crc kubenswrapper[4743]: I0122 14:54:02.188476 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vdc8d" event={"ID":"7533ea4b-59e4-45c7-9f52-93aa1f4baa47","Type":"ContainerStarted","Data":"8c87e190b75f467cd069fb9eef42165fa9a98298e5ac6fd10d4ff6ae4fd9c7d2"} Jan 22 14:54:02 crc kubenswrapper[4743]: I0122 14:54:02.494892 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-64fcd75458-9rzfr_c4db7649-d1b0-47c2-b5e4-34a552ccee79/barbican-api/0.log" Jan 22 14:54:02 crc kubenswrapper[4743]: I0122 14:54:02.693753 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-64fcd75458-9rzfr_c4db7649-d1b0-47c2-b5e4-34a552ccee79/barbican-api-log/0.log" Jan 22 14:54:02 crc kubenswrapper[4743]: I0122 14:54:02.746589 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6c88b6769d-nzzc6_a84fcd7a-0eac-4d23-832e-e632bd4f971f/barbican-keystone-listener/0.log" Jan 22 14:54:02 crc kubenswrapper[4743]: I0122 14:54:02.804294 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6c88b6769d-nzzc6_a84fcd7a-0eac-4d23-832e-e632bd4f971f/barbican-keystone-listener-log/0.log" Jan 22 14:54:02 crc kubenswrapper[4743]: I0122 14:54:02.920877 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-8448f7b79-pndf8_f254cb75-db18-488e-886f-544f0b8a8516/barbican-worker/0.log" Jan 22 14:54:02 crc kubenswrapper[4743]: I0122 14:54:02.978948 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-8448f7b79-pndf8_f254cb75-db18-488e-886f-544f0b8a8516/barbican-worker-log/0.log" Jan 22 14:54:03 crc kubenswrapper[4743]: I0122 14:54:03.109746 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-v4vc6_33d8b498-a76a-4549-96c2-f32877beaa30/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Jan 22 14:54:03 crc kubenswrapper[4743]: I0122 14:54:03.401804 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_46f75016-697b-4cac-bc9e-3e2f5e60da77/ceilometer-central-agent/0.log" Jan 22 14:54:03 crc kubenswrapper[4743]: I0122 14:54:03.445106 4743 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_46f75016-697b-4cac-bc9e-3e2f5e60da77/ceilometer-notification-agent/0.log" Jan 22 14:54:03 crc kubenswrapper[4743]: I0122 14:54:03.486271 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_46f75016-697b-4cac-bc9e-3e2f5e60da77/sg-core/0.log" Jan 22 14:54:03 crc kubenswrapper[4743]: I0122 14:54:03.494679 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_46f75016-697b-4cac-bc9e-3e2f5e60da77/proxy-httpd/0.log" Jan 22 14:54:03 crc kubenswrapper[4743]: I0122 14:54:03.692510 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_d5aa29c0-68de-446c-aafd-50080e4adb51/cinder-api-log/0.log" Jan 22 14:54:03 crc kubenswrapper[4743]: I0122 14:54:03.699776 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_d5aa29c0-68de-446c-aafd-50080e4adb51/cinder-api/0.log" Jan 22 14:54:03 crc kubenswrapper[4743]: I0122 14:54:03.872051 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_ec6e51f6-2808-404d-8cf0-8c8b44c86cb9/cinder-scheduler/0.log" Jan 22 14:54:03 crc kubenswrapper[4743]: I0122 14:54:03.930841 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_ec6e51f6-2808-404d-8cf0-8c8b44c86cb9/probe/0.log" Jan 22 14:54:03 crc kubenswrapper[4743]: I0122 14:54:03.956319 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-xph97_89048557-6c94-40a8-aa26-c9d940743be9/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Jan 22 14:54:04 crc kubenswrapper[4743]: I0122 14:54:04.080301 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-7pmbs_6cf93e6b-adef-48fb-844b-a420be87fd2e/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 22 14:54:04 crc kubenswrapper[4743]: I0122 14:54:04.166243 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-nbpk8_c52cf8e4-1ecd-4882-b076-bacb37f3569e/init/0.log" Jan 22 14:54:04 crc kubenswrapper[4743]: I0122 14:54:04.206864 4743 generic.go:334] "Generic (PLEG): container finished" podID="7533ea4b-59e4-45c7-9f52-93aa1f4baa47" containerID="820eaea253a3e03f7b0b1285403d4cb396294faf8620eae8e39839ed992f427b" exitCode=0 Jan 22 14:54:04 crc kubenswrapper[4743]: I0122 14:54:04.206915 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vdc8d" event={"ID":"7533ea4b-59e4-45c7-9f52-93aa1f4baa47","Type":"ContainerDied","Data":"820eaea253a3e03f7b0b1285403d4cb396294faf8620eae8e39839ed992f427b"} Jan 22 14:54:04 crc kubenswrapper[4743]: I0122 14:54:04.368486 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-nbpk8_c52cf8e4-1ecd-4882-b076-bacb37f3569e/init/0.log" Jan 22 14:54:04 crc kubenswrapper[4743]: I0122 14:54:04.415539 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-w8zbz_6772da1b-97c0-4b18-af50-7723f5dc39b6/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Jan 22 14:54:04 crc kubenswrapper[4743]: I0122 14:54:04.442463 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-nbpk8_c52cf8e4-1ecd-4882-b076-bacb37f3569e/dnsmasq-dns/0.log" Jan 22 14:54:04 crc kubenswrapper[4743]: I0122 14:54:04.594222 4743 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_5247bc1b-998e-4275-9f4a-d3c30ff488b9/glance-log/0.log" Jan 22 14:54:04 crc kubenswrapper[4743]: I0122 14:54:04.606618 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_5247bc1b-998e-4275-9f4a-d3c30ff488b9/glance-httpd/0.log" Jan 22 14:54:04 crc kubenswrapper[4743]: I0122 14:54:04.764957 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_6ed5a9e1-b17c-46c0-9d94-b3ae86e73acb/glance-log/0.log" Jan 22 14:54:04 crc kubenswrapper[4743]: I0122 14:54:04.795671 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_6ed5a9e1-b17c-46c0-9d94-b3ae86e73acb/glance-httpd/0.log" Jan 22 14:54:04 crc kubenswrapper[4743]: I0122 14:54:04.981655 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-b7fb54dc6-5q9jf_e452af10-fc11-4854-bf38-8a90856331d3/horizon/0.log" Jan 22 14:54:05 crc kubenswrapper[4743]: I0122 14:54:05.108084 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-7jfwt_beb6ea36-21b3-4658-a609-7ecace4d6efc/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Jan 22 14:54:05 crc kubenswrapper[4743]: I0122 14:54:05.218213 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vdc8d" event={"ID":"7533ea4b-59e4-45c7-9f52-93aa1f4baa47","Type":"ContainerStarted","Data":"a9371b4ec9258dcb6a26ef55bb02e610c89ec8e6264b0c7f1936973c1eb597ad"} Jan 22 14:54:05 crc kubenswrapper[4743]: I0122 14:54:05.245071 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vdc8d" podStartSLOduration=2.698537441 podStartE2EDuration="5.245049905s" podCreationTimestamp="2026-01-22 14:54:00 +0000 UTC" firstStartedPulling="2026-01-22 14:54:02.191974117 +0000 UTC m=+4078.747017280" lastFinishedPulling="2026-01-22 14:54:04.738486581 +0000 UTC m=+4081.293529744" observedRunningTime="2026-01-22 14:54:05.244931951 +0000 UTC m=+4081.799975114" watchObservedRunningTime="2026-01-22 14:54:05.245049905 +0000 UTC m=+4081.800093058" Jan 22 14:54:05 crc kubenswrapper[4743]: I0122 14:54:05.376086 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-b7fb54dc6-5q9jf_e452af10-fc11-4854-bf38-8a90856331d3/horizon-log/0.log" Jan 22 14:54:05 crc kubenswrapper[4743]: I0122 14:54:05.466402 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-z8mwg_9b71a0c2-2cf5-4ed8-a0f9-4966d9095eeb/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 22 14:54:05 crc kubenswrapper[4743]: I0122 14:54:05.704201 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-5f47b7b66b-mfhcg_cd3df106-ec34-42ad-bf5d-f963b9bb0871/keystone-api/0.log" Jan 22 14:54:05 crc kubenswrapper[4743]: I0122 14:54:05.731214 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_492b4d6f-25ef-41b4-9aa8-876d9baaaf13/kube-state-metrics/0.log" Jan 22 14:54:05 crc kubenswrapper[4743]: I0122 14:54:05.961714 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-cvxq2_5dca488a-cb84-4610-bf38-0f4c65c8b94a/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Jan 22 14:54:06 crc kubenswrapper[4743]: I0122 
14:54:06.479162 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7dd566fb89-mgkw8_36675c4f-99e7-4cfb-a9c4-22519e8e7d4c/neutron-httpd/0.log" Jan 22 14:54:06 crc kubenswrapper[4743]: I0122 14:54:06.489627 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7dd566fb89-mgkw8_36675c4f-99e7-4cfb-a9c4-22519e8e7d4c/neutron-api/0.log" Jan 22 14:54:06 crc kubenswrapper[4743]: I0122 14:54:06.701780 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-j94s9_eec02bb6-2380-4910-8e5a-1fe3196760a4/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Jan 22 14:54:07 crc kubenswrapper[4743]: I0122 14:54:07.309221 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_5e08ea55-209f-4956-b9cb-c261280252ad/nova-api-log/0.log" Jan 22 14:54:07 crc kubenswrapper[4743]: I0122 14:54:07.406649 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_188bdbf9-2ed2-427b-99c1-7c435a25a3c6/nova-cell0-conductor-conductor/0.log" Jan 22 14:54:07 crc kubenswrapper[4743]: I0122 14:54:07.629149 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_3b26ff8e-b36c-47d8-8d74-da49485ec363/nova-cell1-conductor-conductor/0.log" Jan 22 14:54:07 crc kubenswrapper[4743]: I0122 14:54:07.788380 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_5e08ea55-209f-4956-b9cb-c261280252ad/nova-api-api/0.log" Jan 22 14:54:08 crc kubenswrapper[4743]: I0122 14:54:08.145093 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_41d94ed2-d3ad-4fbc-9ece-a7fb65eab7ac/nova-cell1-novncproxy-novncproxy/0.log" Jan 22 14:54:08 crc kubenswrapper[4743]: I0122 14:54:08.152574 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-jjllf_a546459d-e713-453e-adbd-c3b9f8c7b961/nova-edpm-deployment-openstack-edpm-ipam/0.log" Jan 22 14:54:08 crc kubenswrapper[4743]: I0122 14:54:08.454809 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_fc42f0d6-9224-404d-8584-2c0fec4f3edd/nova-metadata-log/0.log" Jan 22 14:54:08 crc kubenswrapper[4743]: I0122 14:54:08.717764 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_2644f1c9-b50c-4666-a099-ddb8912a53ff/mysql-bootstrap/0.log" Jan 22 14:54:08 crc kubenswrapper[4743]: I0122 14:54:08.762501 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_529c10d9-fb76-4b45-8b08-3d9656bfdcd5/nova-scheduler-scheduler/0.log" Jan 22 14:54:08 crc kubenswrapper[4743]: I0122 14:54:08.941541 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_2644f1c9-b50c-4666-a099-ddb8912a53ff/mysql-bootstrap/0.log" Jan 22 14:54:08 crc kubenswrapper[4743]: I0122 14:54:08.944883 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_2644f1c9-b50c-4666-a099-ddb8912a53ff/galera/0.log" Jan 22 14:54:09 crc kubenswrapper[4743]: I0122 14:54:09.176676 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_ee0fd7f6-d02b-4139-9306-8f9c9a1a8dd1/mysql-bootstrap/0.log" Jan 22 14:54:09 crc kubenswrapper[4743]: I0122 14:54:09.788160 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-galera-0_ee0fd7f6-d02b-4139-9306-8f9c9a1a8dd1/galera/0.log" Jan 22 14:54:09 crc kubenswrapper[4743]: I0122 14:54:09.792742 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_ee0fd7f6-d02b-4139-9306-8f9c9a1a8dd1/mysql-bootstrap/0.log" Jan 22 14:54:09 crc kubenswrapper[4743]: I0122 14:54:09.806833 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_fc42f0d6-9224-404d-8584-2c0fec4f3edd/nova-metadata-metadata/0.log" Jan 22 14:54:10 crc kubenswrapper[4743]: I0122 14:54:10.049278 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_41abc04c-e711-4e34-a0b0-085b7b09d94d/openstackclient/0.log" Jan 22 14:54:10 crc kubenswrapper[4743]: I0122 14:54:10.061767 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-m22h5_f3551792-b862-492e-8c36-e0a63cd4468f/ovn-controller/0.log" Jan 22 14:54:10 crc kubenswrapper[4743]: I0122 14:54:10.281581 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-rzsrt_3450abf2-6cd6-4090-b26f-4d83e2a6ea2b/openstack-network-exporter/0.log" Jan 22 14:54:10 crc kubenswrapper[4743]: I0122 14:54:10.292163 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-rmfgh_60598cb3-9d09-4b83-9b5c-893f5ebf44eb/ovsdb-server-init/0.log" Jan 22 14:54:10 crc kubenswrapper[4743]: I0122 14:54:10.458883 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-rmfgh_60598cb3-9d09-4b83-9b5c-893f5ebf44eb/ovsdb-server-init/0.log" Jan 22 14:54:10 crc kubenswrapper[4743]: I0122 14:54:10.523199 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-rmfgh_60598cb3-9d09-4b83-9b5c-893f5ebf44eb/ovsdb-server/0.log" Jan 22 14:54:10 crc kubenswrapper[4743]: I0122 14:54:10.560063 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-rmfgh_60598cb3-9d09-4b83-9b5c-893f5ebf44eb/ovs-vswitchd/0.log" Jan 22 14:54:10 crc kubenswrapper[4743]: I0122 14:54:10.725798 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-nm84h_99d2edf2-043a-4066-9d64-36be28d2197d/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Jan 22 14:54:10 crc kubenswrapper[4743]: I0122 14:54:10.780520 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_5c926afa-42b3-4fc2-bc38-8ee725cd113b/openstack-network-exporter/0.log" Jan 22 14:54:10 crc kubenswrapper[4743]: I0122 14:54:10.857005 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_5c926afa-42b3-4fc2-bc38-8ee725cd113b/ovn-northd/0.log" Jan 22 14:54:10 crc kubenswrapper[4743]: I0122 14:54:10.974343 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_3b07a577-785f-4720-919c-ef619448284a/openstack-network-exporter/0.log" Jan 22 14:54:11 crc kubenswrapper[4743]: I0122 14:54:11.016001 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_3b07a577-785f-4720-919c-ef619448284a/ovsdbserver-nb/0.log" Jan 22 14:54:11 crc kubenswrapper[4743]: I0122 14:54:11.186280 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_98d7b7d3-f576-4b98-912f-6e7aab2d295a/ovsdbserver-sb/0.log" Jan 22 14:54:11 crc kubenswrapper[4743]: I0122 14:54:11.242681 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-sb-0_98d7b7d3-f576-4b98-912f-6e7aab2d295a/openstack-network-exporter/0.log" Jan 22 14:54:11 crc kubenswrapper[4743]: I0122 14:54:11.267363 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vdc8d" Jan 22 14:54:11 crc kubenswrapper[4743]: I0122 14:54:11.268345 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vdc8d" Jan 22 14:54:11 crc kubenswrapper[4743]: I0122 14:54:11.339578 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vdc8d" Jan 22 14:54:11 crc kubenswrapper[4743]: I0122 14:54:11.410182 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-77bd86cd86-kqp9m_9315e9cf-2a73-482e-810e-8fd19202915f/placement-api/0.log" Jan 22 14:54:11 crc kubenswrapper[4743]: I0122 14:54:11.476984 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_42446198-84f4-4bee-b50c-1bb5dad2e380/setup-container/0.log" Jan 22 14:54:11 crc kubenswrapper[4743]: I0122 14:54:11.526283 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-77bd86cd86-kqp9m_9315e9cf-2a73-482e-810e-8fd19202915f/placement-log/0.log" Jan 22 14:54:11 crc kubenswrapper[4743]: I0122 14:54:11.670354 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_42446198-84f4-4bee-b50c-1bb5dad2e380/setup-container/0.log" Jan 22 14:54:11 crc kubenswrapper[4743]: I0122 14:54:11.705670 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_42446198-84f4-4bee-b50c-1bb5dad2e380/rabbitmq/0.log" Jan 22 14:54:11 crc kubenswrapper[4743]: I0122 14:54:11.779495 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_600136f3-db1d-49a2-92a8-0c03aaadc963/setup-container/0.log" Jan 22 14:54:12 crc kubenswrapper[4743]: I0122 14:54:12.088381 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-dxthp_010d8c84-1843-4e5c-85b8-b39df20a58fd/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 22 14:54:12 crc kubenswrapper[4743]: I0122 14:54:12.099694 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_600136f3-db1d-49a2-92a8-0c03aaadc963/setup-container/0.log" Jan 22 14:54:12 crc kubenswrapper[4743]: I0122 14:54:12.110168 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_600136f3-db1d-49a2-92a8-0c03aaadc963/rabbitmq/0.log" Jan 22 14:54:12 crc kubenswrapper[4743]: I0122 14:54:12.333963 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vdc8d" Jan 22 14:54:12 crc kubenswrapper[4743]: I0122 14:54:12.358751 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-nr24d_92bb5b08-555d-4d1b-b105-e7cf240f190b/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Jan 22 14:54:12 crc kubenswrapper[4743]: I0122 14:54:12.400443 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vdc8d"] Jan 22 14:54:12 crc kubenswrapper[4743]: I0122 14:54:12.419309 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-8v7v2_305fa257-7d41-4a05-ae4e-1b945894aa09/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Jan 22 14:54:12 crc kubenswrapper[4743]: I0122 14:54:12.587111 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-f6dtw_62ef8bcc-609a-4fe6-a41d-48200e08b72f/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 22 14:54:12 crc kubenswrapper[4743]: I0122 14:54:12.631953 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-jvd8s_c41f0818-52ad-4c25-82fa-61a14a9825a1/ssh-known-hosts-edpm-deployment/0.log" Jan 22 14:54:12 crc kubenswrapper[4743]: I0122 14:54:12.906603 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6d6df4ffc5-49vw4_33cc55b1-3375-4b12-9cd9-c8f34ed7c0f5/proxy-server/0.log" Jan 22 14:54:12 crc kubenswrapper[4743]: I0122 14:54:12.994327 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6d6df4ffc5-49vw4_33cc55b1-3375-4b12-9cd9-c8f34ed7c0f5/proxy-httpd/0.log" Jan 22 14:54:13 crc kubenswrapper[4743]: I0122 14:54:13.010905 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-kg2fn_56dff5fb-e22c-4045-b3c4-c75e018df046/swift-ring-rebalance/0.log" Jan 22 14:54:13 crc kubenswrapper[4743]: I0122 14:54:13.126862 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_338e196f-7c64-4cbd-b058-768ccb4c5df9/account-auditor/0.log" Jan 22 14:54:13 crc kubenswrapper[4743]: I0122 14:54:13.220055 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_338e196f-7c64-4cbd-b058-768ccb4c5df9/account-replicator/0.log" Jan 22 14:54:13 crc kubenswrapper[4743]: I0122 14:54:13.228529 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_338e196f-7c64-4cbd-b058-768ccb4c5df9/account-reaper/0.log" Jan 22 14:54:13 crc kubenswrapper[4743]: I0122 14:54:13.364778 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_338e196f-7c64-4cbd-b058-768ccb4c5df9/container-auditor/0.log" Jan 22 14:54:13 crc kubenswrapper[4743]: I0122 14:54:13.371891 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_338e196f-7c64-4cbd-b058-768ccb4c5df9/account-server/0.log" Jan 22 14:54:13 crc kubenswrapper[4743]: I0122 14:54:13.483654 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_338e196f-7c64-4cbd-b058-768ccb4c5df9/container-server/0.log" Jan 22 14:54:13 crc kubenswrapper[4743]: I0122 14:54:13.529165 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_338e196f-7c64-4cbd-b058-768ccb4c5df9/container-replicator/0.log" Jan 22 14:54:13 crc kubenswrapper[4743]: I0122 14:54:13.604851 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_338e196f-7c64-4cbd-b058-768ccb4c5df9/object-auditor/0.log" Jan 22 14:54:13 crc kubenswrapper[4743]: I0122 14:54:13.613756 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_338e196f-7c64-4cbd-b058-768ccb4c5df9/container-updater/0.log" Jan 22 14:54:13 crc kubenswrapper[4743]: I0122 14:54:13.731472 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_338e196f-7c64-4cbd-b058-768ccb4c5df9/object-expirer/0.log" Jan 22 14:54:13 crc kubenswrapper[4743]: 
I0122 14:54:13.761018 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_338e196f-7c64-4cbd-b058-768ccb4c5df9/object-replicator/0.log" Jan 22 14:54:13 crc kubenswrapper[4743]: I0122 14:54:13.784836 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_338e196f-7c64-4cbd-b058-768ccb4c5df9/object-server/0.log" Jan 22 14:54:13 crc kubenswrapper[4743]: I0122 14:54:13.794029 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_338e196f-7c64-4cbd-b058-768ccb4c5df9/object-updater/0.log" Jan 22 14:54:13 crc kubenswrapper[4743]: I0122 14:54:13.914519 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_338e196f-7c64-4cbd-b058-768ccb4c5df9/rsync/0.log" Jan 22 14:54:14 crc kubenswrapper[4743]: I0122 14:54:14.005541 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_338e196f-7c64-4cbd-b058-768ccb4c5df9/swift-recon-cron/0.log" Jan 22 14:54:14 crc kubenswrapper[4743]: I0122 14:54:14.122071 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-5v5f6_65113c72-73df-4a17-b923-60f9da824feb/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Jan 22 14:54:14 crc kubenswrapper[4743]: I0122 14:54:14.225062 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_dca0d9c1-5628-4b93-9696-f9d455c70f31/tempest-tests-tempest-tests-runner/0.log" Jan 22 14:54:14 crc kubenswrapper[4743]: I0122 14:54:14.308635 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vdc8d" podUID="7533ea4b-59e4-45c7-9f52-93aa1f4baa47" containerName="registry-server" containerID="cri-o://a9371b4ec9258dcb6a26ef55bb02e610c89ec8e6264b0c7f1936973c1eb597ad" gracePeriod=2 Jan 22 14:54:14 crc kubenswrapper[4743]: I0122 14:54:14.358381 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_058ed9a9-b1d8-4e1f-b6b7-ca4bd89b79d7/test-operator-logs-container/0.log" Jan 22 14:54:14 crc kubenswrapper[4743]: I0122 14:54:14.427005 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-288lh_600f8b94-291e-4c03-b5d8-75f43de51d1d/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Jan 22 14:54:14 crc kubenswrapper[4743]: I0122 14:54:14.805058 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vdc8d" Jan 22 14:54:14 crc kubenswrapper[4743]: I0122 14:54:14.847161 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lnpb5\" (UniqueName: \"kubernetes.io/projected/7533ea4b-59e4-45c7-9f52-93aa1f4baa47-kube-api-access-lnpb5\") pod \"7533ea4b-59e4-45c7-9f52-93aa1f4baa47\" (UID: \"7533ea4b-59e4-45c7-9f52-93aa1f4baa47\") " Jan 22 14:54:14 crc kubenswrapper[4743]: I0122 14:54:14.847337 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7533ea4b-59e4-45c7-9f52-93aa1f4baa47-catalog-content\") pod \"7533ea4b-59e4-45c7-9f52-93aa1f4baa47\" (UID: \"7533ea4b-59e4-45c7-9f52-93aa1f4baa47\") " Jan 22 14:54:14 crc kubenswrapper[4743]: I0122 14:54:14.847457 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7533ea4b-59e4-45c7-9f52-93aa1f4baa47-utilities\") pod \"7533ea4b-59e4-45c7-9f52-93aa1f4baa47\" (UID: \"7533ea4b-59e4-45c7-9f52-93aa1f4baa47\") " Jan 22 14:54:14 crc kubenswrapper[4743]: I0122 14:54:14.848533 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7533ea4b-59e4-45c7-9f52-93aa1f4baa47-utilities" (OuterVolumeSpecName: "utilities") pod "7533ea4b-59e4-45c7-9f52-93aa1f4baa47" (UID: "7533ea4b-59e4-45c7-9f52-93aa1f4baa47"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:54:14 crc kubenswrapper[4743]: I0122 14:54:14.855220 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7533ea4b-59e4-45c7-9f52-93aa1f4baa47-kube-api-access-lnpb5" (OuterVolumeSpecName: "kube-api-access-lnpb5") pod "7533ea4b-59e4-45c7-9f52-93aa1f4baa47" (UID: "7533ea4b-59e4-45c7-9f52-93aa1f4baa47"). InnerVolumeSpecName "kube-api-access-lnpb5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:54:14 crc kubenswrapper[4743]: I0122 14:54:14.873195 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7533ea4b-59e4-45c7-9f52-93aa1f4baa47-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7533ea4b-59e4-45c7-9f52-93aa1f4baa47" (UID: "7533ea4b-59e4-45c7-9f52-93aa1f4baa47"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:54:14 crc kubenswrapper[4743]: I0122 14:54:14.948986 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7533ea4b-59e4-45c7-9f52-93aa1f4baa47-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 14:54:14 crc kubenswrapper[4743]: I0122 14:54:14.949327 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7533ea4b-59e4-45c7-9f52-93aa1f4baa47-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 14:54:14 crc kubenswrapper[4743]: I0122 14:54:14.949390 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lnpb5\" (UniqueName: \"kubernetes.io/projected/7533ea4b-59e4-45c7-9f52-93aa1f4baa47-kube-api-access-lnpb5\") on node \"crc\" DevicePath \"\"" Jan 22 14:54:15 crc kubenswrapper[4743]: I0122 14:54:15.320012 4743 generic.go:334] "Generic (PLEG): container finished" podID="7533ea4b-59e4-45c7-9f52-93aa1f4baa47" containerID="a9371b4ec9258dcb6a26ef55bb02e610c89ec8e6264b0c7f1936973c1eb597ad" exitCode=0 Jan 22 14:54:15 crc kubenswrapper[4743]: I0122 14:54:15.320064 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vdc8d" event={"ID":"7533ea4b-59e4-45c7-9f52-93aa1f4baa47","Type":"ContainerDied","Data":"a9371b4ec9258dcb6a26ef55bb02e610c89ec8e6264b0c7f1936973c1eb597ad"} Jan 22 14:54:15 crc kubenswrapper[4743]: I0122 14:54:15.320067 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vdc8d" Jan 22 14:54:15 crc kubenswrapper[4743]: I0122 14:54:15.320097 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vdc8d" event={"ID":"7533ea4b-59e4-45c7-9f52-93aa1f4baa47","Type":"ContainerDied","Data":"8c87e190b75f467cd069fb9eef42165fa9a98298e5ac6fd10d4ff6ae4fd9c7d2"} Jan 22 14:54:15 crc kubenswrapper[4743]: I0122 14:54:15.320123 4743 scope.go:117] "RemoveContainer" containerID="a9371b4ec9258dcb6a26ef55bb02e610c89ec8e6264b0c7f1936973c1eb597ad" Jan 22 14:54:15 crc kubenswrapper[4743]: I0122 14:54:15.346926 4743 scope.go:117] "RemoveContainer" containerID="820eaea253a3e03f7b0b1285403d4cb396294faf8620eae8e39839ed992f427b" Jan 22 14:54:15 crc kubenswrapper[4743]: I0122 14:54:15.388990 4743 scope.go:117] "RemoveContainer" containerID="65a196e3ad770f9b17d97a5ea4aa07557d2318d19dc78c4dfe8179b3cf144468" Jan 22 14:54:15 crc kubenswrapper[4743]: I0122 14:54:15.406507 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vdc8d"] Jan 22 14:54:15 crc kubenswrapper[4743]: I0122 14:54:15.436601 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vdc8d"] Jan 22 14:54:15 crc kubenswrapper[4743]: I0122 14:54:15.445622 4743 scope.go:117] "RemoveContainer" containerID="a9371b4ec9258dcb6a26ef55bb02e610c89ec8e6264b0c7f1936973c1eb597ad" Jan 22 14:54:15 crc kubenswrapper[4743]: E0122 14:54:15.447157 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9371b4ec9258dcb6a26ef55bb02e610c89ec8e6264b0c7f1936973c1eb597ad\": container with ID starting with a9371b4ec9258dcb6a26ef55bb02e610c89ec8e6264b0c7f1936973c1eb597ad not found: ID does not exist" containerID="a9371b4ec9258dcb6a26ef55bb02e610c89ec8e6264b0c7f1936973c1eb597ad" Jan 22 14:54:15 crc kubenswrapper[4743]: I0122 14:54:15.447196 4743 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9371b4ec9258dcb6a26ef55bb02e610c89ec8e6264b0c7f1936973c1eb597ad"} err="failed to get container status \"a9371b4ec9258dcb6a26ef55bb02e610c89ec8e6264b0c7f1936973c1eb597ad\": rpc error: code = NotFound desc = could not find container \"a9371b4ec9258dcb6a26ef55bb02e610c89ec8e6264b0c7f1936973c1eb597ad\": container with ID starting with a9371b4ec9258dcb6a26ef55bb02e610c89ec8e6264b0c7f1936973c1eb597ad not found: ID does not exist" Jan 22 14:54:15 crc kubenswrapper[4743]: I0122 14:54:15.447218 4743 scope.go:117] "RemoveContainer" containerID="820eaea253a3e03f7b0b1285403d4cb396294faf8620eae8e39839ed992f427b" Jan 22 14:54:15 crc kubenswrapper[4743]: E0122 14:54:15.447500 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"820eaea253a3e03f7b0b1285403d4cb396294faf8620eae8e39839ed992f427b\": container with ID starting with 820eaea253a3e03f7b0b1285403d4cb396294faf8620eae8e39839ed992f427b not found: ID does not exist" containerID="820eaea253a3e03f7b0b1285403d4cb396294faf8620eae8e39839ed992f427b" Jan 22 14:54:15 crc kubenswrapper[4743]: I0122 14:54:15.447558 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"820eaea253a3e03f7b0b1285403d4cb396294faf8620eae8e39839ed992f427b"} err="failed to get container status \"820eaea253a3e03f7b0b1285403d4cb396294faf8620eae8e39839ed992f427b\": rpc error: code = NotFound desc = could not find container \"820eaea253a3e03f7b0b1285403d4cb396294faf8620eae8e39839ed992f427b\": container with ID starting with 820eaea253a3e03f7b0b1285403d4cb396294faf8620eae8e39839ed992f427b not found: ID does not exist" Jan 22 14:54:15 crc kubenswrapper[4743]: I0122 14:54:15.447595 4743 scope.go:117] "RemoveContainer" containerID="65a196e3ad770f9b17d97a5ea4aa07557d2318d19dc78c4dfe8179b3cf144468" Jan 22 14:54:15 crc kubenswrapper[4743]: E0122 14:54:15.447968 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65a196e3ad770f9b17d97a5ea4aa07557d2318d19dc78c4dfe8179b3cf144468\": container with ID starting with 65a196e3ad770f9b17d97a5ea4aa07557d2318d19dc78c4dfe8179b3cf144468 not found: ID does not exist" containerID="65a196e3ad770f9b17d97a5ea4aa07557d2318d19dc78c4dfe8179b3cf144468" Jan 22 14:54:15 crc kubenswrapper[4743]: I0122 14:54:15.448018 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65a196e3ad770f9b17d97a5ea4aa07557d2318d19dc78c4dfe8179b3cf144468"} err="failed to get container status \"65a196e3ad770f9b17d97a5ea4aa07557d2318d19dc78c4dfe8179b3cf144468\": rpc error: code = NotFound desc = could not find container \"65a196e3ad770f9b17d97a5ea4aa07557d2318d19dc78c4dfe8179b3cf144468\": container with ID starting with 65a196e3ad770f9b17d97a5ea4aa07557d2318d19dc78c4dfe8179b3cf144468 not found: ID does not exist" Jan 22 14:54:15 crc kubenswrapper[4743]: I0122 14:54:15.766854 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7533ea4b-59e4-45c7-9f52-93aa1f4baa47" path="/var/lib/kubelet/pods/7533ea4b-59e4-45c7-9f52-93aa1f4baa47/volumes" Jan 22 14:54:24 crc kubenswrapper[4743]: I0122 14:54:24.467442 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_63d64b7b-89b2-468c-86e2-fe9de4338c0c/memcached/0.log" Jan 22 14:54:42 crc kubenswrapper[4743]: I0122 14:54:42.839958 4743 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-59dd8b7cbf-6kvwx_4eb53c43-8c71-4c15-862a-134fa6eb85d6/manager/0.log" Jan 22 14:54:43 crc kubenswrapper[4743]: I0122 14:54:43.234754 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-69cf5d4557-dn2mv_f6b9f418-b721-4fce-881e-791eceb6b0ef/manager/0.log" Jan 22 14:54:43 crc kubenswrapper[4743]: I0122 14:54:43.298487 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-b45d7bf98-w8s2s_aff36600-9c00-4a26-b311-a3d743333b0e/manager/0.log" Jan 22 14:54:43 crc kubenswrapper[4743]: I0122 14:54:43.461355 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_fb228b644d9a880fdedde3cfb113973472d7d7039cbb0f6e8a8e58fb345dhsd_3c8fbd52-c53e-4cb9-9087-483276a7c607/util/0.log" Jan 22 14:54:43 crc kubenswrapper[4743]: I0122 14:54:43.609876 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_fb228b644d9a880fdedde3cfb113973472d7d7039cbb0f6e8a8e58fb345dhsd_3c8fbd52-c53e-4cb9-9087-483276a7c607/pull/0.log" Jan 22 14:54:43 crc kubenswrapper[4743]: I0122 14:54:43.610593 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_fb228b644d9a880fdedde3cfb113973472d7d7039cbb0f6e8a8e58fb345dhsd_3c8fbd52-c53e-4cb9-9087-483276a7c607/pull/0.log" Jan 22 14:54:43 crc kubenswrapper[4743]: I0122 14:54:43.619443 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_fb228b644d9a880fdedde3cfb113973472d7d7039cbb0f6e8a8e58fb345dhsd_3c8fbd52-c53e-4cb9-9087-483276a7c607/util/0.log" Jan 22 14:54:43 crc kubenswrapper[4743]: I0122 14:54:43.843688 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_fb228b644d9a880fdedde3cfb113973472d7d7039cbb0f6e8a8e58fb345dhsd_3c8fbd52-c53e-4cb9-9087-483276a7c607/util/0.log" Jan 22 14:54:43 crc kubenswrapper[4743]: I0122 14:54:43.846920 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_fb228b644d9a880fdedde3cfb113973472d7d7039cbb0f6e8a8e58fb345dhsd_3c8fbd52-c53e-4cb9-9087-483276a7c607/extract/0.log" Jan 22 14:54:43 crc kubenswrapper[4743]: I0122 14:54:43.858569 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_fb228b644d9a880fdedde3cfb113973472d7d7039cbb0f6e8a8e58fb345dhsd_3c8fbd52-c53e-4cb9-9087-483276a7c607/pull/0.log" Jan 22 14:54:44 crc kubenswrapper[4743]: I0122 14:54:44.066524 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-78fdd796fd-mr8bn_3454a999-851a-47d1-ba12-64f77de4bd6a/manager/0.log" Jan 22 14:54:44 crc kubenswrapper[4743]: I0122 14:54:44.118773 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-594c8c9d5d-mxdkr_bad16498-5eda-4791-8577-6cf6ef07ca2a/manager/0.log" Jan 22 14:54:45 crc kubenswrapper[4743]: I0122 14:54:45.056945 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-77d5c5b54f-f7dhp_1fcc87bc-de60-44e2-b8b9-88c97eb2aec4/manager/0.log" Jan 22 14:54:45 crc kubenswrapper[4743]: I0122 14:54:45.269215 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-54ccf4f85d-8xxtr_7df36228-9543-4bb1-a0a7-d2ca51ac35a5/manager/0.log" Jan 22 14:54:45 crc kubenswrapper[4743]: I0122 14:54:45.276811 4743 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-69d6c9f5b8-nqr46_6221bb17-765b-4d72-8a74-70cdbc3447d9/manager/0.log" Jan 22 14:54:45 crc kubenswrapper[4743]: I0122 14:54:45.445627 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-b8b6d4659-24n5k_3e262e2d-6d13-4c04-9826-14ed89dde8ea/manager/0.log" Jan 22 14:54:45 crc kubenswrapper[4743]: I0122 14:54:45.504739 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-78c6999f6f-b77x5_96859e4c-bbb4-424b-bc02-2bd6e3b03484/manager/0.log" Jan 22 14:54:45 crc kubenswrapper[4743]: I0122 14:54:45.703742 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-c87fff755-jtwg6_924b89fa-b3de-46d6-b9c8-5be5e6d4795c/manager/0.log" Jan 22 14:54:45 crc kubenswrapper[4743]: I0122 14:54:45.836046 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5d8f59fb49-mhq65_064ac5cb-7d15-4502-b174-54236cdd0d51/manager/0.log" Jan 22 14:54:45 crc kubenswrapper[4743]: I0122 14:54:45.964337 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-6b8bc8d87d-mppfg_d1e52325-1801-4650-86bd-c1eb8f076714/manager/0.log" Jan 22 14:54:46 crc kubenswrapper[4743]: I0122 14:54:46.013404 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7bd9774b6-qfb2q_3b59a905-2607-4445-abee-ba43a1bdf41c/manager/0.log" Jan 22 14:54:46 crc kubenswrapper[4743]: I0122 14:54:46.642209 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6b68b8b854q5znv_4a00e91a-fbd8-496e-96e0-4fb25d7841fe/manager/0.log" Jan 22 14:54:46 crc kubenswrapper[4743]: I0122 14:54:46.774626 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-6ddb855d8-zmpc6_80c8233a-0396-4d24-8212-53346af8d405/operator/0.log" Jan 22 14:54:46 crc kubenswrapper[4743]: I0122 14:54:46.897022 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-dsm72_2ac18d53-2c89-4de9-8665-29d227f67a09/registry-server/0.log" Jan 22 14:54:47 crc kubenswrapper[4743]: I0122 14:54:47.021386 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-55db956ddc-995fd_6b4ae3c8-6f7f-4b76-91c0-3652f86422a6/manager/0.log" Jan 22 14:54:47 crc kubenswrapper[4743]: I0122 14:54:47.150406 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5d646b7d76-wsnt4_ee1a4864-faf3-49da-ac1a-ab864c677803/manager/0.log" Jan 22 14:54:47 crc kubenswrapper[4743]: I0122 14:54:47.289703 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-jzmrn_36be36d1-45a3-4e18-ba83-e4ae61363409/operator/0.log" Jan 22 14:54:47 crc kubenswrapper[4743]: I0122 14:54:47.454877 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-547cbdb99f-nck8q_cf7c633b-f013-4be6-a794-888a816a2ec2/manager/0.log" Jan 22 14:54:47 crc kubenswrapper[4743]: I0122 14:54:47.727838 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-85cd9769bb-srnbc_9d13aa32-eef2-427d-9398-507957b4c81c/manager/0.log" Jan 22 14:54:47 crc kubenswrapper[4743]: I0122 14:54:47.825942 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-69797bbcbd-b9bq2_d70fab64-d6ec-42ab-93ef-e882fc4d3f84/manager/0.log" Jan 22 14:54:47 crc kubenswrapper[4743]: I0122 14:54:47.846236 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-cdc5d4c7b-hk8dd_0855131d-976e-4cb5-83bb-9e47417d78f5/manager/0.log" Jan 22 14:54:47 crc kubenswrapper[4743]: I0122 14:54:47.930022 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-5ffb9c6597-w4cch_1cd779f7-75c7-4a5b-82f1-15a26703ed29/manager/0.log" Jan 22 14:55:07 crc kubenswrapper[4743]: I0122 14:55:07.031526 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-25vhb_3a5126ab-15b9-4b80-ab92-de1b1af3d4a7/control-plane-machine-set-operator/0.log" Jan 22 14:55:07 crc kubenswrapper[4743]: I0122 14:55:07.215819 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-g2ptk_b46225f4-dd80-45ae-9ffa-310527d770fc/machine-api-operator/0.log" Jan 22 14:55:07 crc kubenswrapper[4743]: I0122 14:55:07.233524 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-g2ptk_b46225f4-dd80-45ae-9ffa-310527d770fc/kube-rbac-proxy/0.log" Jan 22 14:55:19 crc kubenswrapper[4743]: I0122 14:55:19.920302 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-db7qx_47fd113c-6de2-4ad1-b307-2c9bcbdff0b8/cert-manager-controller/0.log" Jan 22 14:55:20 crc kubenswrapper[4743]: I0122 14:55:20.026155 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-s2ln7_760af996-e4d2-4507-9e19-a50aa50ceb8a/cert-manager-cainjector/0.log" Jan 22 14:55:20 crc kubenswrapper[4743]: I0122 14:55:20.104310 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-zdf5x_aaccb907-25f5-4992-8a7e-cc8d5bdf3bb1/cert-manager-webhook/0.log" Jan 22 14:55:30 crc kubenswrapper[4743]: I0122 14:55:30.048853 4743 patch_prober.go:28] interesting pod/machine-config-daemon-hqgk7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 14:55:30 crc kubenswrapper[4743]: I0122 14:55:30.049389 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 14:55:32 crc kubenswrapper[4743]: I0122 14:55:32.969004 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7754f76f8b-pvbjf_27e7180c-e024-4412-9840-ddeb074d70c8/nmstate-console-plugin/0.log" Jan 22 14:55:33 crc kubenswrapper[4743]: I0122 14:55:33.130763 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-handler-wpxjd_07e19a00-064c-401a-9c0c-4acd067e4e9e/nmstate-handler/0.log" Jan 22 14:55:33 crc kubenswrapper[4743]: I0122 14:55:33.201154 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-5kgz5_f50a3b80-43ad-46a0-b124-0249185f922b/kube-rbac-proxy/0.log" Jan 22 14:55:33 crc kubenswrapper[4743]: I0122 14:55:33.238008 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-5kgz5_f50a3b80-43ad-46a0-b124-0249185f922b/nmstate-metrics/0.log" Jan 22 14:55:33 crc kubenswrapper[4743]: I0122 14:55:33.313014 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-646758c888-f88rf_60d8bf9b-3641-4cd7-a809-0f77d3fae035/nmstate-operator/0.log" Jan 22 14:55:33 crc kubenswrapper[4743]: I0122 14:55:33.422012 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-8474b5b9d8-mtdch_33f98d3b-f0ea-45dd-8fca-d942067e31ad/nmstate-webhook/0.log" Jan 22 14:55:59 crc kubenswrapper[4743]: I0122 14:55:59.530442 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-4kwcx_9a99dab2-57e0-4830-8dc7-1bf40627f408/kube-rbac-proxy/0.log" Jan 22 14:55:59 crc kubenswrapper[4743]: I0122 14:55:59.641825 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-4kwcx_9a99dab2-57e0-4830-8dc7-1bf40627f408/controller/0.log" Jan 22 14:55:59 crc kubenswrapper[4743]: I0122 14:55:59.824317 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-st65g_2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b/cp-frr-files/0.log" Jan 22 14:55:59 crc kubenswrapper[4743]: I0122 14:55:59.957154 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-st65g_2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b/cp-frr-files/0.log" Jan 22 14:55:59 crc kubenswrapper[4743]: I0122 14:55:59.979180 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-st65g_2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b/cp-reloader/0.log" Jan 22 14:55:59 crc kubenswrapper[4743]: I0122 14:55:59.979952 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-st65g_2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b/cp-metrics/0.log" Jan 22 14:55:59 crc kubenswrapper[4743]: I0122 14:55:59.990321 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-st65g_2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b/cp-reloader/0.log" Jan 22 14:56:00 crc kubenswrapper[4743]: I0122 14:56:00.049620 4743 patch_prober.go:28] interesting pod/machine-config-daemon-hqgk7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 14:56:00 crc kubenswrapper[4743]: I0122 14:56:00.049694 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 14:56:00 crc kubenswrapper[4743]: I0122 14:56:00.196831 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-st65g_2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b/cp-reloader/0.log" Jan 
22 14:56:00 crc kubenswrapper[4743]: I0122 14:56:00.197284 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-st65g_2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b/cp-frr-files/0.log" Jan 22 14:56:00 crc kubenswrapper[4743]: I0122 14:56:00.208301 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-st65g_2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b/cp-metrics/0.log" Jan 22 14:56:00 crc kubenswrapper[4743]: I0122 14:56:00.209298 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-st65g_2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b/cp-metrics/0.log" Jan 22 14:56:00 crc kubenswrapper[4743]: I0122 14:56:00.398057 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-st65g_2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b/controller/0.log" Jan 22 14:56:00 crc kubenswrapper[4743]: I0122 14:56:00.399631 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-st65g_2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b/cp-metrics/0.log" Jan 22 14:56:00 crc kubenswrapper[4743]: I0122 14:56:00.402542 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-st65g_2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b/cp-reloader/0.log" Jan 22 14:56:00 crc kubenswrapper[4743]: I0122 14:56:00.472000 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-st65g_2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b/cp-frr-files/0.log" Jan 22 14:56:00 crc kubenswrapper[4743]: I0122 14:56:00.586999 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-st65g_2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b/frr-metrics/0.log" Jan 22 14:56:00 crc kubenswrapper[4743]: I0122 14:56:00.657254 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-st65g_2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b/kube-rbac-proxy-frr/0.log" Jan 22 14:56:00 crc kubenswrapper[4743]: I0122 14:56:00.673716 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-st65g_2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b/kube-rbac-proxy/0.log" Jan 22 14:56:00 crc kubenswrapper[4743]: I0122 14:56:00.763354 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-st65g_2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b/reloader/0.log" Jan 22 14:56:00 crc kubenswrapper[4743]: I0122 14:56:00.908500 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7df86c4f6c-hmp5v_fa2773b4-4a56-40e4-a2a9-6188bb40964f/frr-k8s-webhook-server/0.log" Jan 22 14:56:01 crc kubenswrapper[4743]: I0122 14:56:01.077464 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6494f4f8f8-zbvgv_dcd68957-0356-4eda-a65f-77e770aae844/manager/0.log" Jan 22 14:56:01 crc kubenswrapper[4743]: I0122 14:56:01.224556 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-65d5b677d7-mdls4_fe771c71-01ce-4513-bfa8-2393f3f055f2/webhook-server/0.log" Jan 22 14:56:01 crc kubenswrapper[4743]: I0122 14:56:01.419242 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-tl9sw_f17b4fff-f244-477f-912d-c2e93321094e/kube-rbac-proxy/0.log" Jan 22 14:56:01 crc kubenswrapper[4743]: I0122 14:56:01.859900 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-tl9sw_f17b4fff-f244-477f-912d-c2e93321094e/speaker/0.log" Jan 22 14:56:02 crc kubenswrapper[4743]: I0122 
14:56:02.031174 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-st65g_2cb79f0b-da4c-41bf-a45f-d68acb8c3c3b/frr/0.log" Jan 22 14:56:14 crc kubenswrapper[4743]: I0122 14:56:14.288739 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchtrhf_f8539c26-c29d-4ff4-91f8-c77fb1aa6a2a/util/0.log" Jan 22 14:56:14 crc kubenswrapper[4743]: I0122 14:56:14.513547 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchtrhf_f8539c26-c29d-4ff4-91f8-c77fb1aa6a2a/pull/0.log" Jan 22 14:56:14 crc kubenswrapper[4743]: I0122 14:56:14.575468 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchtrhf_f8539c26-c29d-4ff4-91f8-c77fb1aa6a2a/util/0.log" Jan 22 14:56:14 crc kubenswrapper[4743]: I0122 14:56:14.593925 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchtrhf_f8539c26-c29d-4ff4-91f8-c77fb1aa6a2a/pull/0.log" Jan 22 14:56:14 crc kubenswrapper[4743]: I0122 14:56:14.754168 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchtrhf_f8539c26-c29d-4ff4-91f8-c77fb1aa6a2a/util/0.log" Jan 22 14:56:14 crc kubenswrapper[4743]: I0122 14:56:14.787685 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchtrhf_f8539c26-c29d-4ff4-91f8-c77fb1aa6a2a/pull/0.log" Jan 22 14:56:14 crc kubenswrapper[4743]: I0122 14:56:14.828678 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchtrhf_f8539c26-c29d-4ff4-91f8-c77fb1aa6a2a/extract/0.log" Jan 22 14:56:14 crc kubenswrapper[4743]: I0122 14:56:14.961702 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713k4rbh_f39512c9-c53b-472c-a7ed-0760797a8601/util/0.log" Jan 22 14:56:15 crc kubenswrapper[4743]: I0122 14:56:15.136766 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713k4rbh_f39512c9-c53b-472c-a7ed-0760797a8601/pull/0.log" Jan 22 14:56:15 crc kubenswrapper[4743]: I0122 14:56:15.143554 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713k4rbh_f39512c9-c53b-472c-a7ed-0760797a8601/util/0.log" Jan 22 14:56:15 crc kubenswrapper[4743]: I0122 14:56:15.160337 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713k4rbh_f39512c9-c53b-472c-a7ed-0760797a8601/pull/0.log" Jan 22 14:56:15 crc kubenswrapper[4743]: I0122 14:56:15.316111 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713k4rbh_f39512c9-c53b-472c-a7ed-0760797a8601/util/0.log" Jan 22 14:56:15 crc kubenswrapper[4743]: I0122 14:56:15.347907 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713k4rbh_f39512c9-c53b-472c-a7ed-0760797a8601/extract/0.log" Jan 22 14:56:15 crc kubenswrapper[4743]: I0122 14:56:15.381832 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713k4rbh_f39512c9-c53b-472c-a7ed-0760797a8601/pull/0.log" Jan 22 14:56:15 crc kubenswrapper[4743]: I0122 14:56:15.542462 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vsg7v_edde6004-a1f7-4818-b66a-02137d1f3749/extract-utilities/0.log" Jan 22 14:56:15 crc kubenswrapper[4743]: I0122 14:56:15.681784 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vsg7v_edde6004-a1f7-4818-b66a-02137d1f3749/extract-content/0.log" Jan 22 14:56:15 crc kubenswrapper[4743]: I0122 14:56:15.685084 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vsg7v_edde6004-a1f7-4818-b66a-02137d1f3749/extract-utilities/0.log" Jan 22 14:56:15 crc kubenswrapper[4743]: I0122 14:56:15.698400 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vsg7v_edde6004-a1f7-4818-b66a-02137d1f3749/extract-content/0.log" Jan 22 14:56:15 crc kubenswrapper[4743]: I0122 14:56:15.942846 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vsg7v_edde6004-a1f7-4818-b66a-02137d1f3749/extract-content/0.log" Jan 22 14:56:15 crc kubenswrapper[4743]: I0122 14:56:15.950598 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vsg7v_edde6004-a1f7-4818-b66a-02137d1f3749/extract-utilities/0.log" Jan 22 14:56:16 crc kubenswrapper[4743]: I0122 14:56:16.189368 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-slrnr_d37a61b7-0edd-4c4d-8fe5-f1cc4f9ba472/extract-utilities/0.log" Jan 22 14:56:16 crc kubenswrapper[4743]: I0122 14:56:16.431932 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-slrnr_d37a61b7-0edd-4c4d-8fe5-f1cc4f9ba472/extract-content/0.log" Jan 22 14:56:16 crc kubenswrapper[4743]: I0122 14:56:16.454212 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-slrnr_d37a61b7-0edd-4c4d-8fe5-f1cc4f9ba472/extract-utilities/0.log" Jan 22 14:56:16 crc kubenswrapper[4743]: I0122 14:56:16.459841 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-slrnr_d37a61b7-0edd-4c4d-8fe5-f1cc4f9ba472/extract-content/0.log" Jan 22 14:56:16 crc kubenswrapper[4743]: I0122 14:56:16.563396 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vsg7v_edde6004-a1f7-4818-b66a-02137d1f3749/registry-server/0.log" Jan 22 14:56:16 crc kubenswrapper[4743]: I0122 14:56:16.635815 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-slrnr_d37a61b7-0edd-4c4d-8fe5-f1cc4f9ba472/extract-utilities/0.log" Jan 22 14:56:16 crc kubenswrapper[4743]: I0122 14:56:16.669258 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-slrnr_d37a61b7-0edd-4c4d-8fe5-f1cc4f9ba472/extract-content/0.log" Jan 22 14:56:16 crc kubenswrapper[4743]: I0122 14:56:16.921509 4743 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-6sdp5_ac4d223b-b4ca-485a-aa22-1fbdb0a3228e/marketplace-operator/0.log" Jan 22 14:56:17 crc kubenswrapper[4743]: I0122 14:56:17.109320 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-lzgcl_474b968c-8a07-4c97-b0a8-1595e2e91317/extract-utilities/0.log" Jan 22 14:56:17 crc kubenswrapper[4743]: I0122 14:56:17.361084 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-lzgcl_474b968c-8a07-4c97-b0a8-1595e2e91317/extract-utilities/0.log" Jan 22 14:56:17 crc kubenswrapper[4743]: I0122 14:56:17.408443 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-lzgcl_474b968c-8a07-4c97-b0a8-1595e2e91317/extract-content/0.log" Jan 22 14:56:17 crc kubenswrapper[4743]: I0122 14:56:17.428729 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-slrnr_d37a61b7-0edd-4c4d-8fe5-f1cc4f9ba472/registry-server/0.log" Jan 22 14:56:17 crc kubenswrapper[4743]: I0122 14:56:17.441064 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-lzgcl_474b968c-8a07-4c97-b0a8-1595e2e91317/extract-content/0.log" Jan 22 14:56:17 crc kubenswrapper[4743]: I0122 14:56:17.592589 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-lzgcl_474b968c-8a07-4c97-b0a8-1595e2e91317/extract-content/0.log" Jan 22 14:56:17 crc kubenswrapper[4743]: I0122 14:56:17.594169 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-lzgcl_474b968c-8a07-4c97-b0a8-1595e2e91317/extract-utilities/0.log" Jan 22 14:56:17 crc kubenswrapper[4743]: I0122 14:56:17.742962 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lmrjc_49986793-49f7-49ae-bcb8-699016d9d894/extract-utilities/0.log" Jan 22 14:56:17 crc kubenswrapper[4743]: I0122 14:56:17.798332 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-lzgcl_474b968c-8a07-4c97-b0a8-1595e2e91317/registry-server/0.log" Jan 22 14:56:18 crc kubenswrapper[4743]: I0122 14:56:18.015723 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lmrjc_49986793-49f7-49ae-bcb8-699016d9d894/extract-content/0.log" Jan 22 14:56:18 crc kubenswrapper[4743]: I0122 14:56:18.018802 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lmrjc_49986793-49f7-49ae-bcb8-699016d9d894/extract-utilities/0.log" Jan 22 14:56:18 crc kubenswrapper[4743]: I0122 14:56:18.030298 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lmrjc_49986793-49f7-49ae-bcb8-699016d9d894/extract-content/0.log" Jan 22 14:56:18 crc kubenswrapper[4743]: I0122 14:56:18.186225 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lmrjc_49986793-49f7-49ae-bcb8-699016d9d894/extract-content/0.log" Jan 22 14:56:18 crc kubenswrapper[4743]: I0122 14:56:18.203590 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lmrjc_49986793-49f7-49ae-bcb8-699016d9d894/extract-utilities/0.log" Jan 22 14:56:18 crc kubenswrapper[4743]: I0122 14:56:18.670776 4743 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lmrjc_49986793-49f7-49ae-bcb8-699016d9d894/registry-server/0.log" Jan 22 14:56:30 crc kubenswrapper[4743]: I0122 14:56:30.050463 4743 patch_prober.go:28] interesting pod/machine-config-daemon-hqgk7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 14:56:30 crc kubenswrapper[4743]: I0122 14:56:30.051144 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 14:56:30 crc kubenswrapper[4743]: I0122 14:56:30.051191 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" Jan 22 14:56:30 crc kubenswrapper[4743]: I0122 14:56:30.051920 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"56f8576199722ef7beb003193adeea0903840f65ddf8b35cb9e24ccb32f0ae7f"} pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 22 14:56:30 crc kubenswrapper[4743]: I0122 14:56:30.051974 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerName="machine-config-daemon" containerID="cri-o://56f8576199722ef7beb003193adeea0903840f65ddf8b35cb9e24ccb32f0ae7f" gracePeriod=600 Jan 22 14:56:30 crc kubenswrapper[4743]: E0122 14:56:30.177018 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:56:30 crc kubenswrapper[4743]: I0122 14:56:30.567983 4743 generic.go:334] "Generic (PLEG): container finished" podID="3aeba9ba-3a5a-4885-8540-d295aadb311b" containerID="56f8576199722ef7beb003193adeea0903840f65ddf8b35cb9e24ccb32f0ae7f" exitCode=0 Jan 22 14:56:30 crc kubenswrapper[4743]: I0122 14:56:30.568045 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" event={"ID":"3aeba9ba-3a5a-4885-8540-d295aadb311b","Type":"ContainerDied","Data":"56f8576199722ef7beb003193adeea0903840f65ddf8b35cb9e24ccb32f0ae7f"} Jan 22 14:56:30 crc kubenswrapper[4743]: I0122 14:56:30.568130 4743 scope.go:117] "RemoveContainer" containerID="54b33301e0c1269e5b25b06e84317615d57f5c4dc992d00dcb2ed5ecfdfa7773" Jan 22 14:56:30 crc kubenswrapper[4743]: I0122 14:56:30.569236 4743 scope.go:117] "RemoveContainer" containerID="56f8576199722ef7beb003193adeea0903840f65ddf8b35cb9e24ccb32f0ae7f" Jan 22 14:56:30 crc kubenswrapper[4743]: E0122 14:56:30.569480 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:56:41 crc kubenswrapper[4743]: I0122 14:56:41.747191 4743 scope.go:117] "RemoveContainer" containerID="56f8576199722ef7beb003193adeea0903840f65ddf8b35cb9e24ccb32f0ae7f" Jan 22 14:56:41 crc kubenswrapper[4743]: E0122 14:56:41.747811 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:56:53 crc kubenswrapper[4743]: I0122 14:56:53.755760 4743 scope.go:117] "RemoveContainer" containerID="56f8576199722ef7beb003193adeea0903840f65ddf8b35cb9e24ccb32f0ae7f" Jan 22 14:56:53 crc kubenswrapper[4743]: E0122 14:56:53.756642 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:57:04 crc kubenswrapper[4743]: I0122 14:57:04.747553 4743 scope.go:117] "RemoveContainer" containerID="56f8576199722ef7beb003193adeea0903840f65ddf8b35cb9e24ccb32f0ae7f" Jan 22 14:57:04 crc kubenswrapper[4743]: E0122 14:57:04.748424 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:57:16 crc kubenswrapper[4743]: I0122 14:57:16.747326 4743 scope.go:117] "RemoveContainer" containerID="56f8576199722ef7beb003193adeea0903840f65ddf8b35cb9e24ccb32f0ae7f" Jan 22 14:57:16 crc kubenswrapper[4743]: E0122 14:57:16.749277 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:57:30 crc kubenswrapper[4743]: I0122 14:57:30.747098 4743 scope.go:117] "RemoveContainer" containerID="56f8576199722ef7beb003193adeea0903840f65ddf8b35cb9e24ccb32f0ae7f" Jan 22 14:57:30 crc kubenswrapper[4743]: E0122 14:57:30.748335 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:57:36 crc kubenswrapper[4743]: I0122 14:57:36.656913 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5ffb7"] Jan 22 14:57:36 crc kubenswrapper[4743]: E0122 14:57:36.657847 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7533ea4b-59e4-45c7-9f52-93aa1f4baa47" containerName="extract-utilities" Jan 22 14:57:36 crc kubenswrapper[4743]: I0122 14:57:36.657860 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="7533ea4b-59e4-45c7-9f52-93aa1f4baa47" containerName="extract-utilities" Jan 22 14:57:36 crc kubenswrapper[4743]: E0122 14:57:36.657869 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7533ea4b-59e4-45c7-9f52-93aa1f4baa47" containerName="extract-content" Jan 22 14:57:36 crc kubenswrapper[4743]: I0122 14:57:36.657876 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="7533ea4b-59e4-45c7-9f52-93aa1f4baa47" containerName="extract-content" Jan 22 14:57:36 crc kubenswrapper[4743]: E0122 14:57:36.657886 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7533ea4b-59e4-45c7-9f52-93aa1f4baa47" containerName="registry-server" Jan 22 14:57:36 crc kubenswrapper[4743]: I0122 14:57:36.657892 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="7533ea4b-59e4-45c7-9f52-93aa1f4baa47" containerName="registry-server" Jan 22 14:57:36 crc kubenswrapper[4743]: I0122 14:57:36.658098 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="7533ea4b-59e4-45c7-9f52-93aa1f4baa47" containerName="registry-server" Jan 22 14:57:36 crc kubenswrapper[4743]: I0122 14:57:36.659460 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5ffb7" Jan 22 14:57:36 crc kubenswrapper[4743]: I0122 14:57:36.676241 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5ffb7"] Jan 22 14:57:36 crc kubenswrapper[4743]: I0122 14:57:36.770224 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdr8l\" (UniqueName: \"kubernetes.io/projected/7dfee226-de2e-4c82-a6db-8fe436a9045f-kube-api-access-zdr8l\") pod \"certified-operators-5ffb7\" (UID: \"7dfee226-de2e-4c82-a6db-8fe436a9045f\") " pod="openshift-marketplace/certified-operators-5ffb7" Jan 22 14:57:36 crc kubenswrapper[4743]: I0122 14:57:36.770407 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7dfee226-de2e-4c82-a6db-8fe436a9045f-catalog-content\") pod \"certified-operators-5ffb7\" (UID: \"7dfee226-de2e-4c82-a6db-8fe436a9045f\") " pod="openshift-marketplace/certified-operators-5ffb7" Jan 22 14:57:36 crc kubenswrapper[4743]: I0122 14:57:36.770574 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7dfee226-de2e-4c82-a6db-8fe436a9045f-utilities\") pod \"certified-operators-5ffb7\" (UID: \"7dfee226-de2e-4c82-a6db-8fe436a9045f\") " pod="openshift-marketplace/certified-operators-5ffb7" Jan 22 14:57:36 crc kubenswrapper[4743]: I0122 14:57:36.872561 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdr8l\" (UniqueName: \"kubernetes.io/projected/7dfee226-de2e-4c82-a6db-8fe436a9045f-kube-api-access-zdr8l\") pod \"certified-operators-5ffb7\" (UID: \"7dfee226-de2e-4c82-a6db-8fe436a9045f\") " pod="openshift-marketplace/certified-operators-5ffb7" Jan 22 14:57:36 crc kubenswrapper[4743]: I0122 14:57:36.872644 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7dfee226-de2e-4c82-a6db-8fe436a9045f-catalog-content\") pod \"certified-operators-5ffb7\" (UID: \"7dfee226-de2e-4c82-a6db-8fe436a9045f\") " pod="openshift-marketplace/certified-operators-5ffb7" Jan 22 14:57:36 crc kubenswrapper[4743]: I0122 14:57:36.872694 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7dfee226-de2e-4c82-a6db-8fe436a9045f-utilities\") pod \"certified-operators-5ffb7\" (UID: \"7dfee226-de2e-4c82-a6db-8fe436a9045f\") " pod="openshift-marketplace/certified-operators-5ffb7" Jan 22 14:57:36 crc kubenswrapper[4743]: I0122 14:57:36.873229 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7dfee226-de2e-4c82-a6db-8fe436a9045f-utilities\") pod \"certified-operators-5ffb7\" (UID: \"7dfee226-de2e-4c82-a6db-8fe436a9045f\") " pod="openshift-marketplace/certified-operators-5ffb7" Jan 22 14:57:36 crc kubenswrapper[4743]: I0122 14:57:36.873734 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7dfee226-de2e-4c82-a6db-8fe436a9045f-catalog-content\") pod \"certified-operators-5ffb7\" (UID: \"7dfee226-de2e-4c82-a6db-8fe436a9045f\") " pod="openshift-marketplace/certified-operators-5ffb7" Jan 22 14:57:36 crc kubenswrapper[4743]: I0122 14:57:36.890772 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-zdr8l\" (UniqueName: \"kubernetes.io/projected/7dfee226-de2e-4c82-a6db-8fe436a9045f-kube-api-access-zdr8l\") pod \"certified-operators-5ffb7\" (UID: \"7dfee226-de2e-4c82-a6db-8fe436a9045f\") " pod="openshift-marketplace/certified-operators-5ffb7" Jan 22 14:57:36 crc kubenswrapper[4743]: I0122 14:57:36.988239 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5ffb7" Jan 22 14:57:37 crc kubenswrapper[4743]: I0122 14:57:37.565566 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5ffb7"] Jan 22 14:57:38 crc kubenswrapper[4743]: I0122 14:57:38.198120 4743 generic.go:334] "Generic (PLEG): container finished" podID="7dfee226-de2e-4c82-a6db-8fe436a9045f" containerID="ac072ec245d4c600d0f0187e0c1d615832b01f0aaf127e300749973c45f906d5" exitCode=0 Jan 22 14:57:38 crc kubenswrapper[4743]: I0122 14:57:38.198168 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5ffb7" event={"ID":"7dfee226-de2e-4c82-a6db-8fe436a9045f","Type":"ContainerDied","Data":"ac072ec245d4c600d0f0187e0c1d615832b01f0aaf127e300749973c45f906d5"} Jan 22 14:57:38 crc kubenswrapper[4743]: I0122 14:57:38.198192 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5ffb7" event={"ID":"7dfee226-de2e-4c82-a6db-8fe436a9045f","Type":"ContainerStarted","Data":"1f6c7a7d1b4a359520019d05c12782a691ee5a4bcee3d734ff1419eae409f3cf"} Jan 22 14:57:38 crc kubenswrapper[4743]: I0122 14:57:38.200565 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 22 14:57:40 crc kubenswrapper[4743]: I0122 14:57:40.217352 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5ffb7" event={"ID":"7dfee226-de2e-4c82-a6db-8fe436a9045f","Type":"ContainerStarted","Data":"09f8f9669ac52c7af7c88a6709384366e6bbe039fd704cec8a9f2a2fe7210323"} Jan 22 14:57:41 crc kubenswrapper[4743]: I0122 14:57:41.226294 4743 generic.go:334] "Generic (PLEG): container finished" podID="7dfee226-de2e-4c82-a6db-8fe436a9045f" containerID="09f8f9669ac52c7af7c88a6709384366e6bbe039fd704cec8a9f2a2fe7210323" exitCode=0 Jan 22 14:57:41 crc kubenswrapper[4743]: I0122 14:57:41.226559 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5ffb7" event={"ID":"7dfee226-de2e-4c82-a6db-8fe436a9045f","Type":"ContainerDied","Data":"09f8f9669ac52c7af7c88a6709384366e6bbe039fd704cec8a9f2a2fe7210323"} Jan 22 14:57:42 crc kubenswrapper[4743]: I0122 14:57:42.238953 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5ffb7" event={"ID":"7dfee226-de2e-4c82-a6db-8fe436a9045f","Type":"ContainerStarted","Data":"afda86a3f6bd31ece4395a01174018256ade29f9d6cbca13f14f06c1cc4e1716"} Jan 22 14:57:42 crc kubenswrapper[4743]: I0122 14:57:42.260330 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5ffb7" podStartSLOduration=2.784980835 podStartE2EDuration="6.260312288s" podCreationTimestamp="2026-01-22 14:57:36 +0000 UTC" firstStartedPulling="2026-01-22 14:57:38.200184406 +0000 UTC m=+4294.755227609" lastFinishedPulling="2026-01-22 14:57:41.675515899 +0000 UTC m=+4298.230559062" observedRunningTime="2026-01-22 14:57:42.258937541 +0000 UTC m=+4298.813980704" watchObservedRunningTime="2026-01-22 
14:57:42.260312288 +0000 UTC m=+4298.815355451" Jan 22 14:57:42 crc kubenswrapper[4743]: I0122 14:57:42.747673 4743 scope.go:117] "RemoveContainer" containerID="56f8576199722ef7beb003193adeea0903840f65ddf8b35cb9e24ccb32f0ae7f" Jan 22 14:57:42 crc kubenswrapper[4743]: E0122 14:57:42.747947 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:57:46 crc kubenswrapper[4743]: I0122 14:57:46.989228 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5ffb7" Jan 22 14:57:46 crc kubenswrapper[4743]: I0122 14:57:46.989742 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5ffb7" Jan 22 14:57:47 crc kubenswrapper[4743]: I0122 14:57:47.059367 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5ffb7" Jan 22 14:57:47 crc kubenswrapper[4743]: I0122 14:57:47.329352 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5ffb7" Jan 22 14:57:47 crc kubenswrapper[4743]: I0122 14:57:47.373340 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5ffb7"] Jan 22 14:57:49 crc kubenswrapper[4743]: I0122 14:57:49.302622 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5ffb7" podUID="7dfee226-de2e-4c82-a6db-8fe436a9045f" containerName="registry-server" containerID="cri-o://afda86a3f6bd31ece4395a01174018256ade29f9d6cbca13f14f06c1cc4e1716" gracePeriod=2 Jan 22 14:57:50 crc kubenswrapper[4743]: I0122 14:57:50.075493 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5ffb7" Jan 22 14:57:50 crc kubenswrapper[4743]: I0122 14:57:50.245456 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdr8l\" (UniqueName: \"kubernetes.io/projected/7dfee226-de2e-4c82-a6db-8fe436a9045f-kube-api-access-zdr8l\") pod \"7dfee226-de2e-4c82-a6db-8fe436a9045f\" (UID: \"7dfee226-de2e-4c82-a6db-8fe436a9045f\") " Jan 22 14:57:50 crc kubenswrapper[4743]: I0122 14:57:50.246046 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7dfee226-de2e-4c82-a6db-8fe436a9045f-catalog-content\") pod \"7dfee226-de2e-4c82-a6db-8fe436a9045f\" (UID: \"7dfee226-de2e-4c82-a6db-8fe436a9045f\") " Jan 22 14:57:50 crc kubenswrapper[4743]: I0122 14:57:50.246134 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7dfee226-de2e-4c82-a6db-8fe436a9045f-utilities\") pod \"7dfee226-de2e-4c82-a6db-8fe436a9045f\" (UID: \"7dfee226-de2e-4c82-a6db-8fe436a9045f\") " Jan 22 14:57:50 crc kubenswrapper[4743]: I0122 14:57:50.247626 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7dfee226-de2e-4c82-a6db-8fe436a9045f-utilities" (OuterVolumeSpecName: "utilities") pod "7dfee226-de2e-4c82-a6db-8fe436a9045f" (UID: "7dfee226-de2e-4c82-a6db-8fe436a9045f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:57:50 crc kubenswrapper[4743]: I0122 14:57:50.262024 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7dfee226-de2e-4c82-a6db-8fe436a9045f-kube-api-access-zdr8l" (OuterVolumeSpecName: "kube-api-access-zdr8l") pod "7dfee226-de2e-4c82-a6db-8fe436a9045f" (UID: "7dfee226-de2e-4c82-a6db-8fe436a9045f"). InnerVolumeSpecName "kube-api-access-zdr8l". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:57:50 crc kubenswrapper[4743]: I0122 14:57:50.305155 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7dfee226-de2e-4c82-a6db-8fe436a9045f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7dfee226-de2e-4c82-a6db-8fe436a9045f" (UID: "7dfee226-de2e-4c82-a6db-8fe436a9045f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:57:50 crc kubenswrapper[4743]: I0122 14:57:50.316974 4743 generic.go:334] "Generic (PLEG): container finished" podID="7dfee226-de2e-4c82-a6db-8fe436a9045f" containerID="afda86a3f6bd31ece4395a01174018256ade29f9d6cbca13f14f06c1cc4e1716" exitCode=0 Jan 22 14:57:50 crc kubenswrapper[4743]: I0122 14:57:50.317045 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5ffb7" event={"ID":"7dfee226-de2e-4c82-a6db-8fe436a9045f","Type":"ContainerDied","Data":"afda86a3f6bd31ece4395a01174018256ade29f9d6cbca13f14f06c1cc4e1716"} Jan 22 14:57:50 crc kubenswrapper[4743]: I0122 14:57:50.317083 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5ffb7" event={"ID":"7dfee226-de2e-4c82-a6db-8fe436a9045f","Type":"ContainerDied","Data":"1f6c7a7d1b4a359520019d05c12782a691ee5a4bcee3d734ff1419eae409f3cf"} Jan 22 14:57:50 crc kubenswrapper[4743]: I0122 14:57:50.317095 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5ffb7" Jan 22 14:57:50 crc kubenswrapper[4743]: I0122 14:57:50.317111 4743 scope.go:117] "RemoveContainer" containerID="afda86a3f6bd31ece4395a01174018256ade29f9d6cbca13f14f06c1cc4e1716" Jan 22 14:57:50 crc kubenswrapper[4743]: I0122 14:57:50.342930 4743 scope.go:117] "RemoveContainer" containerID="09f8f9669ac52c7af7c88a6709384366e6bbe039fd704cec8a9f2a2fe7210323" Jan 22 14:57:50 crc kubenswrapper[4743]: I0122 14:57:50.348216 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdr8l\" (UniqueName: \"kubernetes.io/projected/7dfee226-de2e-4c82-a6db-8fe436a9045f-kube-api-access-zdr8l\") on node \"crc\" DevicePath \"\"" Jan 22 14:57:50 crc kubenswrapper[4743]: I0122 14:57:50.348245 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7dfee226-de2e-4c82-a6db-8fe436a9045f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 14:57:50 crc kubenswrapper[4743]: I0122 14:57:50.348257 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7dfee226-de2e-4c82-a6db-8fe436a9045f-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 14:57:50 crc kubenswrapper[4743]: I0122 14:57:50.353667 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5ffb7"] Jan 22 14:57:50 crc kubenswrapper[4743]: I0122 14:57:50.361738 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5ffb7"] Jan 22 14:57:50 crc kubenswrapper[4743]: I0122 14:57:50.369068 4743 scope.go:117] "RemoveContainer" containerID="ac072ec245d4c600d0f0187e0c1d615832b01f0aaf127e300749973c45f906d5" Jan 22 14:57:50 crc kubenswrapper[4743]: I0122 14:57:50.430827 4743 scope.go:117] "RemoveContainer" containerID="afda86a3f6bd31ece4395a01174018256ade29f9d6cbca13f14f06c1cc4e1716" Jan 22 14:57:50 crc kubenswrapper[4743]: E0122 14:57:50.431538 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afda86a3f6bd31ece4395a01174018256ade29f9d6cbca13f14f06c1cc4e1716\": container with ID starting with afda86a3f6bd31ece4395a01174018256ade29f9d6cbca13f14f06c1cc4e1716 not found: ID does not exist" containerID="afda86a3f6bd31ece4395a01174018256ade29f9d6cbca13f14f06c1cc4e1716" Jan 22 14:57:50 crc kubenswrapper[4743]: I0122 14:57:50.431590 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afda86a3f6bd31ece4395a01174018256ade29f9d6cbca13f14f06c1cc4e1716"} err="failed to get container status \"afda86a3f6bd31ece4395a01174018256ade29f9d6cbca13f14f06c1cc4e1716\": rpc error: code = NotFound desc = could not find container \"afda86a3f6bd31ece4395a01174018256ade29f9d6cbca13f14f06c1cc4e1716\": container with ID starting with afda86a3f6bd31ece4395a01174018256ade29f9d6cbca13f14f06c1cc4e1716 not found: ID does not exist" Jan 22 14:57:50 crc kubenswrapper[4743]: I0122 14:57:50.431623 4743 scope.go:117] "RemoveContainer" containerID="09f8f9669ac52c7af7c88a6709384366e6bbe039fd704cec8a9f2a2fe7210323" Jan 22 14:57:50 crc kubenswrapper[4743]: E0122 14:57:50.431932 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09f8f9669ac52c7af7c88a6709384366e6bbe039fd704cec8a9f2a2fe7210323\": container with ID starting with 09f8f9669ac52c7af7c88a6709384366e6bbe039fd704cec8a9f2a2fe7210323 not found: 
ID does not exist" containerID="09f8f9669ac52c7af7c88a6709384366e6bbe039fd704cec8a9f2a2fe7210323" Jan 22 14:57:50 crc kubenswrapper[4743]: I0122 14:57:50.431965 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09f8f9669ac52c7af7c88a6709384366e6bbe039fd704cec8a9f2a2fe7210323"} err="failed to get container status \"09f8f9669ac52c7af7c88a6709384366e6bbe039fd704cec8a9f2a2fe7210323\": rpc error: code = NotFound desc = could not find container \"09f8f9669ac52c7af7c88a6709384366e6bbe039fd704cec8a9f2a2fe7210323\": container with ID starting with 09f8f9669ac52c7af7c88a6709384366e6bbe039fd704cec8a9f2a2fe7210323 not found: ID does not exist" Jan 22 14:57:50 crc kubenswrapper[4743]: I0122 14:57:50.431990 4743 scope.go:117] "RemoveContainer" containerID="ac072ec245d4c600d0f0187e0c1d615832b01f0aaf127e300749973c45f906d5" Jan 22 14:57:50 crc kubenswrapper[4743]: E0122 14:57:50.432260 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac072ec245d4c600d0f0187e0c1d615832b01f0aaf127e300749973c45f906d5\": container with ID starting with ac072ec245d4c600d0f0187e0c1d615832b01f0aaf127e300749973c45f906d5 not found: ID does not exist" containerID="ac072ec245d4c600d0f0187e0c1d615832b01f0aaf127e300749973c45f906d5" Jan 22 14:57:50 crc kubenswrapper[4743]: I0122 14:57:50.432283 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac072ec245d4c600d0f0187e0c1d615832b01f0aaf127e300749973c45f906d5"} err="failed to get container status \"ac072ec245d4c600d0f0187e0c1d615832b01f0aaf127e300749973c45f906d5\": rpc error: code = NotFound desc = could not find container \"ac072ec245d4c600d0f0187e0c1d615832b01f0aaf127e300749973c45f906d5\": container with ID starting with ac072ec245d4c600d0f0187e0c1d615832b01f0aaf127e300749973c45f906d5 not found: ID does not exist" Jan 22 14:57:51 crc kubenswrapper[4743]: I0122 14:57:51.768443 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7dfee226-de2e-4c82-a6db-8fe436a9045f" path="/var/lib/kubelet/pods/7dfee226-de2e-4c82-a6db-8fe436a9045f/volumes" Jan 22 14:57:54 crc kubenswrapper[4743]: I0122 14:57:54.747025 4743 scope.go:117] "RemoveContainer" containerID="56f8576199722ef7beb003193adeea0903840f65ddf8b35cb9e24ccb32f0ae7f" Jan 22 14:57:54 crc kubenswrapper[4743]: E0122 14:57:54.747684 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:58:04 crc kubenswrapper[4743]: I0122 14:58:04.495494 4743 generic.go:334] "Generic (PLEG): container finished" podID="599b58e0-a9f5-49ff-ae29-7ab848cd6f88" containerID="773c5445e3e50c0ffd920c13ba7ca5f4c704b081fa2b08b644996cdde94d8684" exitCode=0 Jan 22 14:58:04 crc kubenswrapper[4743]: I0122 14:58:04.495576 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t8dcr/must-gather-krzft" event={"ID":"599b58e0-a9f5-49ff-ae29-7ab848cd6f88","Type":"ContainerDied","Data":"773c5445e3e50c0ffd920c13ba7ca5f4c704b081fa2b08b644996cdde94d8684"} Jan 22 14:58:04 crc kubenswrapper[4743]: I0122 14:58:04.496726 4743 scope.go:117] "RemoveContainer" 
containerID="773c5445e3e50c0ffd920c13ba7ca5f4c704b081fa2b08b644996cdde94d8684" Jan 22 14:58:04 crc kubenswrapper[4743]: I0122 14:58:04.691651 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-t8dcr_must-gather-krzft_599b58e0-a9f5-49ff-ae29-7ab848cd6f88/gather/0.log" Jan 22 14:58:07 crc kubenswrapper[4743]: I0122 14:58:07.748172 4743 scope.go:117] "RemoveContainer" containerID="56f8576199722ef7beb003193adeea0903840f65ddf8b35cb9e24ccb32f0ae7f" Jan 22 14:58:07 crc kubenswrapper[4743]: E0122 14:58:07.749770 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:58:15 crc kubenswrapper[4743]: I0122 14:58:15.161413 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-t8dcr/must-gather-krzft"] Jan 22 14:58:15 crc kubenswrapper[4743]: I0122 14:58:15.163033 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-t8dcr/must-gather-krzft" podUID="599b58e0-a9f5-49ff-ae29-7ab848cd6f88" containerName="copy" containerID="cri-o://73d9e6ce220e18d7f1909cea0bfe0cac91d7e5997047cf39943386557a852b24" gracePeriod=2 Jan 22 14:58:15 crc kubenswrapper[4743]: I0122 14:58:15.178993 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-t8dcr/must-gather-krzft"] Jan 22 14:58:15 crc kubenswrapper[4743]: I0122 14:58:15.595732 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-t8dcr_must-gather-krzft_599b58e0-a9f5-49ff-ae29-7ab848cd6f88/copy/0.log" Jan 22 14:58:15 crc kubenswrapper[4743]: I0122 14:58:15.596862 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t8dcr/must-gather-krzft" Jan 22 14:58:15 crc kubenswrapper[4743]: I0122 14:58:15.628012 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-t8dcr_must-gather-krzft_599b58e0-a9f5-49ff-ae29-7ab848cd6f88/copy/0.log" Jan 22 14:58:15 crc kubenswrapper[4743]: I0122 14:58:15.628367 4743 generic.go:334] "Generic (PLEG): container finished" podID="599b58e0-a9f5-49ff-ae29-7ab848cd6f88" containerID="73d9e6ce220e18d7f1909cea0bfe0cac91d7e5997047cf39943386557a852b24" exitCode=143 Jan 22 14:58:15 crc kubenswrapper[4743]: I0122 14:58:15.628416 4743 scope.go:117] "RemoveContainer" containerID="73d9e6ce220e18d7f1909cea0bfe0cac91d7e5997047cf39943386557a852b24" Jan 22 14:58:15 crc kubenswrapper[4743]: I0122 14:58:15.628457 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-t8dcr/must-gather-krzft" Jan 22 14:58:15 crc kubenswrapper[4743]: I0122 14:58:15.652900 4743 scope.go:117] "RemoveContainer" containerID="773c5445e3e50c0ffd920c13ba7ca5f4c704b081fa2b08b644996cdde94d8684" Jan 22 14:58:15 crc kubenswrapper[4743]: I0122 14:58:15.694489 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wb9br\" (UniqueName: \"kubernetes.io/projected/599b58e0-a9f5-49ff-ae29-7ab848cd6f88-kube-api-access-wb9br\") pod \"599b58e0-a9f5-49ff-ae29-7ab848cd6f88\" (UID: \"599b58e0-a9f5-49ff-ae29-7ab848cd6f88\") " Jan 22 14:58:15 crc kubenswrapper[4743]: I0122 14:58:15.694530 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/599b58e0-a9f5-49ff-ae29-7ab848cd6f88-must-gather-output\") pod \"599b58e0-a9f5-49ff-ae29-7ab848cd6f88\" (UID: \"599b58e0-a9f5-49ff-ae29-7ab848cd6f88\") " Jan 22 14:58:15 crc kubenswrapper[4743]: I0122 14:58:15.701989 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/599b58e0-a9f5-49ff-ae29-7ab848cd6f88-kube-api-access-wb9br" (OuterVolumeSpecName: "kube-api-access-wb9br") pod "599b58e0-a9f5-49ff-ae29-7ab848cd6f88" (UID: "599b58e0-a9f5-49ff-ae29-7ab848cd6f88"). InnerVolumeSpecName "kube-api-access-wb9br". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:58:15 crc kubenswrapper[4743]: I0122 14:58:15.750183 4743 scope.go:117] "RemoveContainer" containerID="73d9e6ce220e18d7f1909cea0bfe0cac91d7e5997047cf39943386557a852b24" Jan 22 14:58:15 crc kubenswrapper[4743]: E0122 14:58:15.750856 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73d9e6ce220e18d7f1909cea0bfe0cac91d7e5997047cf39943386557a852b24\": container with ID starting with 73d9e6ce220e18d7f1909cea0bfe0cac91d7e5997047cf39943386557a852b24 not found: ID does not exist" containerID="73d9e6ce220e18d7f1909cea0bfe0cac91d7e5997047cf39943386557a852b24" Jan 22 14:58:15 crc kubenswrapper[4743]: I0122 14:58:15.750933 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73d9e6ce220e18d7f1909cea0bfe0cac91d7e5997047cf39943386557a852b24"} err="failed to get container status \"73d9e6ce220e18d7f1909cea0bfe0cac91d7e5997047cf39943386557a852b24\": rpc error: code = NotFound desc = could not find container \"73d9e6ce220e18d7f1909cea0bfe0cac91d7e5997047cf39943386557a852b24\": container with ID starting with 73d9e6ce220e18d7f1909cea0bfe0cac91d7e5997047cf39943386557a852b24 not found: ID does not exist" Jan 22 14:58:15 crc kubenswrapper[4743]: I0122 14:58:15.750989 4743 scope.go:117] "RemoveContainer" containerID="773c5445e3e50c0ffd920c13ba7ca5f4c704b081fa2b08b644996cdde94d8684" Jan 22 14:58:15 crc kubenswrapper[4743]: E0122 14:58:15.751391 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"773c5445e3e50c0ffd920c13ba7ca5f4c704b081fa2b08b644996cdde94d8684\": container with ID starting with 773c5445e3e50c0ffd920c13ba7ca5f4c704b081fa2b08b644996cdde94d8684 not found: ID does not exist" containerID="773c5445e3e50c0ffd920c13ba7ca5f4c704b081fa2b08b644996cdde94d8684" Jan 22 14:58:15 crc kubenswrapper[4743]: I0122 14:58:15.751525 4743 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"773c5445e3e50c0ffd920c13ba7ca5f4c704b081fa2b08b644996cdde94d8684"} err="failed to get container status \"773c5445e3e50c0ffd920c13ba7ca5f4c704b081fa2b08b644996cdde94d8684\": rpc error: code = NotFound desc = could not find container \"773c5445e3e50c0ffd920c13ba7ca5f4c704b081fa2b08b644996cdde94d8684\": container with ID starting with 773c5445e3e50c0ffd920c13ba7ca5f4c704b081fa2b08b644996cdde94d8684 not found: ID does not exist" Jan 22 14:58:15 crc kubenswrapper[4743]: I0122 14:58:15.796523 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wb9br\" (UniqueName: \"kubernetes.io/projected/599b58e0-a9f5-49ff-ae29-7ab848cd6f88-kube-api-access-wb9br\") on node \"crc\" DevicePath \"\"" Jan 22 14:58:15 crc kubenswrapper[4743]: I0122 14:58:15.898869 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/599b58e0-a9f5-49ff-ae29-7ab848cd6f88-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "599b58e0-a9f5-49ff-ae29-7ab848cd6f88" (UID: "599b58e0-a9f5-49ff-ae29-7ab848cd6f88"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:58:16 crc kubenswrapper[4743]: I0122 14:58:16.001425 4743 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/599b58e0-a9f5-49ff-ae29-7ab848cd6f88-must-gather-output\") on node \"crc\" DevicePath \"\"" Jan 22 14:58:17 crc kubenswrapper[4743]: I0122 14:58:17.767739 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="599b58e0-a9f5-49ff-ae29-7ab848cd6f88" path="/var/lib/kubelet/pods/599b58e0-a9f5-49ff-ae29-7ab848cd6f88/volumes" Jan 22 14:58:19 crc kubenswrapper[4743]: I0122 14:58:19.747881 4743 scope.go:117] "RemoveContainer" containerID="56f8576199722ef7beb003193adeea0903840f65ddf8b35cb9e24ccb32f0ae7f" Jan 22 14:58:19 crc kubenswrapper[4743]: E0122 14:58:19.748505 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:58:34 crc kubenswrapper[4743]: I0122 14:58:34.749231 4743 scope.go:117] "RemoveContainer" containerID="56f8576199722ef7beb003193adeea0903840f65ddf8b35cb9e24ccb32f0ae7f" Jan 22 14:58:34 crc kubenswrapper[4743]: E0122 14:58:34.750765 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:58:35 crc kubenswrapper[4743]: I0122 14:58:35.906318 4743 scope.go:117] "RemoveContainer" containerID="e692ec2502355420b5c6abbc625b5ece59d866aad2999380de128641c1d0c0f6" Jan 22 14:58:35 crc kubenswrapper[4743]: I0122 14:58:35.929150 4743 scope.go:117] "RemoveContainer" containerID="fe7d0498102b0b7fd8dbbf9a430539d0dcf99cff8acafd3ef458135b7dab1571" Jan 22 14:58:35 crc kubenswrapper[4743]: I0122 14:58:35.950061 4743 scope.go:117] "RemoveContainer" 
containerID="e4d3c03d133e9d20cbc685e416cbec4459606c6ef122f2c3e996162c19492420" Jan 22 14:58:47 crc kubenswrapper[4743]: I0122 14:58:47.748151 4743 scope.go:117] "RemoveContainer" containerID="56f8576199722ef7beb003193adeea0903840f65ddf8b35cb9e24ccb32f0ae7f" Jan 22 14:58:47 crc kubenswrapper[4743]: E0122 14:58:47.749274 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:58:58 crc kubenswrapper[4743]: I0122 14:58:58.747609 4743 scope.go:117] "RemoveContainer" containerID="56f8576199722ef7beb003193adeea0903840f65ddf8b35cb9e24ccb32f0ae7f" Jan 22 14:58:58 crc kubenswrapper[4743]: E0122 14:58:58.748388 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:59:12 crc kubenswrapper[4743]: I0122 14:59:12.747882 4743 scope.go:117] "RemoveContainer" containerID="56f8576199722ef7beb003193adeea0903840f65ddf8b35cb9e24ccb32f0ae7f" Jan 22 14:59:12 crc kubenswrapper[4743]: E0122 14:59:12.748715 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:59:23 crc kubenswrapper[4743]: I0122 14:59:23.756897 4743 scope.go:117] "RemoveContainer" containerID="56f8576199722ef7beb003193adeea0903840f65ddf8b35cb9e24ccb32f0ae7f" Jan 22 14:59:23 crc kubenswrapper[4743]: E0122 14:59:23.757994 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:59:34 crc kubenswrapper[4743]: I0122 14:59:34.748878 4743 scope.go:117] "RemoveContainer" containerID="56f8576199722ef7beb003193adeea0903840f65ddf8b35cb9e24ccb32f0ae7f" Jan 22 14:59:34 crc kubenswrapper[4743]: E0122 14:59:34.750040 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:59:45 crc kubenswrapper[4743]: I0122 14:59:45.747955 4743 scope.go:117] "RemoveContainer" 
containerID="56f8576199722ef7beb003193adeea0903840f65ddf8b35cb9e24ccb32f0ae7f" Jan 22 14:59:45 crc kubenswrapper[4743]: E0122 14:59:45.748877 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 14:59:56 crc kubenswrapper[4743]: I0122 14:59:56.747190 4743 scope.go:117] "RemoveContainer" containerID="56f8576199722ef7beb003193adeea0903840f65ddf8b35cb9e24ccb32f0ae7f" Jan 22 14:59:56 crc kubenswrapper[4743]: E0122 14:59:56.747900 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 15:00:00 crc kubenswrapper[4743]: I0122 15:00:00.179818 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29484900-kgv8z"] Jan 22 15:00:00 crc kubenswrapper[4743]: E0122 15:00:00.180510 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="599b58e0-a9f5-49ff-ae29-7ab848cd6f88" containerName="gather" Jan 22 15:00:00 crc kubenswrapper[4743]: I0122 15:00:00.180522 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="599b58e0-a9f5-49ff-ae29-7ab848cd6f88" containerName="gather" Jan 22 15:00:00 crc kubenswrapper[4743]: E0122 15:00:00.180548 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dfee226-de2e-4c82-a6db-8fe436a9045f" containerName="extract-utilities" Jan 22 15:00:00 crc kubenswrapper[4743]: I0122 15:00:00.180555 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dfee226-de2e-4c82-a6db-8fe436a9045f" containerName="extract-utilities" Jan 22 15:00:00 crc kubenswrapper[4743]: E0122 15:00:00.180567 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dfee226-de2e-4c82-a6db-8fe436a9045f" containerName="extract-content" Jan 22 15:00:00 crc kubenswrapper[4743]: I0122 15:00:00.180574 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dfee226-de2e-4c82-a6db-8fe436a9045f" containerName="extract-content" Jan 22 15:00:00 crc kubenswrapper[4743]: E0122 15:00:00.180584 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="599b58e0-a9f5-49ff-ae29-7ab848cd6f88" containerName="copy" Jan 22 15:00:00 crc kubenswrapper[4743]: I0122 15:00:00.180590 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="599b58e0-a9f5-49ff-ae29-7ab848cd6f88" containerName="copy" Jan 22 15:00:00 crc kubenswrapper[4743]: E0122 15:00:00.180607 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dfee226-de2e-4c82-a6db-8fe436a9045f" containerName="registry-server" Jan 22 15:00:00 crc kubenswrapper[4743]: I0122 15:00:00.180613 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dfee226-de2e-4c82-a6db-8fe436a9045f" containerName="registry-server" Jan 22 15:00:00 crc kubenswrapper[4743]: I0122 15:00:00.180775 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="7dfee226-de2e-4c82-a6db-8fe436a9045f" 
containerName="registry-server" Jan 22 15:00:00 crc kubenswrapper[4743]: I0122 15:00:00.180891 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="599b58e0-a9f5-49ff-ae29-7ab848cd6f88" containerName="gather" Jan 22 15:00:00 crc kubenswrapper[4743]: I0122 15:00:00.180914 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="599b58e0-a9f5-49ff-ae29-7ab848cd6f88" containerName="copy" Jan 22 15:00:00 crc kubenswrapper[4743]: I0122 15:00:00.181543 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29484900-kgv8z" Jan 22 15:00:00 crc kubenswrapper[4743]: I0122 15:00:00.183531 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 22 15:00:00 crc kubenswrapper[4743]: I0122 15:00:00.184678 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 22 15:00:00 crc kubenswrapper[4743]: I0122 15:00:00.190511 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29484900-kgv8z"] Jan 22 15:00:00 crc kubenswrapper[4743]: I0122 15:00:00.332861 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/99e2b67c-b361-436b-9f17-ee397af07b41-config-volume\") pod \"collect-profiles-29484900-kgv8z\" (UID: \"99e2b67c-b361-436b-9f17-ee397af07b41\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484900-kgv8z" Jan 22 15:00:00 crc kubenswrapper[4743]: I0122 15:00:00.333277 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkkjs\" (UniqueName: \"kubernetes.io/projected/99e2b67c-b361-436b-9f17-ee397af07b41-kube-api-access-hkkjs\") pod \"collect-profiles-29484900-kgv8z\" (UID: \"99e2b67c-b361-436b-9f17-ee397af07b41\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484900-kgv8z" Jan 22 15:00:00 crc kubenswrapper[4743]: I0122 15:00:00.333378 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/99e2b67c-b361-436b-9f17-ee397af07b41-secret-volume\") pod \"collect-profiles-29484900-kgv8z\" (UID: \"99e2b67c-b361-436b-9f17-ee397af07b41\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484900-kgv8z" Jan 22 15:00:00 crc kubenswrapper[4743]: I0122 15:00:00.435087 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/99e2b67c-b361-436b-9f17-ee397af07b41-secret-volume\") pod \"collect-profiles-29484900-kgv8z\" (UID: \"99e2b67c-b361-436b-9f17-ee397af07b41\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484900-kgv8z" Jan 22 15:00:00 crc kubenswrapper[4743]: I0122 15:00:00.435227 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/99e2b67c-b361-436b-9f17-ee397af07b41-config-volume\") pod \"collect-profiles-29484900-kgv8z\" (UID: \"99e2b67c-b361-436b-9f17-ee397af07b41\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484900-kgv8z" Jan 22 15:00:00 crc kubenswrapper[4743]: I0122 15:00:00.435298 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-hkkjs\" (UniqueName: \"kubernetes.io/projected/99e2b67c-b361-436b-9f17-ee397af07b41-kube-api-access-hkkjs\") pod \"collect-profiles-29484900-kgv8z\" (UID: \"99e2b67c-b361-436b-9f17-ee397af07b41\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484900-kgv8z" Jan 22 15:00:00 crc kubenswrapper[4743]: I0122 15:00:00.436748 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/99e2b67c-b361-436b-9f17-ee397af07b41-config-volume\") pod \"collect-profiles-29484900-kgv8z\" (UID: \"99e2b67c-b361-436b-9f17-ee397af07b41\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484900-kgv8z" Jan 22 15:00:00 crc kubenswrapper[4743]: I0122 15:00:00.442580 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/99e2b67c-b361-436b-9f17-ee397af07b41-secret-volume\") pod \"collect-profiles-29484900-kgv8z\" (UID: \"99e2b67c-b361-436b-9f17-ee397af07b41\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484900-kgv8z" Jan 22 15:00:00 crc kubenswrapper[4743]: I0122 15:00:00.451898 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkkjs\" (UniqueName: \"kubernetes.io/projected/99e2b67c-b361-436b-9f17-ee397af07b41-kube-api-access-hkkjs\") pod \"collect-profiles-29484900-kgv8z\" (UID: \"99e2b67c-b361-436b-9f17-ee397af07b41\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484900-kgv8z" Jan 22 15:00:00 crc kubenswrapper[4743]: I0122 15:00:00.500243 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29484900-kgv8z" Jan 22 15:00:00 crc kubenswrapper[4743]: I0122 15:00:00.925959 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29484900-kgv8z"] Jan 22 15:00:01 crc kubenswrapper[4743]: I0122 15:00:01.641928 4743 generic.go:334] "Generic (PLEG): container finished" podID="99e2b67c-b361-436b-9f17-ee397af07b41" containerID="3f5eb6583462f4b68da72a1bbaa17c94a984c99aab528bd67e4c36486a3383f1" exitCode=0 Jan 22 15:00:01 crc kubenswrapper[4743]: I0122 15:00:01.642003 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29484900-kgv8z" event={"ID":"99e2b67c-b361-436b-9f17-ee397af07b41","Type":"ContainerDied","Data":"3f5eb6583462f4b68da72a1bbaa17c94a984c99aab528bd67e4c36486a3383f1"} Jan 22 15:00:01 crc kubenswrapper[4743]: I0122 15:00:01.642208 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29484900-kgv8z" event={"ID":"99e2b67c-b361-436b-9f17-ee397af07b41","Type":"ContainerStarted","Data":"b82ad68a810035f375b1fcdcac1c66027bd3a28ef005c13c36a4fcbed6fc8a0b"} Jan 22 15:00:03 crc kubenswrapper[4743]: I0122 15:00:03.053670 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29484900-kgv8z" Jan 22 15:00:03 crc kubenswrapper[4743]: I0122 15:00:03.183681 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/99e2b67c-b361-436b-9f17-ee397af07b41-secret-volume\") pod \"99e2b67c-b361-436b-9f17-ee397af07b41\" (UID: \"99e2b67c-b361-436b-9f17-ee397af07b41\") " Jan 22 15:00:03 crc kubenswrapper[4743]: I0122 15:00:03.183779 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/99e2b67c-b361-436b-9f17-ee397af07b41-config-volume\") pod \"99e2b67c-b361-436b-9f17-ee397af07b41\" (UID: \"99e2b67c-b361-436b-9f17-ee397af07b41\") " Jan 22 15:00:03 crc kubenswrapper[4743]: I0122 15:00:03.183957 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkkjs\" (UniqueName: \"kubernetes.io/projected/99e2b67c-b361-436b-9f17-ee397af07b41-kube-api-access-hkkjs\") pod \"99e2b67c-b361-436b-9f17-ee397af07b41\" (UID: \"99e2b67c-b361-436b-9f17-ee397af07b41\") " Jan 22 15:00:03 crc kubenswrapper[4743]: I0122 15:00:03.184696 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99e2b67c-b361-436b-9f17-ee397af07b41-config-volume" (OuterVolumeSpecName: "config-volume") pod "99e2b67c-b361-436b-9f17-ee397af07b41" (UID: "99e2b67c-b361-436b-9f17-ee397af07b41"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 15:00:03 crc kubenswrapper[4743]: I0122 15:00:03.191943 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99e2b67c-b361-436b-9f17-ee397af07b41-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "99e2b67c-b361-436b-9f17-ee397af07b41" (UID: "99e2b67c-b361-436b-9f17-ee397af07b41"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 15:00:03 crc kubenswrapper[4743]: I0122 15:00:03.192007 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99e2b67c-b361-436b-9f17-ee397af07b41-kube-api-access-hkkjs" (OuterVolumeSpecName: "kube-api-access-hkkjs") pod "99e2b67c-b361-436b-9f17-ee397af07b41" (UID: "99e2b67c-b361-436b-9f17-ee397af07b41"). InnerVolumeSpecName "kube-api-access-hkkjs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 15:00:03 crc kubenswrapper[4743]: I0122 15:00:03.286297 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hkkjs\" (UniqueName: \"kubernetes.io/projected/99e2b67c-b361-436b-9f17-ee397af07b41-kube-api-access-hkkjs\") on node \"crc\" DevicePath \"\"" Jan 22 15:00:03 crc kubenswrapper[4743]: I0122 15:00:03.286327 4743 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/99e2b67c-b361-436b-9f17-ee397af07b41-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 22 15:00:03 crc kubenswrapper[4743]: I0122 15:00:03.286337 4743 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/99e2b67c-b361-436b-9f17-ee397af07b41-config-volume\") on node \"crc\" DevicePath \"\"" Jan 22 15:00:03 crc kubenswrapper[4743]: I0122 15:00:03.660582 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29484900-kgv8z" event={"ID":"99e2b67c-b361-436b-9f17-ee397af07b41","Type":"ContainerDied","Data":"b82ad68a810035f375b1fcdcac1c66027bd3a28ef005c13c36a4fcbed6fc8a0b"} Jan 22 15:00:03 crc kubenswrapper[4743]: I0122 15:00:03.660621 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b82ad68a810035f375b1fcdcac1c66027bd3a28ef005c13c36a4fcbed6fc8a0b" Jan 22 15:00:03 crc kubenswrapper[4743]: I0122 15:00:03.660646 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29484900-kgv8z" Jan 22 15:00:04 crc kubenswrapper[4743]: I0122 15:00:04.137741 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29484855-fdptj"] Jan 22 15:00:04 crc kubenswrapper[4743]: I0122 15:00:04.145981 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29484855-fdptj"] Jan 22 15:00:05 crc kubenswrapper[4743]: I0122 15:00:05.764956 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="465f8a75-af56-4af8-ae05-f5468f0aa3c1" path="/var/lib/kubelet/pods/465f8a75-af56-4af8-ae05-f5468f0aa3c1/volumes" Jan 22 15:00:10 crc kubenswrapper[4743]: I0122 15:00:10.746902 4743 scope.go:117] "RemoveContainer" containerID="56f8576199722ef7beb003193adeea0903840f65ddf8b35cb9e24ccb32f0ae7f" Jan 22 15:00:10 crc kubenswrapper[4743]: E0122 15:00:10.747749 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 15:00:21 crc kubenswrapper[4743]: I0122 15:00:21.749021 4743 scope.go:117] "RemoveContainer" containerID="56f8576199722ef7beb003193adeea0903840f65ddf8b35cb9e24ccb32f0ae7f" Jan 22 15:00:21 crc kubenswrapper[4743]: E0122 15:00:21.749716 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 15:00:32 crc kubenswrapper[4743]: I0122 15:00:32.747666 4743 scope.go:117] "RemoveContainer" containerID="56f8576199722ef7beb003193adeea0903840f65ddf8b35cb9e24ccb32f0ae7f" Jan 22 15:00:32 crc kubenswrapper[4743]: E0122 15:00:32.748414 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 15:00:36 crc kubenswrapper[4743]: I0122 15:00:36.110705 4743 scope.go:117] "RemoveContainer" containerID="50b6aaff72656fc91afdcbebe97b517b41df0830f1a957b87bec8d3cfc8a9086" Jan 22 15:00:47 crc kubenswrapper[4743]: I0122 15:00:47.747191 4743 scope.go:117] "RemoveContainer" containerID="56f8576199722ef7beb003193adeea0903840f65ddf8b35cb9e24ccb32f0ae7f" Jan 22 15:00:47 crc kubenswrapper[4743]: E0122 15:00:47.748483 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 15:01:00 crc kubenswrapper[4743]: I0122 15:01:00.157331 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29484901-h8vtp"] Jan 22 15:01:00 crc kubenswrapper[4743]: E0122 15:01:00.160001 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99e2b67c-b361-436b-9f17-ee397af07b41" containerName="collect-profiles" Jan 22 15:01:00 crc kubenswrapper[4743]: I0122 15:01:00.160169 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="99e2b67c-b361-436b-9f17-ee397af07b41" containerName="collect-profiles" Jan 22 15:01:00 crc kubenswrapper[4743]: I0122 15:01:00.160690 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="99e2b67c-b361-436b-9f17-ee397af07b41" containerName="collect-profiles" Jan 22 15:01:00 crc kubenswrapper[4743]: I0122 15:01:00.161926 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29484901-h8vtp" Jan 22 15:01:00 crc kubenswrapper[4743]: I0122 15:01:00.167716 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29484901-h8vtp"] Jan 22 15:01:00 crc kubenswrapper[4743]: I0122 15:01:00.259819 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3433624-001f-4126-92b0-5a7dd775081c-config-data\") pod \"keystone-cron-29484901-h8vtp\" (UID: \"c3433624-001f-4126-92b0-5a7dd775081c\") " pod="openstack/keystone-cron-29484901-h8vtp" Jan 22 15:01:00 crc kubenswrapper[4743]: I0122 15:01:00.260254 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4cx6\" (UniqueName: \"kubernetes.io/projected/c3433624-001f-4126-92b0-5a7dd775081c-kube-api-access-p4cx6\") pod \"keystone-cron-29484901-h8vtp\" (UID: \"c3433624-001f-4126-92b0-5a7dd775081c\") " pod="openstack/keystone-cron-29484901-h8vtp" Jan 22 15:01:00 crc kubenswrapper[4743]: I0122 15:01:00.260449 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c3433624-001f-4126-92b0-5a7dd775081c-fernet-keys\") pod \"keystone-cron-29484901-h8vtp\" (UID: \"c3433624-001f-4126-92b0-5a7dd775081c\") " pod="openstack/keystone-cron-29484901-h8vtp" Jan 22 15:01:00 crc kubenswrapper[4743]: I0122 15:01:00.260708 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3433624-001f-4126-92b0-5a7dd775081c-combined-ca-bundle\") pod \"keystone-cron-29484901-h8vtp\" (UID: \"c3433624-001f-4126-92b0-5a7dd775081c\") " pod="openstack/keystone-cron-29484901-h8vtp" Jan 22 15:01:00 crc kubenswrapper[4743]: I0122 15:01:00.362677 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3433624-001f-4126-92b0-5a7dd775081c-combined-ca-bundle\") pod \"keystone-cron-29484901-h8vtp\" (UID: \"c3433624-001f-4126-92b0-5a7dd775081c\") " pod="openstack/keystone-cron-29484901-h8vtp" Jan 22 15:01:00 crc kubenswrapper[4743]: I0122 15:01:00.362814 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3433624-001f-4126-92b0-5a7dd775081c-config-data\") pod \"keystone-cron-29484901-h8vtp\" (UID: \"c3433624-001f-4126-92b0-5a7dd775081c\") " pod="openstack/keystone-cron-29484901-h8vtp" Jan 22 15:01:00 crc kubenswrapper[4743]: I0122 15:01:00.362897 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4cx6\" (UniqueName: \"kubernetes.io/projected/c3433624-001f-4126-92b0-5a7dd775081c-kube-api-access-p4cx6\") pod \"keystone-cron-29484901-h8vtp\" (UID: \"c3433624-001f-4126-92b0-5a7dd775081c\") " pod="openstack/keystone-cron-29484901-h8vtp" Jan 22 15:01:00 crc kubenswrapper[4743]: I0122 15:01:00.363019 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c3433624-001f-4126-92b0-5a7dd775081c-fernet-keys\") pod \"keystone-cron-29484901-h8vtp\" (UID: \"c3433624-001f-4126-92b0-5a7dd775081c\") " pod="openstack/keystone-cron-29484901-h8vtp" Jan 22 15:01:00 crc kubenswrapper[4743]: I0122 15:01:00.371991 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3433624-001f-4126-92b0-5a7dd775081c-config-data\") pod \"keystone-cron-29484901-h8vtp\" (UID: \"c3433624-001f-4126-92b0-5a7dd775081c\") " pod="openstack/keystone-cron-29484901-h8vtp" Jan 22 15:01:00 crc kubenswrapper[4743]: I0122 15:01:00.372396 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c3433624-001f-4126-92b0-5a7dd775081c-fernet-keys\") pod \"keystone-cron-29484901-h8vtp\" (UID: \"c3433624-001f-4126-92b0-5a7dd775081c\") " pod="openstack/keystone-cron-29484901-h8vtp" Jan 22 15:01:00 crc kubenswrapper[4743]: I0122 15:01:00.386124 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3433624-001f-4126-92b0-5a7dd775081c-combined-ca-bundle\") pod \"keystone-cron-29484901-h8vtp\" (UID: \"c3433624-001f-4126-92b0-5a7dd775081c\") " pod="openstack/keystone-cron-29484901-h8vtp" Jan 22 15:01:00 crc kubenswrapper[4743]: I0122 15:01:00.393939 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4cx6\" (UniqueName: \"kubernetes.io/projected/c3433624-001f-4126-92b0-5a7dd775081c-kube-api-access-p4cx6\") pod \"keystone-cron-29484901-h8vtp\" (UID: \"c3433624-001f-4126-92b0-5a7dd775081c\") " pod="openstack/keystone-cron-29484901-h8vtp" Jan 22 15:01:00 crc kubenswrapper[4743]: I0122 15:01:00.498981 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29484901-h8vtp" Jan 22 15:01:01 crc kubenswrapper[4743]: I0122 15:01:01.004965 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29484901-h8vtp"] Jan 22 15:01:01 crc kubenswrapper[4743]: I0122 15:01:01.182552 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29484901-h8vtp" event={"ID":"c3433624-001f-4126-92b0-5a7dd775081c","Type":"ContainerStarted","Data":"6648eb16987c60901b5c4b57c7a7a44625085ad20f4ec9939e91c1727c327afe"} Jan 22 15:01:01 crc kubenswrapper[4743]: I0122 15:01:01.747094 4743 scope.go:117] "RemoveContainer" containerID="56f8576199722ef7beb003193adeea0903840f65ddf8b35cb9e24ccb32f0ae7f" Jan 22 15:01:01 crc kubenswrapper[4743]: E0122 15:01:01.747639 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 15:01:02 crc kubenswrapper[4743]: I0122 15:01:02.193394 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29484901-h8vtp" event={"ID":"c3433624-001f-4126-92b0-5a7dd775081c","Type":"ContainerStarted","Data":"8933bc1bc0569c5080e8edd662d9bea328201c11ebba3992ff5a4d471a46bfc2"} Jan 22 15:01:02 crc kubenswrapper[4743]: I0122 15:01:02.219728 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29484901-h8vtp" podStartSLOduration=2.219688922 podStartE2EDuration="2.219688922s" podCreationTimestamp="2026-01-22 15:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 15:01:02.211110362 +0000 UTC m=+4498.766153525" 
watchObservedRunningTime="2026-01-22 15:01:02.219688922 +0000 UTC m=+4498.774732105" Jan 22 15:01:03 crc kubenswrapper[4743]: I0122 15:01:03.205034 4743 generic.go:334] "Generic (PLEG): container finished" podID="c3433624-001f-4126-92b0-5a7dd775081c" containerID="8933bc1bc0569c5080e8edd662d9bea328201c11ebba3992ff5a4d471a46bfc2" exitCode=0 Jan 22 15:01:03 crc kubenswrapper[4743]: I0122 15:01:03.205335 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29484901-h8vtp" event={"ID":"c3433624-001f-4126-92b0-5a7dd775081c","Type":"ContainerDied","Data":"8933bc1bc0569c5080e8edd662d9bea328201c11ebba3992ff5a4d471a46bfc2"} Jan 22 15:01:04 crc kubenswrapper[4743]: I0122 15:01:04.576855 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29484901-h8vtp" Jan 22 15:01:04 crc kubenswrapper[4743]: I0122 15:01:04.662065 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3433624-001f-4126-92b0-5a7dd775081c-config-data\") pod \"c3433624-001f-4126-92b0-5a7dd775081c\" (UID: \"c3433624-001f-4126-92b0-5a7dd775081c\") " Jan 22 15:01:04 crc kubenswrapper[4743]: I0122 15:01:04.662258 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c3433624-001f-4126-92b0-5a7dd775081c-fernet-keys\") pod \"c3433624-001f-4126-92b0-5a7dd775081c\" (UID: \"c3433624-001f-4126-92b0-5a7dd775081c\") " Jan 22 15:01:04 crc kubenswrapper[4743]: I0122 15:01:04.662302 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3433624-001f-4126-92b0-5a7dd775081c-combined-ca-bundle\") pod \"c3433624-001f-4126-92b0-5a7dd775081c\" (UID: \"c3433624-001f-4126-92b0-5a7dd775081c\") " Jan 22 15:01:04 crc kubenswrapper[4743]: I0122 15:01:04.662456 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4cx6\" (UniqueName: \"kubernetes.io/projected/c3433624-001f-4126-92b0-5a7dd775081c-kube-api-access-p4cx6\") pod \"c3433624-001f-4126-92b0-5a7dd775081c\" (UID: \"c3433624-001f-4126-92b0-5a7dd775081c\") " Jan 22 15:01:04 crc kubenswrapper[4743]: I0122 15:01:04.667818 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3433624-001f-4126-92b0-5a7dd775081c-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "c3433624-001f-4126-92b0-5a7dd775081c" (UID: "c3433624-001f-4126-92b0-5a7dd775081c"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 15:01:04 crc kubenswrapper[4743]: I0122 15:01:04.667862 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3433624-001f-4126-92b0-5a7dd775081c-kube-api-access-p4cx6" (OuterVolumeSpecName: "kube-api-access-p4cx6") pod "c3433624-001f-4126-92b0-5a7dd775081c" (UID: "c3433624-001f-4126-92b0-5a7dd775081c"). InnerVolumeSpecName "kube-api-access-p4cx6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 15:01:04 crc kubenswrapper[4743]: I0122 15:01:04.706318 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3433624-001f-4126-92b0-5a7dd775081c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c3433624-001f-4126-92b0-5a7dd775081c" (UID: "c3433624-001f-4126-92b0-5a7dd775081c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 15:01:04 crc kubenswrapper[4743]: I0122 15:01:04.719714 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3433624-001f-4126-92b0-5a7dd775081c-config-data" (OuterVolumeSpecName: "config-data") pod "c3433624-001f-4126-92b0-5a7dd775081c" (UID: "c3433624-001f-4126-92b0-5a7dd775081c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 15:01:04 crc kubenswrapper[4743]: I0122 15:01:04.764906 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4cx6\" (UniqueName: \"kubernetes.io/projected/c3433624-001f-4126-92b0-5a7dd775081c-kube-api-access-p4cx6\") on node \"crc\" DevicePath \"\"" Jan 22 15:01:04 crc kubenswrapper[4743]: I0122 15:01:04.764950 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3433624-001f-4126-92b0-5a7dd775081c-config-data\") on node \"crc\" DevicePath \"\"" Jan 22 15:01:04 crc kubenswrapper[4743]: I0122 15:01:04.764959 4743 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c3433624-001f-4126-92b0-5a7dd775081c-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 22 15:01:04 crc kubenswrapper[4743]: I0122 15:01:04.764967 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3433624-001f-4126-92b0-5a7dd775081c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 15:01:05 crc kubenswrapper[4743]: I0122 15:01:05.223528 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29484901-h8vtp" event={"ID":"c3433624-001f-4126-92b0-5a7dd775081c","Type":"ContainerDied","Data":"6648eb16987c60901b5c4b57c7a7a44625085ad20f4ec9939e91c1727c327afe"} Jan 22 15:01:05 crc kubenswrapper[4743]: I0122 15:01:05.223588 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6648eb16987c60901b5c4b57c7a7a44625085ad20f4ec9939e91c1727c327afe" Jan 22 15:01:05 crc kubenswrapper[4743]: I0122 15:01:05.223623 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29484901-h8vtp" Jan 22 15:01:15 crc kubenswrapper[4743]: I0122 15:01:15.747986 4743 scope.go:117] "RemoveContainer" containerID="56f8576199722ef7beb003193adeea0903840f65ddf8b35cb9e24ccb32f0ae7f" Jan 22 15:01:15 crc kubenswrapper[4743]: E0122 15:01:15.748969 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 15:01:28 crc kubenswrapper[4743]: I0122 15:01:28.748157 4743 scope.go:117] "RemoveContainer" containerID="56f8576199722ef7beb003193adeea0903840f65ddf8b35cb9e24ccb32f0ae7f" Jan 22 15:01:28 crc kubenswrapper[4743]: E0122 15:01:28.749116 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hqgk7_openshift-machine-config-operator(3aeba9ba-3a5a-4885-8540-d295aadb311b)\"" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" podUID="3aeba9ba-3a5a-4885-8540-d295aadb311b" Jan 22 15:01:40 crc kubenswrapper[4743]: I0122 15:01:40.746999 4743 scope.go:117] "RemoveContainer" containerID="56f8576199722ef7beb003193adeea0903840f65ddf8b35cb9e24ccb32f0ae7f" Jan 22 15:01:42 crc kubenswrapper[4743]: I0122 15:01:42.596153 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hqgk7" event={"ID":"3aeba9ba-3a5a-4885-8540-d295aadb311b","Type":"ContainerStarted","Data":"5aea1abf0e9485b0a436ae08365390ff71011c011cf55bec4a4fbc3dc667346d"}